Forbes: An AI App Claiming To Detect STIs From Photos Of Genitals Is A Privacy ‘Disaster’ 

March 28, 2024

But since its launch, Calmara has been met with a deluge of criticism over consent, data privacy concerns and the possibility that child sexual abuse material could wind up on Calmara's servers, calling its creators' credibility into question and prompting them to hastily backpedal aspects of the product.

“I don’t think a solution is taking photos of someone else’s genitals and sharing them with an app that’s not even a medical provider and isn’t subject to the same standards a doctor’s office would be,” said Sara Geoghegan, who serves as counsel at the Electronic Privacy Information Center and focuses on issues related to consumer privacy.

…“The nature of consent here is impossible,” Geoghegan said. In Calmara’s case, “the person who the app is for is not the one whose genital images are being shared, so the actual subject of this invasive practice is not the one whose consent is sought.”

Read more here.
