EPIC’s Response to Reports of Crisis Text Line Data Policies

January 31, 2022

The recent reporting on Crisis Text Line (CTL) has raised significant concerns about data ethics and the threats to privacy posed by the transfer and use of sensitive data drawn from the crisis response service’s transcripts. EPIC previously wrote in support of CTL’s work to develop and evaluate privacy-enhancing techniques that can facilitate academic research by providing aggregate or scrubbed data. However, the use of private message data from sensitive crisis communications for commercial research crosses an ethical line and violates the intimate and private nature of those conversations, regardless of how the data is processed and scrubbed.

On Friday, Politico reported that CTL, a nonprofit organization that provides mental health services via text message, has been providing data to a commercial software company, Loris.ai, as part of a partnership that includes financial support and part ownership of the company. According to CTL, the data provided to Loris.ai has been “fully scrubbed and anonymized,” and the access occurs in a “secure environment” that CTL controls.

After publication of the report on the commercial use of sensitive messages that CTL obtains from people “in their darkest moments,” the organization posted a statement. That statement specifically cited a 2018 letter that EPIC filed in support of CTL’s application to the Federal Communications Commission to be classified as a “public safety service” eligible to obtain phone location data during emergencies.

EPIC’s letter referred to CTL as a “model steward of personal data” in that context. Our statements in that letter were based on a discussion with CTL about their data anonymization and scrubbing policies for academic research sharing, not a technical review of their data practices. Our review was not related to, and we did not discuss with CTL, the commercial data transfer arrangement between CTL and Loris.ai. If we had, we could have raised ethical concerns about the commercial use of intimate message data directly with the organization and their advisors. But we were not consulted, and the reference to our letter now, out of context, is wrong.

The fundamental problem with what CTL and Loris.ai are doing is not that they haven’t used the right words or the right data techniques. The problem is that their arrangement seeks to extract commercial value from the most sensitive, intimate, and vulnerable moments in the lives of the individuals seeking mental health assistance and of the hard-working volunteer responders. And the financial relationship between CTL and Loris.ai further undermines the trust that is essential both to their core service and to the ethical academic research framework that they have worked to establish.

As CTL themselves said on their own FAQ page back in 2016, commercial use of this data is “gross” and should be a nonstarter: “(Read: no commercial use. Never ever ever.)” That was a good standard then, and it is a good standard now. No data scrubbing technique or terms-of-service provision can resolve that ethical violation.
