Gizmodo: AI This Week: Calli Schroeder on Why You Shouldn’t Use a Chatbot for a Therapist 

September 30, 2023

In your tweets it seemed like you were saying that talking to a chatbot should not really qualify as therapy. I happen to agree with that sentiment, but maybe you could clarify why you feel that way. Why is an AI chatbot probably not the best route for someone seeking mental health support?

“I see this as a real risk for a couple of reasons. If you’re trying to use a generative AI system as a therapist, and sharing all this really personal and painful information with the chatbot…all of that information is going into the system, and it will eventually be used as training data. So your most personal and private thoughts are being added to this company’s training data set. And they may exist in that data set forever. You may have no way of ever asking the company to delete them, or you may not be able to get them removed. You may not know whether they’re traceable back to you. There are a lot of reasons this whole situation is a huge risk.

Besides that, there’s the fact that these platforms aren’t actually therapists—they’re not even human. So not only do they have no duty of care to you, they also just literally don’t care. They’re not capable of caring. And they’re not liable if they give you bad advice that ends up making things worse for your mental state.”

Read more here.
