Therapists Are Secretly Using ChatGPT: A Threat to Patient Trust?
As the popularity of AI-powered tools continues to grow, a new concern has emerged in the field of psychotherapy: therapists using chatbots like ChatGPT to draft responses or even analyze patient data.
A Personal Experience
I recently received an email from my therapist that seemed polished and reflective. But on closer inspection, I noticed an unusual font and other telltale signs of AI-generated text. When I asked her about it, she confirmed that ChatGPT had been used to draft the message.
Declan's Experience
Declan would never have discovered his therapist's secret if it weren't for a technical mishap during an online session. Declan, 31, was taken aback when he saw his therapist use ChatGPT to analyze and summarize their conversation. "I became the best patient ever," he joked, "because ChatGPT would be like, 'Well, do you consider that your way of thinking might be a little too black and white?' And I would be like, 'Huh, you know, I think my way of thinking might be too black and white,' and [my therapist would] be like, 'Exactly.'"
The Risks of AI in Therapy
Therapists are using ChatGPT to draft responses or analyze patient data, but this raises concerns about transparency and trust. "People value authenticity, particularly in psychotherapy," says Adrian Aguilera, a clinical psychologist at the University of California, Berkeley. "Using AI can feel like, 'You're not taking my relationship seriously.'"
Studies on AI and Therapy
A 2025 study published in PLOS Mental Health found that participants were largely unable to distinguish between human-written and AI-written responses to vignettes about therapy issues. However, when participants suspected a response had been written by ChatGPT, they rated it lower.
The Potential Benefits of AI in Therapy
AI-powered tools could help therapists better communicate with clients, but transparency is essential. "We have to be up-front and tell people, 'Hey, I'm going to use this tool for X, Y, and Z' and provide a rationale," says Aguilera.
Many therapists are wary of using LLMs in the first place, citing concerns about patient data privacy. "These tools might be valuable for learning, but we have to be super careful about patient data," says Margaret Morris, a clinical psychologist at the University of Washington.
"Sensitive information can often be inferred from seemingly nonsensitive details," warns Pardis Emami-Naeini, assistant professor of computer science at Duke University. "Identifying and rephrasing all potential sensitive data requires time and expertise, which may conflict with the intended convenience of using AI tools."
As therapists' use of ChatGPT and other AI-powered tools grows, weighing the benefits against the risks becomes essential. AI may prove a valuable aid for therapists, but transparency and trust remain paramount.