My Therapist Used ChatGPT in Sessions — Why That Matters
What actually happened
Patients have started discovering that some therapists consult ChatGPT during live sessions. In one striking example, a therapist accidentally shared his screen during a virtual appointment, and the client watched the clinician type prompts and then repeat responses suggested by the model. Incidents like that turn private practice into a real-time demonstration of how AI can sit between a patient and their caregiver.
Why disclosure matters
In the cases reported, therapists did not disclose their use of AI ahead of time. That lack of transparency is the central problem. Therapy depends on trust built over time, and learning that a clinician used a third-party tool during a session can feel like a betrayal. Even if the intention was to save time or find better phrasing, failing to obtain informed consent raises ethical and legal questions and can permanently damage the therapeutic relationship.
Is AI actually useful in therapy?
Researchers have tested purpose-built AI tools for specific therapeutic tasks, and some early trials have shown promising results for standardized, manualized interventions such as certain forms of CBT. But general-purpose models like ChatGPT were not created or clinically validated for mental health care. Many therapists who spoke with reporters are skeptical about relying on such models for case formulation or treatment planning, pointing out that human supervision, clinical judgment, and peer consultation remain the recommended routes for deciding how to treat a client.
Convenience versus clinical safety
One practical driver is the tedium of administrative work. Several clinicians admitted they are tempted to use AI to draft notes or summarize sessions. That use case is distinct from asking a model how to respond in the moment or using it to generate interventions. Entering detailed client information into an unvetted third-party system can create privacy risks and may violate professional standards if sensitive data is exposed.
Regulation and professional guidance
Professional bodies have already weighed in. The American Counseling Association, for example, advises against using AI tools to diagnose patients. Some states are moving toward legal restrictions: Nevada and Illinois recently passed laws limiting or prohibiting the use of AI in therapeutic decision-making. These moves suggest more oversight is likely as policymakers try to catch up with rapid adoption.
What tech companies say and why it matters
Tech leaders have noticed people turning to chat models for emotional support. Sam Altman has said that many people effectively use ChatGPT as a sort of therapist and framed that as a positive development. But equating reassuring chatbot replies with real therapy risks overpromising. Good therapy often challenges and unsettles clients as part of a deeper process. A conversational model that validates or soothes is not equivalent to a clinician who probes, interprets, and maintains accountability over time.
Practical steps for patients and clinicians
Patients who discover their therapist used AI without consent should raise the issue directly and ask how the tool was used and why. Clinicians should proactively disclose any planned use of AI, obtain informed consent, restrict the sensitive data sent to external services, and prefer clinically validated tools for treatment tasks. Policymakers and professional associations also need to define boundaries so that AI is used safely and transparently in mental health contexts.
Wider implications
This story shows how messy real-world AI adoption can be when tools are used outside the contexts they were built for. AI has the potential to augment aspects of mental health care, but secrecy, inadequate validation, and data-privacy concerns make some current practices dangerous. Clear disclosure practices and regulatory guardrails will be essential if AI is to play a constructive role in therapy.
Read the original reporting by Laurie Clarke; this piece first appeared in The Algorithm newsletter.