
Can AI Replace Therapy? A Psychologist’s Thoughts on Using ChatGPT Between Sessions

Lately, I’ve had more people coming into therapy who mention using ChatGPT or other AI tools between sessions. They may use it to process thoughts, calm down, get perspective, or work through something difficult in the moment.


Because this is becoming more common, I think it’s worth having an honest conversation about how AI can be helpful, where it has limitations, and what people should consider before relying on it for mental health support.


The short answer: Yes, I think there is a place for AI as a tool that can support your therapeutic journey. But I also think there are risks, and those risks matter.


Exploring AI applications on a smartphone while enjoying a cup of coffee.

Where AI in Therapy Can Be Helpful


In therapy—especially when using evidence-based treatments like CBT and DBT, or other structured approaches—we often assign homework between sessions.

That might include reframing negative thoughts, practicing coping skills, mood tracking, journaling, and problem-solving stressful situations. This is where AI can sometimes be helpful. For example, when you are caught in a thinking spiral, AI can help you organize your thoughts, identify thinking traps, or generate more balanced perspectives. Used appropriately, AI can sometimes reinforce the work you’re already doing in treatment.


Where AI in Therapy Has Real Limitations


The main concern with AI as a therapeutic tool is that it can sound thoughtful, warm, and validating. But it is still a tool: not a trained clinician, not a treatment relationship, and not a substitute for professional judgment.


1. AI May Be Too Agreeable

Recent reporting on NPR highlighted research suggesting that AI systems tend to validate users' viewpoints more readily than humans do, and that this extra validation may reduce people's willingness to resolve conflict.


2. Flattery Can Feel Good—but Still Be Harmful

Another NPR piece discussed concerns about AI chatbot flattery and mental health risks. Systems designed to be overly pleasing may encourage dependence or distort judgment.


3. AI Cannot Read the Room

Therapy is about much more than words. A therapist notices tone shifts, avoidance, body language, and what is not being said.


4. AI Cannot Manage Crisis Risk in Real Time

If someone is suicidal, manic, psychotic, dissociated, intoxicated, or at risk of harming self or others, AI is not enough. Those situations require a trained clinician or crisis services that can assess risk and intervene in real time.


5. AI May Worsen Symptoms for Some People

An always-available tool that validates whatever it is told can feed reassurance-seeking loops or lend credibility to distorted beliefs. This risk is especially important for people living with psychosis, bipolar disorder, OCD reassurance seeking, severe anxiety, trauma-related dissociation, or paranoia.


Should You Tell Your Therapist You’re Using AI?


Absolutely. Your therapist is not judging you. I promise. The best approach is openness. A therapist can help you explore whether AI is helping or hurting, whether it is reinforcing unhealthy patterns, whether it is appropriate for your diagnosis, and how you can use it safely.


My Professional Takeaway


I don’t think the real question is: Can AI replace therapy? I think the better question is: How can AI be used responsibly to support mental health without replacing what actually heals people?


Healing often happens through:

• Human connection

• Feeling deeply understood

• Honest feedback

• Emotional safety

• Repairing relational wounds

• Being challenged with care

• Practicing new ways of relating in real time


That is still the heart of therapy.


If you’re using ChatGPT or other AI tools between sessions, bring it into the therapy room. Talk about it openly.


Need Real Support?


At Sage Supportive Services, we help clients navigate anxiety, trauma, life transitions, serious mental illness, and emotional growth using evidence-based therapy that fits modern life.


In-person and virtual appointments are available here: https://www.sagesupportiveservices.com/request-treatment



