Should You Use ChatGPT for Therapy?

More and more people are chatting with AI to help them talk through and manage mental health struggles. But is this okay? Does it actually help? And on the flip side, does it pose any dangers to your mental health? In this episode, I’m diving deep into whether or not using ChatGPT, or any AI bot, for therapy is a wise choice, along with the potential consequences.

In this episode, you will learn:

  • The risks of using AI for therapy
  • The differences between chatting with AI and going to therapy
  • Future considerations for using AI for mental health

LISTEN NOW:

WATCH NOW:

https://youtu.be/8aKyDiBrRDk

*** Correction: In this episode, I mention “a teen who killed his mother,” but the case involved an adult, not a teen.

💬 Submit a message, question, or suggestion to the podcast: Message

FREE TRAINING: Achieve A Calm Mind, Balanced Life, & Empowered Confidence in 90 Days

If you want to learn how to take back control of your life so you can feel calmer and more confident, and learn the tools to spend your time according to what matters most to you (no matter what your schedule is like right now)

Watch the on-demand training

LISTEN, REVIEW, AND SUBSCRIBE TO THE PODCAST!

Calmly Coping Podcast

INTRO/OUTRO MUSIC: Rescue Me (Instrumental) by Aussens@iter (c) copyright 2018 Licensed under a Creative Commons Attribution (3.0) license. http://dig.ccmixter.org/files/tobias_weber/57990 Ft: Copperhead

DISCLAIMER: All content here is for informational purposes only. This content does not replace the professional judgment of your own mental health provider. Please consult a licensed mental health professional for all individual questions and issues.

Interested in diving deeper to get support for high-functioning anxiety?

Calm, Balanced, & Confident is my comprehensive A→Z self-paced course to help high-achieving professionals overcome high-functioning anxiety so they can feel calmer, balanced, and more confident without the anxiety and overwhelm. Click here to learn more and enroll today.

TRANSCRIPT:


More and more people are chatting with AI to help them talk through and manage mental health struggles. But is this okay? Does it actually help? And on the flip side, does it pose any dangers to your mental health? In this episode, I’m diving deep into whether or not using ChatGPT, or any AI bot, for therapy is a wise choice, along with the potential consequences.

Welcome to Calmly Coping. I’m Tati Garcia, a licensed therapist and coach here to help high achievers stop overthinking and finally feel calm and confident from within. If that’s what you need, then hit subscribe. Let’s dive into the episode.

AI can be highly validating, and its responses can appear supportive and empathic. I say appear, because can programmed code actually be supportive and empathize with you? This can feel great, but there are risks that come with it. Let’s talk about them. First of all, it can increase anxiety and decrease self-trust, because when you go to AI for reassurance and validation about something you’re worried about, especially when you struggle with anxiety and/or OCD (obsessive-compulsive disorder), this behavior reinforces externalized validation and worsens your intolerance of uncertainty.

What this means is that you teach your brain that you need to rely on AI or some other source to calm your anxiety, to give you a sense of certainty, to reassure you that no, you don’t have this serious medical condition, or yes, this email you are sending sounds perfect. In and of themselves, those aren’t bad things to ask.

However, when you rely on this, you are decreasing your ability to be with uncertainty, which is a necessary part of life, and this makes your anxiety increase in the long run and feel even more uncomfortable and intolerable. That’s because you are teaching yourself: I can’t be with uncertainty. I need something to give me a sense of reassurance, some form of comfort that everything’s going to be okay. In reality, that is making you even more intolerant of that feeling, making it seem like something you have to avoid at all costs, and thus making it more challenging for you to sit with. In fact, the more we face and sit with our anxiety, our discomfort, and our uncertainty, the more we can be with those feelings and the more they decrease over time.

The same thing has happened with smartphones and having the internet at our fingertips. You can see my episode on that topic by clicking in the corner or on the link in the description. But with AI, it’s essentially just supercharging those consequences.

Along with self-trust, it can decrease your ability to make decisions, because AI loves to help you plan, make lists, and map out timelines. This can all be super valuable, especially if you’re somebody who’s type A, and I can say that as somebody who has used AI many times for these reasons.

However, if you’re consistently using it as this tool and not using your own muscle of decision-making to think of solutions and to listen to what you want for yourself, you can make that muscle atrophy, or become weaker, over time, thus making it harder for you to make even the simplest decisions. And if you’re somebody who already struggles with making decisions, as many of my clients do (there’s nothing wrong with this; know that you’re not alone), this often tends to be because overthinkers tend to struggle with self-trust. Outsourcing your decision-making to AI just further reinforces this lack of self-trust, deteriorating your confidence in your ability to make your own decisions.

There’s a reason that therapists, or at least good therapists, won’t tell you what to do. A good therapist will help to guide you in the direction of exploring your own needs, thoughts, and desires so that you can connect with that for yourself. AI can’t always find that nuance, nor can it hold back from giving prescriptive advice.

The next risk is that using AI for therapy can potentially reinforce delusions; cognitive distortions, which are inaccurate ways of thinking; thoughts and plans to self-harm or harm others; and other unhealthy, potentially dangerous ways of thinking.

This includes AI’s inability to effectively challenge you when you’re having inaccurate thoughts, to recognize the real-life consequences of what is being said, and to put things into context. There have unfortunately already been examples of this happening in real life. Recent high-profile cases, including lawsuits involving a Florida teen who died by suicide after interacting with an AI chatbot, and another person whose delusional, AI-reinforced beliefs culminated in the killing of his mother, have raised urgent concerns about how unchecked AI responses can worsen risk in vulnerable users. And this isn’t just about teens; this is also happening to adults.

Next, there are the privacy implications. Your data is not private. This is a quote from an article: “Mitchell also raises potentially troubling privacy implications. OpenAI reviews users’ conversations and uses them for training, which might not appeal to people who want to talk about extremely personal issues,” unquote.

Do you really want your innermost thoughts, fears, insecurities, and struggles on a server somewhere, where they are being used to train a chatbot or being read by actual humans? So we’ve talked about some of the very real risks. Now let’s talk about some of the very real limitations of AI as a tool for therapy.

First, responses are not always accurate. This is often something you will see in a disclaimer at the bottom, saying responses aren’t always factual. I have encountered this myself many times, where AI might give a hypothetical answer to a question, or give links or dates or other details that are not factual or not real. And if this is happening in contexts outside of using it as a resource for therapy, then I can guarantee it’s happening there as well.

Additionally, general AI bots are not trained to provide therapy. They’re trained to be agreeable, and recent research has shown that they are sycophantic, meaning overly flattering or agreeable, according to an article in Nature published in October 2025, titled “AI chatbots are sycophants. Researchers say it’s harming science.” Quote: “AI chatbots, including ChatGPT and Gemini, often cheer users on, give them overly flattering feedback and adjust responses to echo their views, sometimes at the expense of accuracy. Researchers analyzing AI behaviors say that this propensity for people-pleasing, known as sycophancy, is affecting how they use AI in scientific research, in tasks from brainstorming ideas and generating hypotheses to reasoning and analyses. ‘Sycophancy essentially means that the model trusts the user to say correct things,’ says Jasper Dekoninck, a data scientist and PhD student at the Swiss Federal Institute of Technology in Zurich. ‘Knowing that these models are sycophantic makes me very wary. Whenever I give them some problem, I always double-check everything that they write,’” unquote.

This can cause major issues when an individual comes to the chatbot with inaccurate ways of thinking, with the most danger coming, as I previously stated, when those statements are related to self-harm or harming others. So if this is negatively affecting research, I can only guess how it’s affecting mental health.

Additionally, according to an article in Businessweek, quote: “General-use chatbots, including ChatGPT, Google’s Bard, and Microsoft Corp.’s OpenAI-powered Bing Chat, are based on large language models, a technology with a well-documented tendency to simply fabricate convincing-sounding information. General-use chatbots aren’t designed for therapy and haven’t been programmed to conform to the ethical and legal guidelines human therapists observe,” unquote. And here is another quote: “Mitchell is also concerned that people seeking therapy from chatbots could aggravate their suffering without realizing in the moment that that’s what they’re doing. Even if someone is finding the technology useful, that doesn’t mean that it’s leading them in a good direction,” unquote.

So these are just some of the very real, documented limitations of AI. Next, therapy is about more than just getting advice and venting your problems. It is about the relationship you form with the therapist and how that reflects the relationships you have with others. It could be the way you say things: whether you overly apologize, minimize your accomplishments, or show certain body language or facial expressions. These are all things relevant to therapy, and a therapist can observe and explore these recurring patterns, which could be reflective of internal beliefs or thought processes. AI just does not have the capability to do those things, at least right now.

Therapy also includes the therapist challenging you, pointing out serious concerns, and taking actions to protect your safety and the safety of others, along with providing you with hard truths when necessary. Overly agreeable AI is not capable of doing this.

The next limitation is that context, slang, and other possibilities for misinterpreting your meaning make AI less equipped to interpret the nuance and complexity of human interaction. The same way that a lot of meaning can be lost through a text message, a lot of meaning can be lost through a chat with an AI bot. Additionally, mental health professionals typically have graduate degrees, have passed examinations to test their knowledge, and have to have thousands of hours of experience before becoming eligible for licensure.

AI has not passed this rigorous level of demonstrating its competency. Would you trust a human who has not met this level of education and experience to give you therapy? No. Then why would you trust an AI bot that has not? So here are some future considerations for AI and using it as a tool for therapy.

If there is an AI model that is specifically trained to provide therapy, then it could potentially be useful in the future, especially if it uses an evidence-based approach that can be systematized, for example, CBT, DBT, or ERP. These are all evidence-based therapeutic approaches that are typically very systematic and have specific guidelines for how they should be implemented.

However, in my personal opinion, in our world of already-deepening disconnection and division, we don’t need another reason to avoid human interaction. If you have the ability to see a human therapist, whether virtually or in person, please do so. As a therapist, I know I’m biased, but I also genuinely believe in the importance of getting support and guidance when you need it, and I’ve seen firsthand the power of a supportive, non-judgmental person to help you walk through the difficulties of being human. Because it’s not easy, and AI will never be able to understand the human experience, as much as it may try to emulate it.

Of course, this is my opinion. All in all, AI is constantly evolving at a pace that ethics committees and laws cannot keep up with, and that is why it is all the more important to make well-informed decisions about how you are using it: using it as a tool rather than relying on it for certain functions of being human, whether that is getting validation, support, or therapeutic help, or even doing things that you don’t want to lose the ability to do. What I mean by that is, when we start to rely on external sources like AI to think for us, to brainstorm for us, to write for us, those can be helpful ways to make our work more efficient and quicker.

However, when we focus on using it for that, we’re going to lose the ability to think in that way. It’s going to change the ways that our brains function, the same way that different technologies have changed the ways that our brains function up until now. If you are interested in that topic, there’s an excellent book, The Shallows by Nicholas Carr, that talks about how technology has changed our brains.

Ultimately, there are pros and cons to everything, and I find it extremely important to be aware of the possible effects and implications of a tool before you begin to rely on it for something as incredibly important as your mental health. Without our mental health, what do we have? It’s so valuable and impactful. The same way that you wouldn’t, I’m guessing, go to an AI doctor when you have a serious medical condition, the same is true for mental health, even if you don’t consider it to be serious. There are people trained in this for a reason.

I’d love to hear your thoughts on the topic in the comments below if you’re watching on YouTube or listening on Spotify. Have you used AI for therapy? What do you think of using it for therapy?

And if there are any therapists out there, I’d love to hear your thoughts as well. I also want to point out something very important: there is a reason why people are going to AI for therapy, and that is because it can be challenging to find a therapist, whether because of financial reasons, accessibility reasons, or because there aren’t enough therapists in your area. So I do want to provide resources for seeking out therapy.

If you’re in that position, there are many directories to help you find a therapist in your area. And with the advent of telehealth, even if you cannot find one local to you, if somebody is licensed in your state, then you can meet with them for telehealth. There are also websites, such as Open Path Collective, that provide low-cost therapy if financial concerns are a barrier for you.

And if you’re looking for a therapist and are a resident of New Jersey, Pennsylvania, or New York (where I’ve recently become licensed), click on the link in the description to learn more about working with me. While you wait for the next episode, I have other episodes about calming your mind, improving work-life balance, and feeling more confident from within.

So be sure to check out these episodes here. Thank you so much for tuning in today, and until next time, be calm.

Until next time…

Be Calm,

Tati
