It’s 3 a.m. You’re in the back of a taxi in a strange city and feeling anxious. There’s no need to worry, though, because your therapist will see you immediately.
There won’t be a couch to lie on or a clock to count the passing time. In fact, they probably won’t even bill you. That’s because the therapist in question is the same free, friendly chatbot that also serves as your homework helper, recipe creator and exercise plan designer whenever you need it.
Therapy has entered the chat, and it’s increasingly common for people, teenagers in particular, to seek mental health advice directly from AI, whether it’s ChatGPT or a custom app. A 2024 mental health study found that 28% of those surveyed were using AI as a personal therapist.
Should we be trusting AI with our innermost thoughts? Let’s examine whether tech-assisted therapy represents a brighter future for mental health care or something darker.
Why are people using AI as a therapist?
Medical professionals are already used to patients arriving for appointments with printouts and screenshots from “Doctor Google”. Self-diagnosis and personalized care are nothing new. The difference with AI is that it could remove mental health experts from the process entirely.
Yet there are several understandable reasons why people are turning to AI therapy.
More affordable
The average cost of therapy in the U.S. is $139 per hour and rising. That's out of reach for many, whereas a ChatGPT subscription comes in at $20 per month, with free versions available. Cost remains one of the main reasons the majority of people who need mental health support never receive it.
24/7 availability
With only 1 mental health clinician for every 340 people in the U.S., getting professional help promptly, and for as long as you need it, is by no means straightforward. By contrast, the therapist on your phone always has an opening in their schedule.
Convenience
Interacting with a faceless chatbot might have given us the ick before the pandemic, but most of us are now quite comfortable with remote work, online classes and automated customer support. There's less stigma today around using tech for therapy, even if we don't necessarily trust the advice we're given.
No judgment
The golden rule of the professional therapist might be “listen and don’t judge”, but we’re still sharing private, personal information with another human. When we’re venting, ranting, or brain-dumping with an AI bot, on the other hand, there are no raised eyebrows or awkward silences to worry about.
Why you shouldn’t overshare with a chatbot
As smart as AI may seem, and as human as we can make it with avatars and cute “personalities”, it still presents some significant flaws when asked to perform the role of therapist.
Lack of regulation
Professional therapists are thoroughly trained and rigorously regulated, and they must follow strict safety protocols by law. If a patient expresses a desire to harm themselves or others, for example, a qualified therapist must follow mandatory reporting procedures. In short, human therapists are accountable for the process and the outcome in a way that AI chatbots are not.
Bad advice
A study by Stanford University found that AI gave unsafe or inappropriate advice 20% of the time, compared to 7% for human therapists. Without peer review or supervision, the AI has no way to assess its own performance and faces no consequences for getting things wrong.
Delusion spiral
Your AI chatbot is programmed to respond with friendly, non-confrontational, sometimes over-enthusiastic validation. Repeated on a loop, that dynamic can feed delusional thinking and even psychosis. A professional therapist, on the other hand, will push back, challenge you and maintain an objective distance.
Missed signals
The current generation of text- and voice-based AI models cannot spot the nonverbal cues and crisis signals that may indicate more serious issues. They are built to encourage rather than challenge thought patterns, however negative. It’s one of the reasons why, as one 2022 study found, online interactive therapy can actually increase the risk of self-harm.
Rob Pintwala, the founder and CEO of First Session, a Canadian therapy platform that’s held more than 30,000 human therapy sessions, explains why the human connection a qualified therapist provides is irreplaceable.
“There’s a concept in therapy called ‘holding space,’ which is uniquely human,” he explains. “A grounded therapist’s presence can support growth and healing in ways AI simply cannot.”
Sometimes, a therapist will just sit with you in silence, providing space for trust to grow and emotions to process. Not so your AI therapist, which craves a response.
“Beyond words, therapists notice body language and even what’s left unsaid, which AI doesn’t take into account,” adds Pintwala, highlighting AI’s non-verbal shortcomings.
Lack of privacy
With certain exceptions, a professional therapist is bound by patient confidentiality. Your chatbot isn't. The CEO of OpenAI himself has warned that ChatGPT has no legal obligation to keep user chat logs private. Those logs can be subpoenaed by law enforcement, and they could be stolen and shared with third parties if the service is hacked.
Emotional dependence
An in-person therapy session is time-limited and topic-focused. A chat therapy session, on the other hand, can become unstructured and open-ended. In some cases, that can push vulnerable people into a harmful cycle of dependency. When the chatbot is always available, it can become the only answer to every problem.
How the law is changing on AI therapy
Following some high-profile cases of AI chatbots sharing dangerous medical advice, encouraging eating disorders, or, in one case, helping a vulnerable teenager to draft a suicide note, regulation of AI in therapy is tightening.
- The use of AI for mental health or “therapeutic decision-making” without licensed clinician oversight is now banned in several states.
- The era of unregulated, readily available mental health apps is coming to an end as the Federal Trade Commission and American Psychological Association push for bans on any app posing as a regulated therapist.
- Lawmakers are also pushing for better in-app safeguards and guidelines, for example to connect vulnerable users with professional care services.
So how should you use it?
Instead of asking AI to think and speak like a human, we should be using it to give ourselves more room to think. That means treating AI as the therapist's assistant, not the therapist.
Explore ideas, not answers
Ask AI for answers and it will oblige, even when it doesn’t have the professional skills to do so. You’d be better off appealing to its genuine strengths by using it as a researcher, not a confidante. Ask it to present options, share knowledge and guide you towards professional resources. In short, use it as a regular wellness app, not a pocket counselor.
Gain knowledge
Models such as ChatGPT, Gemini and Claude are becoming much more reliable as research tools, hallucinating less and linking to verifiable sources. That makes them great for breaking down complex medical topics into simple concepts. Use AI to explain ideas, explore strategies, and find plain-language overviews of common topics such as ADHD, grief and depression.
Shape habits
Just like your fitness or sleep app, AI can be very useful in developing positive habits, setting goals, and tracking progress. Use it to nurture your overall mental health with tips on meditation, journaling, Stoicism, conflict resolution and more. You don’t always have to have the answers. Sometimes all you need is a plan.
To trust or not to trust?
The bots are getting better, and the line between AI and human is becoming harder to draw. For the generation that grew up with smart devices, the distinction may already be meaningless. A staggering 70% of teens have experimented with AI companions or digital friends.
When it comes to therapy, however, we shouldn’t mistake the chirpy, instant conversation of an AI chatbot for the measured, informed response of a professional therapist. Until better regulation and review are in place, it’s safer to use AI as a preparation for therapy, not a substitute.