What “ChatGPT as Therapist” Reveals About Our Expectations of Therapy
- info822671
- May 9
The rising popularity of ChatGPT as therapist highlights how people view therapy—and what they hope to gain from it. For many, ChatGPT has become a kind of "pocket therapist"—always available, instantly responsive, and free.
Why People Turn to ChatGPT for Therapy
Users often engage in ChatGPT therapy to organise thoughts, gain clarity, or—most often—receive advice. This behaviour reflects a widespread perception of therapy as a place where a wise expert listens, weighs all factors, and delivers the “right” answer.
Naturally, many potential clients are disappointed—or even shocked—to learn that this is not the true purpose of therapy. In fact, proper counselling generally avoids direct advice-giving.
The Deeper Need Behind Seeking Advice
The impulse to be told what to do can come from many sources:
- A wish to avoid accountability
- Feeling too overwhelmed or traumatised to act
- Low confidence in one’s own judgement
- A culture of instant solutions—ask a question, get a quick answer
- The belief that so-called experts always know better
This belief assumes that a therapist—after just a few sessions—can somehow absorb and evaluate:
- Your life experiences and personal history
- Environmental, developmental, and educational influences
- Upbringing, trauma, relationships, and media exposure
…and then distil all of that into a handful of wise suggestions that will resolve everything.
The Limits of “ChatGPT as Therapist”
Even if the advice were helpful, what happens when your next problem arises? Do you return for another answer? And the next time? The therapist—or the AI—becomes little more than a life guru, an oracle.
But has anything within you actually changed? Have you developed greater insight? Do you trust yourself more?
These are the questions therapy is meant to help you answer.
Can ChatGPT Therapy Evolve?
To be clear, I’m not dismissing the use of ChatGPT for personal support. Using it to sort through ideas or explore different perspectives can be helpful. But if we’re talking about actual therapy, we’re not there yet.
Perhaps a more helpful future lies in training AI not just to advise, but to ask constructive, therapeutic questions—questions that foster reflection, self-trust, and genuine growth.
That could be the first real step toward making ChatGPT therapy something more than just a digital advice machine.
