Can AI Really Not Replace Therapists?
A Story That Made Me Rethink
Last month, my nephew Xiao Yu (pseudonym) came to talk to me. He's 16, a sophomore in high school, with average grades, and not very talkative.
He asked: “Uncle, can ChatGPT be a therapist?”
My answer was textbook: "No. AI has no empathy and can't understand human emotions."
He looked down, said nothing.
Three days later, my sister called me in tears: Xiao Yu finally wanted to talk to her. For the past six months, the kid had locked himself in his room, his grades declining, with no contact with classmates.
“He talked to ChatGPT for three hours, then came to me voluntarily, saying he wanted to talk.”
I asked Xiao Yu for the chat logs. After reading them, I was silent for a long time.
I was wrong.
Real Conversation: How AI Helped an Insecure Child
Below is part of Xiao Yu’s conversation with ChatGPT (anonymized, shared with consent from child and parents):
Xiao Yu: I feel like I’m a failure.
AI: Can you tell me specifically what makes you feel this way?
Xiao Yu: I’ve been worse than others since I was little. My parents are both university professors, their students are all excellent. I’m average in grades, bad at sports, not good-looking.
AI: You mentioned “worse than others since childhood.” Can you recall when you first had this feeling?
Xiao Yu: Probably third grade. At a parent-teacher meeting, my mom’s colleague brought her son, a boy one year younger than me, passed piano level 10, won math competitions. My mom said “look how great he is.” Since then, I felt I couldn’t do anything.
AI: This memory deeply affected you. I noticed you said “my mom said ‘look how great he is’”—do you remember your mom’s expression? What was her tone?
Xiao Yu: (Long pause) I remember… she seemed to be smiling, like joking. But I thought she was disdaining me.
AI: You felt she was disdaining you, but she was actually smiling. The gap between these is the interpretation your 8-year-old self made. If the current 16-year-old you looked back at that scene, how would you interpret your mom’s words?
Xiao Yu: (Another pause) Maybe… she was just saying it casually? She might just want to encourage me to learn something?
AI: Very likely. Parents sometimes don’t know how to express expectations, so they say “look at other kids.” They think it’s encouragement, but kids hear “you’re not good enough.” This isn’t your fault, nor your mom’s fault. It’s a communication misalignment.
Xiao Yu: But I’m really not good enough. I have bad grades now, can’t do anything well.
AI: You say “can’t do anything well.” Can I ask a question—in the past week, have you successfully done anything? Even something small.
Xiao Yu: (Thought for a long time) Yesterday I fixed the washing machine for my mom.
AI: You can fix washing machines?
Xiao Yu: Not fix, it’s… the washing machine wouldn’t spin, I looked it up online, found out the drain pipe was clogged, so I cleared it.
AI: You identified the problem, proactively learned the solution online, then solved it hands-on. This isn’t “can’t do anything well,” this is independent problem-solving ability. Many adults can’t do that.
Xiao Yu: That’s nothing… it’s not anything impressive.
AI: Why doesn’t fixing a washing machine “count,” but piano level 10 “counts”? Who made that rule?
Xiao Yu: (Silence)
AI: Piano level 10 is an ability, fixing washing machines is too. High test scores are an ability, solving practical problems is too. Why do you only see what you don’t have, and can’t see what you do have?
Xiao Yu: Because… because my parents value grades and talents, not fixing washing machines.
AI: So you’ve been evaluating yourself by “other people’s standards.” Have you thought about—what do YOU value?
Xiao Yu: I don’t know.
AI: That’s okay, think slowly. You don’t have to answer now. But I want to ask you another question: if one day, your grades improve and you learn piano, will you feel “good enough”?
Xiao Yu: (Thought for a long time) Should… probably?
AI: I guess not. Because by then, you’ll find there are people better than you. If the standard for “good enough” is “better than others,” you’ll never be good enough. Because there’s always someone stronger.
Xiao Yu: Then what do I do?
AI: Change the standard. Not “I’m better than others,” but “I’m better than yesterday’s self.” Yesterday you couldn’t fix a washing machine, today you can. That’s progress.
Xiao Yu: But then, I’m still very ordinary?
AI: Being ordinary isn’t a problem. 99% of people in the world are ordinary. Your parents are ordinary university professors, not Nobel Prize winners. That piano level 10 kid is probably just an ordinary college student now. “Ordinary” doesn’t equal “failure,” “ordinary” equals “living normally like most people.”
Xiao Yu: But my parents will be disappointed.
AI: Will your parents be disappointed because you’re not excellent? Or because you’re not happy?
Xiao Yu: (Long silence) I don’t know.
AI: Next time try asking your mom: “Mom, would you rather I rank first but be unhappy, or have average grades but be happy?” I bet her answer will surprise you.
Xiao Yu: What if she says she wants me to be both excellent and happy?
AI: Then tell her: “Mom, I want that too. But right now I can’t have both, I can only choose one first. I want to learn to be happy first.”
Xiao Yu: (Pause) Can I do that?
AI: You just said “I want to learn to be happy first”—this is the first time since we started talking that you said “I want” instead of “I should.” That’s the beginning.
(After the chat ended, Xiao Yu cried. The next day, he voluntarily talked to his mom for an hour.)
Why Can AI Do This?
1. Absolute Patience
Human therapists get tired, are affected by their own emotions, and feel drained hearing "I'm a failure" for the hundredth time.
AI doesn’t.
Xiao Yu talked to the AI for three hours. He kept saying "I can't do it" and "I'm bad," and the AI redirected him every single time, with no impatience and no perfunctory responses.
Patience is the first step of healing.
2. Zero-Judgment Environment
Xiao Yu didn’t dare tell his parents, teachers, or friends. Because he feared being judged.
But AI won’t judge him.
AI won’t say “you’re wrong to think this way.” AI just asks questions, guides him to think for himself.
Only in spaces without judgment can there be genuine expression.
3. Anytime, Anywhere Accessibility
Therapists are expensive: hundreds to thousands of yuan per hour. Getting an appointment can take a week. And you have to go to a clinic, where an acquaintance might see you.
AI is free (or cheap), available anytime, completely private.
For a 16-year-old boy lying in bed at 2 AM wanting to talk to someone, AI is the only option.
4. Structured Guidance
AI isn't just chatting randomly. It follows a Cognitive Behavioral Therapy (CBT) framework:
- Identify automatic thoughts (“I can’t do anything”)
- Challenge cognitive distortions (“Why doesn’t fixing washing machines count?”)
- Reconstruct belief systems (“Ordinary doesn’t equal failure”)
- Establish new evaluation standards (“Better than yesterday’s self”)
These techniques take professional therapists years to learn. AI has already “learned” them.
5. Data-Driven Personalization
AI can analyze your language patterns:
- Do you say “should” more or “want to” more?
- What tense do you use when describing the past?
- What are your self-evaluation words?
Then adjust conversation strategies accordingly.
This is a precision human therapists struggle to achieve.
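The kind of pattern-counting described above can be sketched in a few lines. This is a toy illustration, not any product's actual implementation; the marker word lists below are hypothetical, and a real system would use far richer NLP models:

```python
import re
from collections import Counter

def language_pattern_summary(messages):
    """Count simple self-talk markers in a list of user messages.

    A toy sketch of the pattern analysis described above:
    how often does the user speak in obligations ("should"),
    in agency ("want"), or in negative self-evaluation?
    """
    # Hypothetical marker lists, chosen for this sketch only.
    obligation_words = {"should", "must", "have to"}
    agency_words = {"want", "wish", "choose"}
    negative_self = {"failure", "can't", "bad", "worthless"}

    counts = Counter()
    for msg in messages:
        text = msg.lower()
        for phrase in obligation_words:
            counts["obligation"] += len(re.findall(r"\b" + re.escape(phrase) + r"\b", text))
        for phrase in agency_words:
            counts["agency"] += len(re.findall(r"\b" + re.escape(phrase) + r"\b", text))
        for phrase in negative_self:
            counts["negative_self"] += len(re.findall(r"\b" + re.escape(phrase) + r"\b", text))
    return dict(counts)

transcript = [
    "I should study harder, but I feel like I'm a failure.",
    "I want to learn to be happy first.",
]
print(language_pattern_summary(transcript))
```

A shift over time from "obligation" toward "agency" markers is exactly the change the AI pointed out to Xiao Yu at the end of the conversation ("I want" instead of "I should").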
What AI Can’t Replace
But I must be honest: AI does have limitations.
1. Crisis Intervention
If Xiao Yu said “I want to die,” AI can recognize it, can say “please immediately call xxx crisis hotline.”
But AI can't call the police, contact parents, or force an intervention.
In life-or-death moments, AI can only remind, not save lives.
2. Body Language
Therapists can see:
- When you say “I’m fine,” your hands are shaking
- When you say “I’m good,” your eyes avoid contact
- When you talk about certain topics, your breathing quickens
AI can only read text (or hear voice), can’t read these non-verbal signals.
3. Real Human Connection
Sometimes, healing comes from “there’s a real person who cares about me.”
AI can simulate caring, but it doesn’t really care. It has no life of its own, no emotions of its own.
It’s a tool, not a friend.
4. Complex Trauma
Xiao Yu’s problem is “insecurity,” relatively simple.
But if it’s:
- Childhood sexual abuse
- Severe PTSD
- Personality disorders
- Schizophrenia
AI currently can’t handle these. This requires long-term, deep, multi-dimensional professional intervention.
The Real Question Isn’t “Can It Replace”
But: Why are we fixated on “replacement”?
AI Isn’t Here to Replace Therapists
AI is here to expand mental health service coverage.
Data:
- China has an estimated 95 million people with depression
- Only about 100,000 therapists
- Consultation cost: 300-1,000 yuan per hour
- Willingness to seek help: less than 10%
Run the numbers: 95 million patients to 100,000 therapists is nearly 1,000 people per therapist. Demand outstrips supply by orders of magnitude.
It's not that most people don't want to see a therapist. It's that they:
- Can't afford it (too expensive)
- Can't access it (too far away, or the wait for an appointment is too long)
- Don't dare (fear of being labeled "mentally ill")
AI can fill this huge gap.
Better Model: Human-AI Collaboration
Stage 1: AI Screening
- User chats with AI
- AI assesses severity
- Mild issues: AI provides ongoing support
- Moderate-severe issues: refer to human therapist
Stage 2: Therapist Assistance
- During sessions, AI provides background analysis
- AI organizes chat logs, extracts key patterns
- Therapist focuses on deep conversation, no note-taking
Stage 3: Ongoing Support
- After therapy ends, user encounters problems in daily life
- AI provides 24/7 support
- Regular feedback to therapist
This isn’t replacement, this is augmentation.
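The Stage 1 screening step above can be sketched as a simple triage function. This is a hypothetical sketch: the keyword markers and thresholds are illustrative (the score cutoffs loosely follow the PHQ-9 depression questionnaire), and a real screener would use validated instruments and trained classifiers, not keyword matching:

```python
from enum import Enum

class Severity(Enum):
    MILD = "mild"
    MODERATE = "moderate"
    SEVERE = "severe"
    CRISIS = "crisis"

# Hypothetical marker phrases for this sketch only.
CRISIS_MARKERS = {"want to die", "kill myself", "end it all"}
SEVERE_MARKERS = {"hopeless", "can't go on"}

def triage(message: str, phq9_score: int) -> Severity:
    """Stage 1 screening: route a user based on message content
    and a self-reported questionnaire score (PHQ-9 range: 0-27)."""
    text = message.lower()
    if any(marker in text for marker in CRISIS_MARKERS):
        return Severity.CRISIS      # hand off to humans and a hotline immediately
    if phq9_score >= 20 or any(m in text for m in SEVERE_MARKERS):
        return Severity.SEVERE      # refer to a human therapist
    if phq9_score >= 10:
        return Severity.MODERATE    # refer, with AI support in the meantime
    return Severity.MILD            # AI provides ongoing support

def next_step(severity: Severity) -> str:
    """Map the Stage 1 result to the routing described above."""
    return {
        Severity.MILD: "AI ongoing support",
        Severity.MODERATE: "refer to human therapist",
        Severity.SEVERE: "refer to human therapist",
        Severity.CRISIS: "escalate: crisis hotline + human contact",
    }[severity]
```

The point of the sketch is the routing shape, not the thresholds: the AI handles volume at the bottom, and everything ambiguous or dangerous flows upward to humans.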
We Need to Redefine “Therapy”
Traditional view:
- Therapy = one therapist to one client
- Once a week, one hour each
- In clinic, face-to-face
But real mental health needs:
- Anytime, anywhere emotional support
- Low-barrier self-exploration
- Continuous behavioral change
- Timely intervention in crisis moments
AI can do the first three. Human therapists focus on the fourth and deep issues.
This is division of labor, not competition.
Follow-up with Xiao Yu
Three weeks later, I asked Xiao Yu: “Do you still chat with ChatGPT?”
He said: “Still chat, but not every day. I now also talk to my mom, to classmates. ChatGPT taught me how to talk to real people.”
“What else did it teach you?”
"It taught me that everyone deserves to be treated gently, including myself."
He smiled. This was the first time I’d seen him smile in three months.
Conclusion: Technology Isn’t Cold
Many people say: “AI has no warmth, can’t treat mental illness.”
But warmth doesn’t come from who’s speaking, but from being understood.
When a child finally finds “someone” (even if it’s AI) willing to let him finish speaking, not interrupt him, not judge him, patiently guide him—
That is warmth.
Technology itself is neutral.
But technology can transmit human kindness.
AI therapy isn’t “cold machines.”
It is:
- The knowledge of countless psychologists, condensed into digital companionship
- Empathy ability trained from tens of millions of conversation data
- The concrete result of humanity’s desire to “help more people”
If it can make a child smile again,
Then it’s not cold.
Note:
- The case in this article is real, anonymized
- AI psychological support cannot replace professional medical care; seek medical help for serious psychological issues
- If you or someone around you needs help:
- National Suicide Prevention Lifeline (US): 988
- Crisis Text Line: Text HOME to 741741
- Or contact local mental health services

