Can I just use AI as my therapist?
Psychotherapy is one of the most intimate jobs in the world. AI uses predictive models rather than human presence. Can empathy be simulated? Can a chatbot take into consideration our body language, tone of voice and meaningful silences?
Me: I’m feeling really low at the moment.
ChatGPT: I’m really sorry you’re feeling this way. You’re not alone, even if it might feel like it right now. Do you want to talk about what’s been going on or what’s brought you here? I’m here to listen—no judgment, no pressure.
Above is an interaction I recently had with ChatGPT. The reply was warm and considerate, and in that moment it made me feel heard and listened to. After talking with friends and colleagues, it seems many are using platforms such as ChatGPT to vent their problems and get support, whether that’s emotional or professional.
In 2024, AI was mostly being used as a different version of Google, for generic search and content creation, according to a recent Charité study by Schäfer, Krause, and Köhler (2025), published in Frontiers in Digital Health: ‘Exploring user characteristics, motives, and expectations and the therapeutic alliance in the mental health conversational AI Clare®: a baseline study’. In 2025, the top use of AI is relational, with millions of people turning to it for emotional support.
The study found that the top client motive for seeking out AI advice was that it let users avoid the ‘embarrassment of talking about their problems in person’. The highest-ranking expectations clients had of AI were that it would provide ‘emotional support’ and that they could ‘get things off their chest’.
The apparent positives are that it’s convenient, cheap, immediate and doesn’t carry the potential awkwardness of sharing one’s embarrassing or shameful thoughts and feelings with another human.
On the flip side, it can be over-validating and simply mirror instead of challenge. This could reinforce unhelpful patterns, delusional thinking and even psychosis in those with a predisposition. The study also found that higher loneliness in users correlated with stronger bonding with AI. This emphasises the importance of considering dependency and unhealthy attachments in users with mental health issues.
When something provides instant gratification, with no waiting a week for a session or even waiting your turn to speak, I wonder what impact that will have on how we relate to each other. Will human relationships feel frustrating and unsatisfying compared to the immediate hit of intimacy provided by an artificial intelligence?
I would also be worried about data protection and the wider safety and ethical challenges: preserving user autonomy, reducing the risk of manipulation, protecting privacy and keeping the user-technology relationship a healthy one.
For me, it only took a few weeks to see that ChatGPT was willing to agree with anything I said, and this was only useful up to a point. I need a human perspective, someone who can relate to my experiences and feelings because maybe they have felt and survived similar ones themselves. That is something I can take comfort in and learn from.