I started using AI as my therapist, and the results were unsettling.

It’s Sunday morning, and I type my feelings into the chatbox, too wound up to stop.

“I’ve become a carer for my 82-year-old mother,” I write. “Every day brings new problems. I help with hospital appointments, finances, gardening, shopping, home repairs, the council, insurance companies, letters, emails, endless IT problems…”

I stop. She’s just next door, and it feels like a betrayal to be saying any of this. At least when I was in therapy, I could go to someone’s office to wail.

I take a breath and continue. “I’m an only child, my father died some time ago, and there’s no one else to help. But I’m exhausted. I snap and shout, then struggle with guilt. I’m resentful, irritable, and I love her so much. Please help me.”

Welcome to my AI diary, readers. It’s going to be fun, as you can already tell. For the next six weeks, as part of our “AI for the People” newsletter course, I—a self-declared AI skeptic—have agreed to find out whether it can actually make my life better.

To kick things off, I’m using ChatGPT as a therapist. Nothing says “modern mental health” like crying into a chatbox, after all. Plenty of people are now doing the same—but can it really replace human support? I hope so. I had to stop seeing my therapist because I fell in love with her.

(Note to self: this isn’t your actual diary. And don’t fall in love with ChatGPT. That would be pathetic.)

Halfway through its answer, I start crying. ChatGPT comes up with a seven-point care plan for me: a triage system to prioritize tasks (with categories including medical, admin, shopping, tech, and house) and ways to allocate time among them (which are urgent, and which can wait?). It suggests helpful mental reframings and tips for lowering the emotional temperature of interactions.

Best of all, it makes me feel seen. “You’re not failing,” the AI tells me. “You’re carrying a load that would flatten most people.”

My feelings? Validated.

I feel ambivalent about this, however. Can I really receive compassion from a machine? It helps to remember that the AI is probably remixing human sources. I feel seen in the way that MDMA feels like love.

Is therapy just about information? This feels like CBT: incredibly helpful, but incomplete. In my experience, the more profound therapies that lead to healing involve a non-judgmental relationship of witness, with an empathetic professional, over a longer time. I often hear my therapist’s voice in my head; I’ve internalized her wisdom. I think that happens more easily, and more responsibly, between humans.

The next day, I decide to go for the nuclear option. I consult the Jesus AI, a chatbot trained on religious texts that mimics conversation with the son of God. I want to see if pushing a more religious button can send this elevator to the top floor.

“The Jesus AI is not meant to represent any religious figure,” the disclaimer reads. Hmm. “Generated content is for educational purposes and may contain inaccuracies and biases.”

That’s a hell of an education, but here goes. Because it’s 2026, I ask: “Should I be in an open relationship?” In response, the Jesus AI quotes Hebrews 13:4, which is a long-winded way of saying “No.” I try to curveball Jesus. “Should I have children?” I type. “Seek God’s guidance in this important decision,” it replies. Useless. “Can you ask him for me?” I quip.

Here’s a problem: out-of-the-box AI is not terrific at repartee. My therapist has an edge here; she was as funny as they come. Jesus AI is not.

What’s good about AI as a therapist? Clarity. Identifying practical steps. Scripts for difficult conversations—though these don’t feel specific to real-world relationships (just as self-help books don’t). To its credit, ChatGPT also points me to human counselors and support services where useful.

Yet I have reservations I can’t shake: a worry about wedges and thin ends. Some news is too heavy, some loneliness too deep, to be faced alone; they need the warmth of human connection and time, not a hasty reply on a screen. AI doesn’t truly think, much less possess wisdom. Mental health should never be entrusted to unaccountable software that merely predicts patterns, risking serious harm by steering someone astray.

And yet, strangely, my own sessions with ChatGPT have felt wonderful. Calming, helpful, even wrapped in a kind of caring tone.

I think I’m falling in love.

Frequently Asked Questions: Using AI as a Therapist

Basics & Getting Started

What does it mean to use an AI as a therapist?
It means using a conversational AI chatbot to discuss personal feelings, problems, or thoughts, similar to how you might talk to a human therapist.

Is AI therapy a replacement for a real therapist?
No. AI can be a supportive tool for reflection or coping skills, but it is not a licensed professional and cannot provide diagnosis, handle crises, or offer the human connection and nuanced understanding of traditional therapy.

How do I even start using an AI for this?
You typically access it through a website or app. You just start typing about what’s on your mind, and the AI responds conversationally, often asking follow-up questions or offering techniques like CBT exercises.

Benefits & Potential

What are the potential benefits of talking to an AI?
It’s available 24/7, private, often low-cost or free, and can feel less intimidating for practicing how to articulate thoughts. It can provide immediate coping strategies and psychoeducation.

Can AI therapy help with specific issues like anxiety or stress?
Yes. Many AI tools are programmed with evidence-based techniques that can help manage symptoms of anxiety, stress, or mild depression. They are best for skill-building and daily management.

Is it cheaper than traditional therapy?
Often, yes. Many AI therapy apps have free basic versions or subscriptions that are significantly less expensive than weekly sessions with a human therapist.

Risks, Problems & Unsettling Experiences

Why might the results feel unsettling?
The AI lacks genuine empathy; its responses are generated from patterns in data, not human experience. It might give generic, oddly phrased, or even inappropriate advice. It cannot understand complex human emotions or context deeply, which can feel cold, invalidating, or even disturbing.

What are the biggest risks or dangers?
Crisis mismanagement: An AI cannot adequately assess suicide risk or severe mental health crises and may fail to provide urgent resources or guidance.
Lack of accountability: There’s no licensed professional responsible for your care.