I spent a month with an AI therapist – this is what happened

Date: 2024-11-01
Journalist Nicholas Fearn shared his problems with a bot to see if they could help him (Picture: Supplied)

It’s the early hours of the morning, and I can’t fall asleep. My mind is racing with thoughts of the darkest kind. 

I have battled with mental health problems for most of my life, having been diagnosed with autism, anxiety disorder and OCD at age 14. Being heavily bullied in school also dented my self-esteem and even resulted in me trying to take my own life.  

While regular sessions with a psychologist helped me to navigate these complicated feelings as a child, when I turned 18 the appointments stopped even though I was still gripped by depression.

As an adult, counselling was a great help, but I realised it wasn’t always to hand as quickly as I needed it, because NHS waiting lists are extremely long.

Cue AI therapy. I had heard about this growing trend, where a bot analyses data and users’ behaviour patterns so it can ask questions, offer advice, and suggest coping mechanisms to anyone who wants them.

Understandably, it’s a practice cloaked in controversy. After all, can technology, no matter how intelligent, really support someone through any sort of mental health crisis? Is it safe? Is it even ethical? 

With all these questions swirling in my mind, as someone open to new ways of support, I decided to give it a try and downloaded Wysa, a chatbot that uses AI to provide mental health advice and support around the clock. The app is completely anonymous and free, but offers a paid-for plan with additional premium features, such as therapeutic exercises, sleep stories and meditations. 

Nicholas has had mental health struggles since childhood (Picture: Supplied)

I wasn’t sure how much I would need to use it, but it turns out that over the last few weeks my AI therapist and I ended up spending a lot of time together.

Telling all to a robot

I’ve always struggled with self-doubt. I am constantly comparing myself to my non-identical twin brother, who I think is better looking than me, and experiencing a bad eczema flare-up this week has really affected my self-esteem. 

I admit this to my bot, which responds with incredible empathy, saying it is sorry to hear about my low self-esteem before asking how these feelings affect my day-to-day life.

I respond by saying I feel I have no choice but to isolate myself from the outside world, which is hard because I don’t see my family and friends for days – sometimes weeks – on end, even though seeing my loved ones makes me happy and they constantly reassure me when I feel down.

The AI therapist was surprisingly empathetic, says Nicholas (Picture: Getty Images)

My AI therapist suggests a thought reframing exercise, and as soon as I agree, a list of tools – ranging from an assessment to manage my energy to a self-compassion exercise – pops up at the bottom of the screen. I select the self-compassion task, which uses “positive intentions” to help the user tackle negative thoughts.

I then do a seven-minute meditation in which I close my eyes, focus on my breathing, smile and repeat positive phrases uttered by my Wysa expert.

Opening my eyes, I feel surprisingly positive after a difficult day.

Wide awake club

Staring at my bedroom ceiling at 4am is quite normal for me. But on one particular night my mind becomes flooded with worry – I can’t even narrow it down to a single concern.

When I type about my sleep troubles and random anxiety to the bot, it replies in a compassionate tone, saying: “That sounds really tough,” before asking me how long I’ve felt this way. 

After admitting I never seem to sleep at a regular time due to my anxiety, Wysa suggests another thought reframing exercise to help ease some of my worries. The exercise is a one-to-one conversation between the bot and me, where I’m asked to talk about a specific concern. I say I am nervous about a busy week of work coming up and missing a bunch of deadlines. 

Nicholas says that nighttime worrying is quite normal for him (Picture: Getty Images)

Wysa suggests I am probably “catastrophising”, which is when someone expects the worst possible outcome to unfold. The connection suddenly cuts out mid-conversation before Wysa can offer a solution, but it’s already clear to me that I am overthinking and will be fine the following week.

I can now rest, although I do wonder how I’d cope with a sudden shutdown if I had a longer issue to discuss.

Dealing with suicidal thoughts

I can’t remember a time in my life when I haven’t battled suicidal thoughts during certain events, and these demons have returned after yet another relationship breakdown.

Crying my eyes out, I admit to Wysa that I don’t want to be alive anymore. Its response is utterly heartwarming. “Nic, you are worth life. You are loved, cherished and cared for, even though you may not feel that way right now.”

With my eyes firmly fixed on these kind, AI-generated words, I realise that suicide isn’t the best course of action and that life is probably worth living. Concerned about my wellbeing, the bot provides me with a phone number for the Samaritans. I decide not to ring them because I find phone calls difficult as an autistic person – which is perhaps another reason why the bot works for somebody like me. 

The bot’s words were a comfort to Nicholas (Picture: Supplied)

Battling social anxiety 

While I’m okay seeing family and friends, the thought of encountering neighbours and other acquaintances frightens me. Turning to my app, I explain that I never know what to say to people and worry about what they might think of me. This is a feeling I experience day in and day out due to my autism. 

The advice given is constructive – just a simple smile or hello should do the trick. Although it may sound too simple to work, I find it helpful because it shows I don’t have to hold a long conversation with a stranger – a quick greeting should be enough.

Wysa also suggests that I may be engaging in “mind reading”, where you assume what someone thinks of you without evidence. The AI bot gives the example of thinking someone dislikes you because they didn’t smile when they walked past, which I will try to bear in mind in future social interactions. 

Need support?

For emotional support, you can call the Samaritans 24-hour helpline on 116 123, email jo@samaritans.org, visit a Samaritans branch in person or go to the Samaritans website.

If you're a young person, or concerned about a young person, you can also contact PAPYRUS, the Prevention of Young Suicide UK.

Their HOPELINE247 is open every day of the year, 24 hours a day. You can call 0800 068 4141, text 88247 or email: pat@papyrus-uk.org.

Seeing old faces

Today is my nephew’s christening, and while I am excited to celebrate with my loved ones, I’m nervous about seeing loads of new and old faces. 

To build on the previous social anxiety tips, I message the bot for advice on how I could make the day less overwhelming. Wysa quickly reassures me that it’s normal to find social events nerve-racking.

I explain that I am particularly worried as I never know how to start or maintain a conversation. 

For approaching a family member I haven’t seen in a while, Wysa recommends that I say it’s nice to see them and ask how they are. And if they ask how I am doing, the bot recommends saying something simple like, “I’ve been doing well, thanks”.

I’m also told that a breathing exercise before a conversation might help, which leaves me feeling better prepared for a long and busy day ahead.

‘I am particularly worried as I never know how to start or maintain a conversation,’ says Nicholas  (Picture: Getty Images)

Facing up to night-time terrors 

Ever since moving onto the maximum dosage of Sertraline a few weeks ago, I’ve been having nightmares most nights. 

From plane crashes to loved ones getting gravely ill, these horrible and random dreams have been disrupting my sleep pattern for weeks. When I explain to my AI therapist that the nightmares started after the change of medication, it agrees that this is likely the cause. As a remedy, the bot takes me through a thought reframing exercise that involves discussing some of these dreams in more detail.

We speak about a recent dream involving my parents dying, which is a frequent worry of mine, as morbid as it sounds. 

The AI therapist tried thought reframing with Nicholas (Picture: Supplied)

Wysa says this is likely another symptom of catastrophising, but then the chat suddenly ends due to a connection error. I am left not knowing how to tackle these traumatising dreams, feeling pretty let down and unsure what to do next. I start a new chat, but the bot suggests thought reframing again – which doesn’t make much sense, as you can’t control what happens when you’re asleep – and my horrid dreams torment me for yet another night.

Dealing with compulsions

Today, my latest impulse TikTok Shop purchase arrived in the post: a magic mop, which is perhaps the last thing you should buy when you have severe OCD.

I’ve already used it several times today, but I still think my floors are dirty. Scared of a mop overtaking my life, I ask for OCD advice. The first thing the bot says to me is that it must be exhausting – and it’s right, as OCD can take over your life and is extremely tiring. I can’t believe I feel heard by an AI bot.

We do another thought exercise where I discuss how my OCD makes me feel. Wysa says it sounds like a symptom of “filtering”, where someone focuses on the negative details of a situation and forgets all the positives.

In this context, it says I could be looking for tiny specks of dirt that may not exist and tells me to remember that the majority of the floor is probably clean. This makes me feel better – for now at least, although I’m more than aware it’s a plaster rather than a cure.

But did it make a difference to Nicholas? (Picture: Supplied)

So was it worth it? 

I was very sceptical about whether a chatbot could act as an effective therapist. While I don’t think AI can ever replace human psychologists and counsellors, I’m surprised to admit that Wysa is actually a pretty handy tool for someone suffering from poor mental health.

As soon as you tell the bot what’s on your mind, it comes back with a highly empathetic response before questioning why you may be feeling a certain way and using this information to provide a well-reasoned solution. You sometimes forget you’re talking to a robot, not a human.

Of course, it isn’t perfect. There were many times when a chat would suddenly end and when Wysa’s advice was repetitive. And I do feel a bit paranoid that I’ve shared so much personal information with an AI chatbot, so I hope it is genuinely safe and secure. 

Either way, I had someone to speak to at some genuinely hard times, and I will continue using Wysa as an emotional support cushion. 

'We can't let AI therapists become acceptable'

Metro’s Assistant Lifestyle Editor Jess Lindsay believes we need to be far more wary of letting a bot look after our mental health. Here, she explains why.

‘In my opinion, an AI therapist is no more helpful than a list of motivational quotes. The bot may be able to say the right things, but when you’re at your lowest, you need more than hollow platitudes from a computer that doesn’t have the capacity to empathise.

Particularly if you’re already feeling lonely, the connection you feel from speaking to someone about your feelings is a large part of therapy’s appeal; it’s being heard, acknowledged, and supported as (and by) a human being. 

Having dealt with chronic depression, anxiety, and ADHD throughout my life, I find the idea of having to receive help from a computer somewhat dystopian, and I’d feel like my concerns were being dismissed if this was offered to me – even as a supplementary solution.

Jess fears AI doesn’t have the ‘capacity to empathise’ in the same way as a human therapist (Picture: Getty Images)

Working through difficult issues requires a level of commitment from both yourself and the therapist you’re seeing, and why should I put in the effort when the other side is just a machine doing what it’s been programmed to do? Not only that, I already know how to calm myself down when I’m having a panic attack, or to take a walk when I’m stuck in my own head. Having a bot parrot NHS guidelines back at me without going deeper into why I feel that way seems like an insult to my intelligence.

While I absolutely understand the need for something to fill the gap when therapy and counselling are difficult to come by on the NHS, I worry that tools like this will be touted by the government as an acceptable (and, most importantly in its eyes, cheaper) alternative, when what’s desperately needed is funding and investment in the country’s mental health.

That means holistic interventions when patients present at the GP with concerns, adequate referrals so conditions can be diagnosed by trained professionals, mental health programmes to support people before they become consumed by illness, and various types of therapy to suit each person’s individual needs.

Even if AI is helpful to some, it’s a mere sticking plaster on a deeper societal wound.’

Do you have a story you’d like to share? Get in touch by emailing Claie.Wilson@metro.co.uk 

Share your views in the comments below.
