
by Dara Morgan

Personal Experience: Using ChatGPT As a Personal Coach

13 Aug 2025

Image: ChatGPT x The Sandy Times

I have been in therapy for a few years now, making casual references to my specialist in conversation like one might name-drop a Michelin-starred chef. And guess what? Over the past four years, he has become so popular that his hourly rate is now two and a half times the original price (and no referral programme, for God’s sake). Recently I decided that one session a month is perfectly adequate — just enough to keep mental hygiene in check without spending every Tuesday digging up my childhood traumas like buried treasure.
So yes, I am fine. I have friends to reach out to when I feel stuck or low. But a few days ago I found myself in the middle of a personal drama that I couldn't process — and, moreover, had no emotional energy left to share it with anyone close. Rationally, I knew it was not a catastrophe (just a local situationship saga), and I certainly didn't need to burn through half my monthly grocery budget on an emergency therapy session. But I was overwhelmed, and my mind was a cluttered flat in need of urgent tidying.
And that is when the latest version of ChatGPT walked in — metaphorically, of course. Why is it that I can ask it to find me a holiday deal, identify an obscure 1980s TV actor, or generate an image of a cat holding an iPhone, but not lean on it for a shoulder to cry on? This time, I did.

Image: ChatGPT x The Sandy Times

First comes the prompt

I began by writing out the backstory, sharing my emotions, and asking it to help me sort things out. I framed it with: imagine you are a professional psychologist — empathetic yet reasonable, supportive yet grounded in reality.
I even included the uncomfortable bits (for example, my frank admission that the person I fancied was essentially a human-shaped flag factory, all of them bright red). This is much easier to admit to a chatbot than to friends, who, let us face it, already have their opinions locked and loaded.
Then I pressed enter.

The immediate effect

I was already aware of the benefits of journaling, but honestly, does anyone actually keep it up besides the lifestyle influencers on your Instagram feed? Yet there I was: having written everything down, I realised I had effectively been journaling. I had poured it all out — and ChatGPT did what it does best: structure. Bullet points. Neat headings. Clear conclusions.
It was as if my messy emotional drama had been repackaged into a workplace project brief. Oddly enough, that made me laugh. And in laughing, I felt lighter.

Image: ChatGPT x The Sandy Times

The discussion

ChatGPT was both kind and firm. Yes, Dara, the one you are pining for is a red flag. Yes, they are emotionally unavailable. Yes, you should keep your distance and direct your energy elsewhere.
This is where the argument began. I started bargaining with myself in the familiar style — “maybe” and “what if” on loop — but this chatbot was having none of it. No manipulation worked. Not even my best surely-you-can-bend-the-truth-for-me routine.
I was slightly irritated — why could it not just lie and say everything would unfold exactly as I was fantasising? But the irritation served a purpose: it shifted my focus from self-pity to being mildly cross with the robot. Oddly therapeutic, that.

The consequences

Naturally, GPT provided a carefully crafted action plan, a few meditation exercises, and a bullet-pointed moral assessment of my situation. By then I felt sufficiently grounded, so I tucked the advice away for possible later use.
That evening, I told the story to friends without spiralling or turning the conversation into a three-act tragedy starring me, myself, and I. I still needed their warmth and good humour — the things no AI can generate — but I didn't monopolise the night with my woes. GPT had steadied me, and at the price of a subscription rather than a small mortgage repayment.

What the medical community thinks

In the quietly serious halls of the medical community, ChatGPT is more of a curious sidekick than a therapist-in-chief. Numerous studies and expert statements agree: AI tools can help people off-load thoughts, summarise feelings neatly, or provide psychoeducational snippets like journaling prompts or CBT-style exercises. Yet most psychiatrists and clinicians remain sceptical of their emotional chops — only about 4% believe AI could ever replace a human in providing genuine empathetic care.
Recent research also highlights potential hazards. Stanford scholars have reported that AI chatbots can unintentionally encourage delusional thinking by validating whatever the user types — a phenomenon some have dubbed “chatbot psychosis”. The American Psychological Association has flagged that unsupervised bots posing as therapist-like helpers might risk harm. And in one particularly absurd cautionary tale, a man reportedly developed bromide poisoning after following ChatGPT’s dietary advice to the letter — proof that even well-intentioned AI suggestions can take a spectacularly wrong turn.
Still, some experts see potential in a hybrid model. Psychiatrists such as Dr Jacques Ambrose point out that AI could lighten administrative burdens, personalise interventions, and expand access — provided its use is supervised by qualified clinicians. In short: it can be a useful organiser, but not your shoulder to cry on.

Image: ChatGPT x The Sandy Times

Final thoughts

The internet is packed with warnings: AI isn't your therapist. It cannot replace human connection. I agree. I have real-life friends, a real-life therapist, and critical thinking on standby. But in this case, ChatGPT was simply another tool — not a substitute — that happened to be available instantly.
At the end of the day, it is a decent journaling assistant: incapable of genuine connection, but more than capable of helping you get things out of your head and into some semblance of order.
Disclaimer: The experiences described above are personal and shouldn't be taken as professional advice of any kind. Artificial intelligence tools, including ChatGPT, can't provide therapy, diagnose mental health conditions, or replace the care of a qualified mental health professional. While AI can sometimes help you organise your thoughts, offer general information, or prompt self-reflection, it cannot form a therapeutic relationship, read non-verbal cues, or offer the depth of human empathy required for effective mental health support. If you are struggling with your mental health, experiencing distress, or have concerns about your wellbeing, please seek help from a licensed therapist, counsellor, psychologist, or other qualified healthcare provider. In an emergency, contact your local crisis helpline or medical services immediately.