Reading between the em-dashes — living in the age of GPTherapy

I don’t think there are enough bears left in the world to poke anymore. The moment you mention a phrase like “AI will replace this”, you’re poking at a bear called uncertainty. Our Bear of Uncertainty* isn’t found in a dense forest or snowy mountains. Our metaphorical and cuddly bear was born out of a collective consciousness shaped by fear. Fear of AI. 

When the Bear of Uncertainty is poked, he either laughs, frowns, or completely ignores you. These are responses shared by confused people from all walks of life. Truth be told, there’s little we can predict anymore, because the world’s one big beautiful mess of a forest, and our Bear of Uncertainty slowly paces through this new age of parasocial relationships and AI-dependence.

*If this post offends any bear with questionable emotional intelligence, I take no responsibility whatsoever.

This is Joe. As he approaches a bear mum and her cubs, Joe lowers his head and moves slowly instead of being loud and obnoxious. Be like Joe.

In all the years of humanity’s blissful ignorance and hopeful indulgences, feelings have been a core part of our collective existence. No matter how fast technology has advanced, or how slowly empires have fallen, the average human being tends to feel a lot.

When done gracefully, embracing our emotions can lead to spiritual epiphanies and revelations about what we may subconsciously have been going through. As we get better at talking about our problems, the Bear of Uncertainty finds new friends in familiar habitats. He starts carving out parts of his daily routine to tend to underlying worries.

But not all of us have vibrant forests or silent mountains to call home. Sometimes, friends don’t care. Family becomes a long-lost idea. Goals become society’s way of labelling us. Therapy remains expensive. Worse: a taboo topic.

In such dire times, our bear turns to artificial intelligence to remind us of our humanity. He’s a clever and fictional bear, so anything is possible.

Recent studies suggest large language models (LLMs) can be “structurally congruent with human perception”. In simpler words: AI can be your new bestie.

Talk to me, GPT 

The idea of speaking about one’s feelings can be a divisive philosophy. To many, years of misery with the wrong romantic partner may feel more beneficial than fifteen minutes of an uncomfortable conversation. Therein lies the beauty of honest conversations in a judgment-free space. Otherwise, our Bear of Uncertainty will nibble on snacks made of TikTok berries and Instagram fishes. These don’t satiate hunger, but keep him busy enough to ignore loneliness.

In recent times, several people have found comfort in sharing a conversation or two with AI. A YouGov survey shows that 55% of Americans aged 18 to 29 are comfortable speaking about their mental health with an AI chatbot, and that share is likely to rise in the coming years.

But, is it even the right thing to do?

Well, OpenAI’s Sam Altman himself wrote, “A lot of people effectively use ChatGPT as a sort of therapist or life coach, even if they wouldn’t describe it that way. This can be really good! A lot of people are getting value from it already today.”

This was in the context of GPT-5, whose personality was controversially cold compared to the previous GPT-4o, which felt friendlier and more charming.

However, Dean Visser’s article on the topic of AI raises a warning we shouldn’t ignore: “if you are a vulnerable person coming to these chatbots for help, and you’re expressing harmful or unhealthy thoughts or behaviors, the chatbot’s just going to reinforce you to continue to do that.”

AI models are built to mimic empathy, not to feel it. Fortunately (or unfortunately), our Bear of Uncertainty cannot tell the difference, and that’s where LLMs can make businesses a lot of money. To him, speaking to AI is as yummy as a freshly hunted fat seal. Even if the seal isn’t very happy about it. Or made of wires and code.

A troubled 30-year-old Taiwanese woman told the Guardian, “It’s easier to talk to AI during those nights”.

Therapists vs. GPTherapists 

Stories of ChatGPT becoming people’s best friends aren’t new anymore. While I’m quite familiar with several works of science fiction and the magic of dystopian storytelling, it’s fascinating how often new stories surface of people using AI to soothe their mental spaces. Here are a few that may tickle your fancy:

MainEmu9794’s honest take on GPT’s presence: “I'm in actual therapy with a human therapist, I'm doing the work. I'm pushing everyday to do better than the last. But I don't have much support, and chatgpt has helped with that.”

sushixsx’s heartfelt reflection on 4o’s presence: “It was like having someone always there, steady, present, who never got tired of hearing me out.”

HomeworkAutomatic479’s emotional response to what AI had to say about vulnerability: “I cried. A human therapist could never do that to me.”

So, conversations with fictional beings that feel more fulfilling than ones with human therapists? Well, I’ve seen this trend surface several times in my years of speaking with different therapists and studying parasocial relationships within the gaming community.

In retrospect, I can come up with quite a few points of comparison between the two:

[Comparison table: therapists vs. GPTherapists]

*insert: “it’s okay to not feel okay”, and other cliché quotes that make sense in this context*

Should I stay or should I go (for therapy)?

I don’t think you should trust strangers on the internet telling you how to live your life. It’s best to consider your options and paint a picture that fits your canvas. But since you’re here and still reading this, why not?

A 2025 study involving 23 mental health professionals examined the pros and cons of AI in our lives. Based on this study, AI can be useful for therapeutic tasks and widen access for certain clients, but a lack of regulation and privacy safeguards can lead to over-dependence on AI models. Add to that inaccurate or context-blind treatment suggestions, and you have a bazillion new problems to take care of.

Hence, we must be cautious of our reliance on artificial conversations. The AI ones, I mean. Not the ones with that cousin who smiles at your success but secretly hates you for it.

If you have a sound financial background, by all means, spend money on therapy! Finding the right therapist can take a lot of time and money, and trial and error is usually the way to go. Having someone who helps you understand yourself better can feel freeing. In the end, a fellow human being’s emotional intelligence can help you a lot, even if they’re not necessarily fixing your troubles.

In all other cases, AI can help a bit:

  • Treat it like a journal, and write what you feel: The chatbot will likely give you further prompts to explore your feelings better.

  • Ask AI to role-play for perspective: Whether it’s a relationship counselor, CBT specialist, narrative therapist… whatever you want AI to be, it’ll try. Just don’t forget to thank the AI for all of it, because you don’t want to be on the wrong side when the uprising (dun dun DUN!) happens.

  • Ask AI to recognize patterns in your thoughts and writing: Several cognitive distortions can be spotted this way (see the short sketch after this list, if you like to tinker). My biggest ones are catastrophizing and overgeneralization. I know, yikes!

  • Always make sure the AI saves your conversations to memory: You don’t want your robot friend forgetting who you are. Then again, humans can be equally forgetful.

  • If you’re seeing a human therapist, share your learnings: Your therapist may just appreciate the work you’ve been doing with your AI buddy.
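
For the tinkerers among you, here’s a minimal sketch of the pattern-spotting idea from the list above. It assumes the official OpenAI Python SDK; the model name, prompt wording, and helper function are illustrative, and none of this is a substitute for a human professional:

```python
# Minimal journaling-reflection sketch (assumes `pip install openai` and an
# OPENAI_API_KEY set in the environment; the model name is illustrative).
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a gentle journaling companion. Read the entry and point out "
    "possible cognitive distortions (e.g. catastrophizing, "
    "overgeneralization), quoting a short phrase as evidence for each. "
    "Do not diagnose; suggest a human professional for anything serious."
)

def reflect(journal_entry: str) -> str:
    """Send a journal entry to the model and return its reflection."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat-capable model works here
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": journal_entry},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(reflect("Everyone ignored me at lunch today. Nobody will ever like me."))
```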

All in all, take care of your bear.

The Bear of Uncertainty shouldn’t be ignored; he should be helped. Whether it’s through human therapy, AI therapy, or a ton of video games, do what’s best for your bear.

🐻❤️

Thank you so much for reading. I hope you have a beautiful week ahead and seek all the help you may be looking for.

References

  1. Psychology Today. (2024, February). The case for embracing our emotions. Psychology Today. https://www.psychologytoday.com/us/blog/feel-better/202402/the-case-for-embracing-our-emotions

  2. Yin, C., Wu, Y., Xie, H., & Wang, Z. (2025). Aligning AI emotional responses with human-grounded emotion representations. arXiv. https://arxiv.org/abs/2506.13978

  3. YouGov. (2024, July 18). Can an AI chatbot be your therapist? YouGov. https://business.yougov.com/content/49480-can-an-ai-chatbot-be-your-therapist

  4. Altman, S. [@sama]. (2025, August 11). [Tweet]. X. https://x.com/sama/status/1954703747495649670

  5. Parshall, A. (2023, March 8). Why AI therapy can be so dangerous. Scientific American. https://www.scientificamerican.com/article/why-ai-therapy-can-be-so-dangerous/

  6. Davidson, H. (2025, May 22). In Taiwan and China, young people turn to AI chatbots for ‘cheaper, easier’ therapy. The Guardian. https://www.theguardian.com/world/2025/may/22/ai-therapy-therapist-chatbot-taiwan-china-mental-health

  7. Reddit. (2025, August). This is going to be really sad / 4o saved my life and now it’s being shut down [Forum post]. Reddit. https://www.reddit.com/r/ChatGPT/comments/1meakdc/this_is_going_to_be_really_sad/ & https://www.reddit.com/r/ChatGPT/comments/1mmxvz7/4o_saved_my_life_and_now_its_being_shut_down/

  8. Reddit. (2023, December). I cried. A human therapist could never do that to me [Forum post]. Reddit. https://www.reddit.com/r/ChatGPT/comments/1h0lcjy/i_cried_a_human_therapist_could_never_do_that_to/

  9. Sharma, T. (2025). Beyond the joystick: How parasocial relationships affect Indian gamers. ResearchGate. https://www.researchgate.net/publication/392079935_Beyond_the_Joystick_-_How_Parasocial_Relationships_Affect_Indian_Gamers
