As an AI Language Model, I Cannot Be Your Therapist

by Emerson Lee | Thursday, Jan 18, 2024

I use ChatGPT a lot, so of course I've used it as a life coach (Barbie as my life coach, to be exact). But today I decided to mix it up with a prompt I found on Reddit: a long, detailed one that started with “Behave as a cognitive/dialectical behavior/humanist/psychoanalyst therapist...”


Now, I know what you’re thinking: AI is not a substitute for therapy! I know, I know. And I agree. ChatGPT does too, because if you don’t ask it to be your therapist in the right way, it rejects you and tells you to go and get your stinky little brain a real therapist. But the future is barreling at us faster than we can keep up, and no one knows what role AI will play as it gets smarter and eerier by the day. There has been a moral panic over just about every technological innovation, and we’ve grown accustomed to every one of them so far. So what if AI’s role in mental health is something we’ll have to accept in the not-so-distant future?

There are benefits to AI-assisted therapy. Therapy is extremely expensive, and the free availability of AI lowers the barrier to entry for those who need care but would otherwise have trouble accessing it. Needing only an internet connection also removes the inconvenience of driving to a therapist's office. Why go all that way when a more accessible therapist is at your fingertips? Those with depression who can't muster the energy to get out of bed, let alone get to therapy, could receive help with a tap. Those with social anxiety who fear interacting with humans could be helped by the truly non-judgemental nature of a robot. The same goes for those who find therapy embarrassing and wouldn’t otherwise see a human therapist: some help is better than none.


As wonderful as never interacting with another human being again sounds, there are concerns about never having to leave the house. Phobias are treated through exposure therapy (gradual exposure to the feared object) and depression through behavioral activation (engaging in enjoyable or challenging activities to increase the chance of feeling positive); both require going outside and out of one’s comfort zone. The availability of AI chatbots could discourage people from getting the human interaction they need to improve their quality of life, and may give them the false impression that they are receiving adequate treatment from their digital therapist.


Despite this, therapy chatbots could realistically help with psychoeducation, a component of many therapies that does not necessarily require human connection or a high level of nuance. They could educate people in therapy about the techniques they are using, and about mental health in general, and then answer any questions they may have. An advanced AI could, in theory, help with changing cognitive patterns, as in CBT, especially for milder, short-term issues rather than chronic ones. But of course, the lack of human connection has its costs: a chatbot cannot detect body language, tone, and similar nuances, no matter how intelligent it is. Additionally, a component of humanist therapy is the experience of being accepted unconditionally by another human. A chatbot will likely not help with deeper self-esteem issues, because a robot cannot exactly mimic human acceptance.


A final concern is privacy. Some therapy chatbots (such as Replika and Woebot) have shared personal information with third parties. Most people do not think about online privacy and are “okay” with handing over very personal information, such as addresses along with personal life issues, to an AI without a second thought. Licensed therapists, on the other hand, are required to take confidentiality very seriously. This raises ethical concerns about the use of online AI therapy.


To write this blog post, I asked my new ChatGPT therapist to help me with my writing anxiety. It asked me a bit about my emotions, then, to my surprise, asked what the essay was about and led me through writing an outline. This helped me overcome my initial anxiety, and I was good to go after that...until I wasn’t, and I needed what I learned in actual therapy to help me along. Still, it seemed to know precisely what I needed in that moment; considering AI is only in its nascent stages, that’s scary.


Some are concerned it could replace therapists, and though I'm inclined to say people would never choose AI over a human therapist, I think we just don't know what the future holds. The world is developing too fast to tell. I don’t think we should view how AI develops from here as something beyond our control, though. Technology is ultimately a tool, as evil or as angelic as the humans who harness it. If we can use AI in the right ways to assist with therapy, we could lower costs and reach more of those in need. On the flip side, if we rely too much on technology and neglect our flesh suits, mental health could decline as people turn to AI for emotional support (heard of AI girlfriends?) rather than reaching out of their comfort zone toward their fellow humans. I encourage you to try ChatGPT to help brainstorm creative solutions to problems. Use it to your advantage if you haven’t already. But don’t become a cyborg, and remember the wise words of ChatGPT: “As an AI language model, I cannot be your therapist.”