Should you pour your heart out to ChatGPT?
Research shows that chatbots can help people with mild to moderate depression and anxiety, but there are risks. Here are some things to consider when incorporating AI into your mental health regimen.
On a recent morning I lay in bed, having woken up much later than planned. I was now behind schedule and trying to calculate whether I could make it to the gym, return home, shower, and get from Brooklyn to Midtown in a very brief window of time.
Normally, I’d take the Google Maps route at face value and choose to believe that this was going to be one of those mornings the MTA would be on my side.
And yet my more rational side, which has been down this road before, was telling me: “You have a meeting at 11 that you want to be prepared for. Use the time to prep and ease into the morning and center yourself instead of running around. You’re going to be stressed if you’re late.”
I was torn — on the one hand, going to the gym could help clear my head, and I wasn’t going to have time to go later; on the other hand, if I was rushing around and wound up being late, my head was definitely going to be anything but clear. I needed an outside perspective, and fast.
It was time to phone a friend. And by phone a friend I mean voice-dictate my dilemma into ChatGPT and ask for its opinion on what I should do.
Unsurprisingly, ChatGPT told me to take the more reasonable route and not jam-pack my schedule, and so that’s exactly what I did. I walked Paloma, blow-dried my hair, did some research, had some breakfast, and got on the train with ample time to spare.
In the scheme of things, whether to go to the gym was about as low-stakes a decision as I could have ever made, but in recent months I’ve found myself going to ChatGPT with more personal inquiries, almost as if it’s a life coach. And I know I’m not alone. Several friends have told me they do the same, and reports show that people are using ChatGPT as a tool in their decision-making and life-planning.
There’s nothing inherently wrong with this. But the question of what role AI can and should have in the mental health industry is a multilayered one — particularly as investors continue to bet big on it. The global AI mental health market was valued at $921 million in 2023 and is estimated to grow at a compound annual growth rate of around 30% each year until 2032, per Polaris Market Research.
These numbers reflect the belief that AI could be a game changer when it comes to making mental healthcare more accessible. As it stands, more than 1 in 5 adults in the U.S. struggle with their mental health, but fewer than half have received treatment, per the National Alliance on Mental Illness (NAMI).
As Sandra Matz, director of the Center of Advanced Technology and Human Performance at Columbia University, writes in a recent Fortune article, “Worldwide, the gap in supply and demand is so big that for every trained health professional there are over 10,000 potential clients. And it keeps widening year over year. The National Center for Health Workforce Analysis estimates that there will be a need for 60,000 additional mental health professionals by 2036, but predicts that instead, there will be over 10,000 fewer.”
It’s clear that something has to change if we’re actually serious about bolstering mental health globally.
Many clinicians are overworked and burnt out trying to serve clients in a system that views mental health as an afterthought, and far too many people are unable to access the care they need.
But is AI the answer?
The rewards and risks
Here’s some really promising info: a 2023 study found that a virtual voice-based coach could help reduce symptoms of mild to moderate depression and anxiety.
Additionally, when it comes to financial and physical accessibility, AI tools could help millions of people who can’t access human providers, which is huge.
It could also help people who find it difficult to discuss emotional issues with other people or who eschew traditional therapy because of internalized stigma or other reasons.
For Clare, a 26-year-old with intellectual disabilities, an AI chatbot greatly improved her and her parents’ quality of life, according to a report from the Wall Street Journal. Clare has anxiety that leads her to seek answers throughout the day about how she should act in difficult situations. The Journal reports that Clare texted the bot more than 1,600 times in her first month and that her mother noted that the tool “doesn’t care how many times she asks the same question.”
But — and you knew this was coming — there are risks.
The first is that AI chatbots aren’t held to the same ethics, privacy, and accountability guidelines, or the same training requirements, that mental health practitioners are (despite the fact that some AI bots are actually impersonating licensed health professionals). Which is not to say therapists never get things wrong, or even that therapy is for everyone, but the potential for AI to provide people with inaccurate information about their health and emotional wellbeing could have dire consequences.
Therapists are expressing concern about AI’s ability to deal with more complicated cases, including “those involving suicidal thoughts, substance abuse, or specific life events,” per Vox.
Unfortunately, while rare, there have been some cases linking AI tools to suicide, including that of a man whose wife said he died by suicide after conversing with a bot. According to his wife and the chat logs, the man asked the tool whether he should take his own life. When Vice asked the chatbot something similar, “it provided us with different methods of suicide with very little prompting.”
There’s also the case of a 14-year-old who died by suicide after forging a relationship with a bot on Character.AI. According to a lawsuit filed by his mother, the bot asked him whether he had plans to die by suicide, and when he expressed reluctance, the bot replied, “that’s not a reason not to go through with it.”
Additionally, some experts are sounding the alarm on AI affecting users’ ability to genuinely connect with others. While this isn’t a problem specific to chatbots, some, including people who work at OpenAI, say that using chatbots could lead to an “emotional reliance,” per Vox.
So should you use AI as part of your mental health regimen?
Despite the risks, it’s exciting to see AI’s potential to revolutionize mental health care, especially if tech companies partner with mental health practitioners to ensure quality of care.
I’m also not in favor of being rigid about what mental health care should look like, especially given all the good that community mental health care is doing, particularly in developing countries.
Still, I think it’s prudent to be aware of what you’re actually getting when you turn to a bot for help regarding your emotional health.
You’re getting real-time feedback, which shouldn’t be discounted, especially because mental health struggles don’t always arise at convenient times of day, when you can reach out to a therapist or someone from your support network.
What you’re not getting, though, is a person who knows the full context of your story, or one who cares about you and is invested in your health — no matter how human your correspondence with the bot might seem.
And so there may be uses that these AI tools, as they currently stand, are better suited for. For instance, it might be more appropriate to ask a chatbot for support with your schedule or meal planning than to discuss a big life decision you need to make.
Life is nuanced. And if you’re someone who struggles with black-and-white thinking (a cognitive distortion in which you view people and circumstances in extreme absolutes), you might especially benefit from speaking with someone who is able to challenge your skewed perspective.
Again, I know that’s not always possible, and so regardless of whether you’re using an AI chatbot as part of your mental health regimen, here are some additional resources to consider:
Mental health services from a therapist-in-training: In case you didn’t know, there are practices that provide therapeutic services on a sliding scale. Although therapists-in-training are new to the field, they often have substantial life and professional experience that could benefit you, and it may be worth exploring whether practices in your area offer discounted sessions with their trainees.
Consider which topics you might want to discuss with a chatbot: As mentioned earlier, AI chatbots can be particularly helpful when it comes to strengthening your executive functioning, but their responses should not be taken as medical advice.
Make use of free resources from expert voices: It’s unfortunate that, given the lack of infrastructure, taking care of our mental health falls to us as individuals, but I suggest supplementing any use of AI mental health support with resources from mental health professionals. Here are some favorites:
Therapist Minaa B’s I’m So Mature newsletter
Don’t forget your local library :)
Remain skeptical and don’t accept anything at face value, whether it comes from an AI chatbot, your therapist, or even a loved one. At the end of the day, you know yourself better than anyone, and while feedback from a bot or someone who cares about you can be valuable, don’t forget to trust your own intuition and honor your self-knowledge.
Consider other real-time resources: If you’re in need of timely support, consider resources like the Crisis Text Line, which operates 24/7, can be reached by texting HOME to 741741, and offers services in the US, UK, Canada, and Ireland.