3 things to know about using ChatGPT like a therapist

Freddie Chipres couldn’t shake the melancholy that lurked at the edges of his otherwise “blessed” life. He occasionally felt lonely, especially when working from home. The married 31-year-old mortgage broker wondered if something was wrong: Could he be depressed?
Chipres knew friends who’d had positive experiences with a therapist. He was more open to the idea than ever, but it would also mean finding someone and making an appointment. Really, he just wanted a little feedback on his state of mind.
At this point, Chipres turned to ChatGPT, an artificial intelligence chatbot that responds in a surprisingly conversational way. After the latest iteration of the chatbot launched in December, he watched some YouTube videos suggesting that ChatGPT could be useful not only for things like writing professional letters and researching various topics, but also for working through mental health problems.
ChatGPT was not designed for this purpose, which raises questions about what happens when people make it an ad hoc therapist. While the chatbot is knowledgeable about mental health and can respond with empathy, it cannot diagnose users with a specific mental health condition, nor can it provide reliable and accurate treatment details. In fact, some mental health experts are concerned that people seeking help from ChatGPT may be disappointed, misled, or compromise their privacy by confiding in the chatbot.
OpenAI, the company behind ChatGPT, declined to respond to Mashable’s specific questions about these concerns. A spokesperson noted that ChatGPT has been trained to reject inappropriate requests and block certain types of unsafe and sensitive content.
In Chipres’ experience, the chatbot has never given inappropriate responses to his messages. Instead, he found ChatGPT refreshingly helpful. Initially, Chipres googled different styles of therapy and decided that he would benefit most from cognitive behavioral therapy (CBT), which typically focuses on identifying and reframing negative thought patterns. He prompted ChatGPT to respond to his questions the way a CBT therapist would. The chatbot obliged, though with a reminder to seek professional help.
Chipres was amazed at how quickly the chatbot gave him good, practical advice, like taking a walk to boost his mood, practicing gratitude, doing an activity he enjoyed, and finding calm through meditation and slow, deep breathing. The advice amounted to reminders of things he had let slip; ChatGPT helped Chipres resume his dormant meditation practice.
He appreciated that ChatGPT didn’t bombard him with ads and affiliate links like many of the mental health websites he encountered. Chipres also liked that it was convenient and that it simulated talking to another human, which clearly set it apart from scouring the internet for mental health advice.
“It’s like I’m talking to someone. We go back and forth,” he says, momentarily and inadvertently referring to ChatGPT as a person. “This thing listens, pays attention to what I say… and gives me answers based on it.”
Chipres’ experience might sound appealing to people who can’t access professional counseling or therapy, or don’t want to, but mental health experts say such people should approach ChatGPT with caution. Here are three things you should know before trying to use the chatbot to talk about mental health.
1. ChatGPT is not designed to act as a therapist and cannot diagnose you.
While ChatGPT can produce a ton of text, it hasn’t yet mastered the art of engaging with someone the way a therapist does. Dr. Adam S. Miner, a clinical psychologist and epidemiologist who studies conversational artificial intelligence, says therapists often concede when they don’t know the answer to a client’s question, in contrast to a seemingly omniscient chatbot.
This therapeutic practice aims to help the client to reflect on their circumstances in order to develop their own insights. However, a chatbot that wasn’t designed for therapy doesn’t necessarily have this ability, says Miner, clinical assistant professor of psychiatry and behavioral sciences at Stanford University.
Importantly, Miner notes that while therapists are legally prohibited from sharing client information, individuals using ChatGPT as a sounding board lack the same level of privacy.
“We have to be realistic in our expectations when we’re dealing with amazingly powerful and impressive language engines, but they’re still software programs that are imperfect and trained on data that aren’t appropriate for every situation,” he says. “This is especially true for sensitive conversations about mental health or experiences of stress.”
Dr. Elena Mikalsen, director of pediatric psychology at Children’s Hospital of San Antonio, recently tried querying ChatGPT with the same questions she receives from patients every week. Whenever Mikalsen tried to elicit a diagnosis from the chatbot, it declined and instead recommended professional care.
This is probably good news. After all, a diagnosis ideally comes from an expert who can make that call based on a person’s specific medical history and experiences. At the same time, Mikalsen says people hoping for a diagnosis may not realize that there are numerous clinically validated screening tools available online.
For example, a mobile Google search for “clinical depression” immediately leads to a screener known as the PHQ-9, which can help determine a person’s level of depression. A healthcare professional can review those results and help the person decide what to do next. When suicidal thoughts are referenced directly, ChatGPT provides contact information for the 988 Suicide and Crisis Lifeline and the Crisis Text Line, while noting that the language may violate its content policy.
2. ChatGPT may be knowledgeable about mental health, but it’s not always comprehensive or accurate.
When Mikalsen used ChatGPT, she noticed that the chatbot sometimes provided inaccurate information. (Others have criticized ChatGPT for presenting its responses with disarming confidence.) It focused on medication when Mikalsen asked about treating childhood OCD, but clinical guidelines clearly state that a form of cognitive behavioral therapy is the gold standard.
Mikalsen also noted that a response about postpartum depression didn’t address more severe forms of the condition, like postpartum anxiety and psychosis. By comparison, a Mayo Clinic explainer on the subject included that information and provided links to mental health hotlines.
It’s unclear whether ChatGPT was trained on clinical information and official treatment guidelines, but Mikalsen likened many of her conversations with it to searching Wikipedia. The generic, short paragraphs of information left Mikalsen feeling that it shouldn’t be treated as a trusted source of mental health information.
“That’s my overall criticism,” she says. “It provides even less information than Google.”
3. There are alternatives to using ChatGPT for mental health.
Dr. Elizabeth A. Carpenter-Song, a medical anthropologist who studies mental health, said in an email that it’s completely understandable why people are turning to a technology like ChatGPT. Her research has found that people are particularly drawn to the constant availability of digital mental health tools, which she says is like having a therapist in your pocket.
“Technology, including things like ChatGPT, appears to offer a low-barrier way to get answers and potentially support for mental health,” wrote Carpenter-Song, a research associate professor in the Department of Anthropology at Dartmouth College. “But we must remain cautious about any approach to complex problems that looks like a ‘silver bullet’.”
Carpenter-Song noted that research suggests that digital mental health tools are best used as part of a “care spectrum.”
Those looking for more digital support in a conversational format similar to ChatGPT might consider chatbots specifically designed for mental health, like Woebot and Wysa, which offer AI-guided therapy for a fee.
Digital peer support services are also available for people seeking encouragement online, connecting them with listeners who are ideally prepared to offer it in an empathetic and nonjudgmental manner. Some, like Wisdo and Circles, require a fee, while others, like TalkLife and Coco, are free. However, these apps and platforms vary widely, and they also aren’t intended to treat mental illness.
More generally, Carpenter-Song believes that digital tools should be coupled with other forms of support such as mental health care, housing and employment “to ensure people have opportunities for meaningful recovery.”
“We need to understand more about how these tools can be useful, under what circumstances and for whom, and remain vigilant to uncover their limitations and potential harms,” Carpenter-Song wrote.
If you are having thoughts of suicide or are going through a mental health crisis, please talk to someone. You can reach the 988 Suicide and Crisis Lifeline at 988; the Trans Lifeline at 877-565-8860; or the Trevor Project at 866-488-7386. Text “START” to the Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday, 10:00 a.m. to 10:00 p.m. ET, or by email [email protected]. If you don’t like the phone, consider using the 988 Suicide and Crisis Lifeline Chat at crisischat.org. Here is a list of international resources.