Everywhere you turn these days, it seems ChatGPT is there, too. People are turning to artificial intelligence tools for just about everything, from helping write school essays to drafting their grocery lists. Some are even forming relationships with AI.
AI and Your Mental Health
Can AI really tackle your mental health challenges? The short answer is yes. The long answer, though, is more complicated.
With a few simple clicks, any time of day or night, ChatGPT can offer therapy-like advice.
Where Does AI Get Its Mental Health Advice?
The vast majority of LLMs (large language models, like ChatGPT, Gemini, or Claude) are trained on text and other data scraped from publicly available Internet resources.
Basically, these LLMs learn patterns from the enormous amount of text they were trained on, then generate responses that reflect the most common themes in that material. What you get is a blend of what's been written online, not a vetted professional consensus.
Are there aspects of mental health where AI can be good?
AI can help explain therapy concepts and coping tools, such as cognitive distortions, anxiety cycles, mindfulness exercises, reframing, and journaling prompts.
ChatGPT can also help you explore ways to practice these tools in everyday situations, outside of formal mental health care.
Are there risks to turning to ChatGPT or other LLMs versus talking with a mental health professional?
Yes, there are risks. An LLM can misinterpret emotional context or miss warning signs that a trained professional is taught to assess and treat. The advice it gives may also be oversimplified or inappropriate, because an LLM cannot observe body language, tone, or other subtle cues.
When you meet with a mental health professional in person or via telehealth, you build real rapport. That interpersonal connection strengthens over time and leads to much better treatment plans.
Is it true that some people have formed close relationships with AI? Does this pose a risk?
Yes, there are reports of people who have formed close relationships with AI.
The risk is simple: AI cannot accurately diagnose an individual and has no professional training in guiding people through the challenges they face.
Because AI cannot gather a full picture through assessment, it cannot offer an appropriate treatment plan and could cause harm. Too much dependence on AI can also lead to social withdrawal and disconnection from others, and it can erode your independence and ability to make decisions for yourself.
In a crisis situation, LLMs are unable to assess safety or intervene.
When should you seek out the help of a human professional?
While AI can be a helpful tool, there are times when you should absolutely seek care from a human provider:
- You’re dealing with anything that affects your safety, functioning, long-term wellbeing, or mental stability
- You cannot manage daily responsibilities (self-care habits, work duties, school attendance and assignments, etc.)
- You are experiencing trauma symptoms (flashbacks, hyperarousal, avoidance), depression, grief, anxiety, or panic attacks that are interfering with life
- You are experiencing symptoms of psychosis (hallucinations, delusions)
- You are having thoughts of self-harm, suicide, or harming others
LLMs can be helpful, but proceed with caution. Use the tool for initial inquiries. Then turn to mental health professionals, like those on our team, to come up with treatment plans.
How do I make an appointment with a human?
Our team of providers can see you in a matter of days, sometimes even the same day. We serve patients throughout North Dakota, Minnesota, South Dakota, Montana, Utah, and Alaska.
If you are ready to book an appointment, click here to get started, or give us a call at 701-205-3000. Our schedulers will work to pair you with a provider who is especially suited to meet your needs.