By: Camden Baucke MS LLP
Artificial Intelligence (AI) is a hot topic lately, especially within the mental health sphere.
Many people are seeking answers to their psychological questions with AI chatbots.
It’s a nuanced conversation: there may be benefits to using AI for mental health, but there are potential negative outcomes as well.
To fully understand the context of AI therapy, we need to look at the pros and cons.
How Is AI Used for Mental Health?
Two of the most popular ways AI is used for mental health are (1) diagnoses and (2) quick fixes.
This means you might use ChatGPT or Google AI to look up the possible symptoms behind a diagnosis or actions you can take to address mental health issues.
While these are the most popular things plugged into AI chatbots, they don’t account for people who use them to answer deeper psychological questions. This is when quick fixes delve into areas of identity, self-esteem, and purpose.
The Pros of Using AI for Mental Health

#1 Affordability
This is the most sensible reason why people may use AI chatbots. Therapy can be expensive, especially if you don’t have health insurance.
Affordable solutions to mental health issues are crucial in today’s economy.
While it’s nice that mental health advice is free, what you’re searching for still matters.
#2 Quick Tips
If you aren’t struggling to function or dealing with issues that affect your quality of life, then AI chatbots could be just what you need. You might not need a long-term investment because you don’t have long-term issues that need addressing.
Usually, this includes calming exercises, meditations, exercise, recent studies, and so on.
With AI chatbots, you can learn about how to boost your mental health in day-to-day experiences.
#3 Possible Diagnoses
The purpose of a diagnosis is treatment.
If you feel like you might have a certain disorder or condition, AI chatbots can help you uncover what that is.
However, a suggested diagnosis should only point you toward professionals who are trained to diagnose.
If you type symptoms into an AI chatbot and receive a potential diagnosis, don’t stop there. The problem lies in assuming a trained chatbot is a replacement for a trained professional.
The Cons of Using AI for Mental Health

#1 Lack of Challenge
The whole point of therapy is not to agree with what you think or how you have been living.
Much of therapy is challenging things you might consider “normal.” If you don’t think something is abnormal, you might never type it into a chatbot, so it never gets addressed.
There’s even a widely used intervention called “cognitive challenging” as part of cognitive behavioral therapy.
Chatbots, like ChatGPT, are notoriously validating in a psychologically unhelpful way. In therapy, we validate emotions, but we don’t always validate thoughts because not all thoughts are true.
#2 Lack of Depth
AI chatbots are fine for quick fixes, but not for anything more intense.
Even conditions such as generalized anxiety or major depressive disorder cannot be treated with AI chatbots. There’s no structure or evidence to support treating a disorder with an AI chatbot alone.
#3 Lack of Accountability
AI Chatbots can always be there for you, yet they have no investment in your well-being.
As therapists, we not only treat people; we are also responsible for the individuals we treat.
Hence, there are steps to becoming a mental health professional. There’s a licensing board so that if a therapist acts unethically or provides hurtful information, a client can be compensated and/or protected by a larger governing body.
As for AI chatbots, there’s no such protection. You can try to sue a company of that magnitude, but unless it’s a prevalent issue affecting thousands or millions of people, you’re likely unprotected.
Why Social Avoidance Drives This Debate
Let’s be honest, there’s a larger issue at work here.
Research on the effectiveness of CBT is pretty straightforward, with one big caveat: the effectiveness of therapy depends on the effectiveness of the therapist.
Unfortunately, I’ve heard plenty of horror stories about the misconduct of therapists.
I also know plenty of therapists who are incredibly good at their jobs, but those who have been hurt or hindered by a therapist are likely to turn to chatbots for help.
While that’s understandable, your current needs are more important than the past incompetence of others.
This “real issue” is a combination of problematic therapists and avoidance of common social dilemmas. If someone “hates people” and holds a generally negative view of others and the world, they might choose an emotionless AI chatbot that does not judge them or treat them poorly.
All of these issues are addressed by therapy. An avoidance of social interactions is typically a sign of social anxiety, a diagnosis that needs treatment. A hatred of others and the world is often much more than pessimism, and often reflects a painful and jaded set of beliefs. These beliefs are cognitive constructs also addressed by several forms of therapy.
Most importantly, fear that one will be judged by a therapist is understandable, but is also addressed by therapy. Many therapists practice an approach popularized by Carl Rogers called Unconditional Positive Regard.
This is a self-set mandate to treat every client with a respectful and radically nonjudgmental approach regardless of one’s presentation, beliefs, thoughts, history, or anything else.
If a therapist doesn’t practice this, that is a failure of the therapist, not of therapy itself; it’s imperative for your mental health to hold accurate beliefs. An anecdotal experience or expectation of judgement does not condemn therapists or therapy as a whole. While it might make you feel safer to assume everyone is out to get you, it also promotes an unhealthy isolation that AI chatbots can sustain.
Final Thoughts

AI chatbots have taken the world by storm, but it’s important to know how to use them.
Beyond ethical concerns regarding water consumption for cooling servers, there are additional complications for using AI chatbots to treat psychological conditions.
AI chatbots can be useful for quick tips, changing habits, and affordable advice on everyday issues. They can even help you recognize a possible issue that a mental health professional can then confirm.
The cons are that AI chatbots lack the challenge every client needs to uncover hidden issues in their normative thinking, and they lack the psychological depth needed to effectively treat serious conditions. Additionally, there is little to no protection or accountability for someone who uses AI chatbots.
Even with the pros and cons, the primary issue, beyond affordability, appears to be the distrust of others.
Using AI chatbots for mental health is an attempt to experience therapy without a therapist.
If you are avoiding fees or the potential judgement of therapists while ignoring your own mental health needs, AI chatbots won’t be the answer.
Take these pros and cons into consideration and make the best decision for yourself. If you need help with small things, maybe an AI chatbot can help. However, if these small things pile up and appear to be stemming from the same thing, then maybe it’s time for therapy.

