The Dangers of Using AI for Mental Health Instead of a Licensed Professional
By Kevin Lahey, LSW
Times are changing! I think we’re all well aware of that at this point. However, the advancement and integration of AI into our everyday lives seems to be bringing about a lot of new issues. Don’t get me wrong: there’s a lot that AI could provide for us. Automating repetitive tasks and aiding in data analysis, for example, are areas where AI genuinely helps.
With that being said, AI, like any relatively new and exciting advancement, is being implemented in many other facets of our lives where, quite frankly, it can do a lot more harm than good. I could go into the various fields and jobs being negatively impacted by AI but, as the title says, this is about AI being used for mental health. Yes, AI may offer some things, like 24/7 availability or lower (even nonexistent) pricing compared to traditional therapy, but based on the available research, that seems to be about it, at least for now.
When it comes to mental health treatment, I firmly believe that one of the most important aspects is the therapeutic rapport, the human connection, and AI just cannot provide that. Sure, many chatbots can mimic being human, but I think we can all agree that it’s not the same. The human-to-human connection allows for an interesting balance of safety and vulnerability. In order for us to grow as people, we need to learn to be comfortable with uncomfortable or new situations and thoughts. No matter how good the chatbot may be, an AI therapist is not going to allow for that same level of vulnerability, because ultimately we know it’s not a real person on the other end.
It’s also important to acknowledge how easily AI chatbots can play into harmful stigma surrounding mental health, or even enable dangerous and destructive thought patterns. In a 2025 study done at Stanford, researchers found that therapy chatbots would often harshly stigmatize certain mental health conditions, like substance abuse or schizophrenia, which could very well leave people feeling even worse or turn them away from therapy altogether. The study also found that these chatbots frequently missed important conversational nuance, sometimes with dangerous results. For example, even when prompted to respond as professionally as possible, therapy chatbots would often miss clear signs of suicidal ideation or planning, in some cases going so far as to provide information that could help a client harm themselves.
I’d like to believe that, at some point down the line, AI could be used to handle things like billing so therapists could focus on the treatment side of things, but the way it’s currently being implemented seems capable of doing much more harm than good. Ultimately, the goal of therapy for many people comes down to feeling better about oneself while nurturing one’s ability to form positive human connections. I think it’s fair to say that improving people’s ability to connect with other people needs to happen through work with another person, and AI just can’t provide that.
I know AI might seem like the easier route for a lot of people, but if you’re thinking about trying therapy, I’d really suggest starting with the tried and true human-to-human version. Here at Owens & Associates, we have a wealth of expertise and help people from all walks of life with a wide variety of problems.
Give us a call at 📞 847-854-4333 or send us an email at 📩 admin@owenscounseling.com to get the ball rolling on your treatment today.
Remember, we all deserve a helping hand, and we’re happy to provide one.
All of us at Owens Counseling & Therapy are here for YOU! In person or via telehealth, whichever you prefer. We want to listen to understand, provide a safe, non-judgmental space for you, and support you in figuring out life. YOUR life, as in how you want to live it.
Contact us today to begin your new life journey!
Here’s that Stanford article, should you be interested in reading it yourself!