Why AI Chatbots Can’t Replace Real Therapy: Lessons from Stanford Research

At Stepping Stones Wellness Center, we are committed to providing safe, compassionate, and evidence-based mental health care. As new technologies like artificial intelligence become more common in therapeutic spaces, it is important to evaluate both their potential and their risks. One development gaining attention is the use of AI-powered chatbots that simulate therapy sessions. While these tools may appear convenient and accessible, recent research from Stanford University urges caution and highlights the serious limitations of relying on them for mental health support.

A study conducted by the Stanford Institute for Human-Centered Artificial Intelligence found that many therapy chatbots are not equipped to respond safely to people experiencing mental health crises. Some bots reinforced delusional thinking, failed to recognize suicidal thoughts, and used stigmatizing language about conditions such as schizophrenia and substance use disorders. In one troubling example, a chatbot offered a list of bridges to a user expressing suicidal ideation rather than referring them to crisis support. These findings demonstrate that AI systems lack the clinical awareness necessary for safe and ethical care.

The heart of effective therapy lies in the human connection. Licensed therapists provide empathy, insight, and a safe space for clients to explore difficult emotions and experiences. They are trained to recognize warning signs, challenge harmful thoughts, and adapt their approach to each individual’s needs. While AI can simulate conversation, it cannot replicate the depth of understanding and relational support that trained professionals offer.

The Stanford study also raised concerns about bias and privacy. Many AI tools are trained on limited data sets, which can result in responses that are inappropriate or discriminatory toward individuals from diverse backgrounds. There is also little transparency about how personal information is collected, stored, or used. At Stepping Stones Wellness Center, we follow strict ethical and legal standards to ensure that all client information remains confidential and secure.

While we remain open to the ways technology can support mental health care, we believe it must be used responsibly and always under professional supervision. AI tools may assist with tasks like scheduling or tracking progress between sessions, but they are not a substitute for therapy guided by a licensed clinician. At Stepping Stones, we will continue to prioritize thoughtful, human-centered care to help you move forward with clarity and confidence.


Reference
Moore, J., Haber, N., Klyman, K., et al. (2024, June 11). Exploring the dangers of AI in mental health care. Stanford Institute for Human-Centered Artificial Intelligence. https://hai.stanford.edu/news/exploring-dangers-ai-mental-health-care
