Can AI Chatbots Replace Human Therapists? A Study Reveals User Perceptions

by Ella

Lilian Weng, a manager at OpenAI, recently sparked a debate about the potential of AI chatbots in mental health when she claimed to have had an “emotional, personal conversation” with the viral chatbot ChatGPT. Her remarks raised questions about the role of technology in addressing mental health issues. While AI-based mental health apps have gained traction and commercial success, they also face significant skepticism and concerns about their effectiveness.

To delve into this issue further, a collaborative team from the Massachusetts Institute of Technology (MIT) and Arizona State University conducted a study involving more than 300 participants. The goal was to gauge user perceptions of mental health AI programs and how preconceived notions influenced their interactions.

Participants were divided into three groups. The first group was informed that they would be conversing with an empathetic chatbot, the second group was told the chatbot was manipulative, and the third group was led to believe that the chatbot was neutral. The results of the study shed light on how users’ expectations played a pivotal role in their perception of these AI therapists.

Interestingly, those who were primed to expect an empathetic chatbot were significantly more likely to view the AI therapist as trustworthy compared to the other groups. This finding suggests that users’ perception of AI in therapy is heavily influenced by their expectations, echoing the sentiment that “the AI is in the eye of the beholder.”

The use of AI in mental health has been on the rise, with startups developing AI apps that offer therapy, companionship, and other forms of mental health support. While this sector presents a lucrative opportunity, it also attracts significant controversy and debate.

Critics argue that AI, if not properly regulated, could replace human therapists rather than complement them, raising concerns about the quality of care provided by chatbots. Mental health experts stress that therapy is a complex process requiring genuine human connection, empathy, and understanding.

Furthermore, some AI-driven mental health apps have faced criticism for their shortcomings. Users of Replika, a popular AI companion marketed as offering mental health benefits, have reported issues with the bot’s fixation on sexual content and instances of abusive responses. In a separate experiment conducted by the US nonprofit Koko, automated responses provided by AI chatbots were found to be ineffective in offering therapeutic support.

Rob Morris, the co-founder of Koko, described the experience of simulated empathy as “weird and empty,” a sentiment shared by some participants in the MIT/Arizona study who likened conversing with a chatbot to “talking to a brick wall.”

Despite these challenges, the idea of using chatbots as therapists dates back to the origins of AI: ELIZA, the first chatbot, was designed in the 1960s to simulate a form of psychotherapy. In the MIT/Arizona State study, half of the participants interacted with ELIZA while the other half interacted with GPT-3. Although the priming effect was stronger with GPT-3, participants primed to expect a positive experience still generally regarded ELIZA as trustworthy.

The study’s findings suggest that the portrayal of AI to users plays a crucial role in shaping their experiences. As AI technology continues to evolve, the researchers argue that society must manage the narratives surrounding AI and mental health. They suggest that setting lower or more negative expectations for users could lead to more realistic perceptions of AI’s capabilities in therapy.
