Providers Advise Caution on Growth in Direct-to-Consumer AI Tools

AI technology is improving the work experience of clinical and administrative staff at many behavioral health treatment facilities, with their patients benefiting as well. The effects of AI support tools marketed directly to consumers are considerably less certain, drawing the wary eye of some field experts.

The American Psychological Association (APA) in mid-November issued a detailed “Health Advisory on the Use of Generative AI Chatbots and Wellness Applications for Mental Health.” Guided by a 15-member panel of prominent researchers, the report warns that these unregulated virtual tools prioritize user engagement over substantive support. Their tendency to validate everything the user presents can produce harmful consequences for patients with substance use and/or mental health concerns.

“A user’s unhealthy thoughts or behaviors can be validated and amplified by a sycophantic AI, potentially locking them into a cycle that exacerbates their mental illness,” the APA advisory states.

A Fellow of the APA who specializes in media psychology and served on the report’s advisory panel told us he has to tread carefully when he learns that a new patient has been engaging with an AI chatbot. Don Grant, Ph.D., realizes that at the start of a therapeutic relationship, the patient likely will exhibit more trust in the AI companion than in him. Some chatbots have achieved this favored status by falsely claiming they are trained in therapeutic modalities, said Grant, who also serves as the national advisor of healthy device management at Newport Healthcare.

Grant urges clinicians to ask patients in the initial session whether they have been engaging with any AI support tools. “To not ask this is a problem,” he said. The goal is not to tell patients to abandon AI, but to have them use it properly as an adjunctive tool that supports actual treatment rather than replacing it.

The APA advisory acknowledges that use of GenAI chatbots and wellness apps will likely grow, amid a shortage of accessible behavioral health treatment in many communities. Stigma around seeking traditional care, and in some cases mistrust of health systems, also contribute to the desire to seek support independently, according to the report.

Because GenAI chatbots were not designed to deliver behavioral health treatment, they fall outside the purview of federal regulation of digital therapeutics. The APA report suggests that policymakers address the gap between the products’ stated intent and their actual use by consumers, possibly by creating a mechanism for assessing the safety of these tools.

However, even those who worked on the report admit that strong regulation in this area appears unlikely. Grant said that, much as social media use exploded long before society fully understood its implications, these direct-to-consumer tools will likely follow the same path.

That means the burden of ensuring that these products are used wisely will fall on care providers, and on patients and families. We at Sigmund strive to set an example for effective and responsible use of AI solutions, consistently informed and refined by our partner providers. AI must always be used as a vehicle to enhance the actual care on which patients depend.

“Prioritizing unregulated, direct-to-consumer chatbots over investing in our human health care infrastructure is not a solution; it is an abdication of our responsibility to provide genuine, evidence-based care,” the APA advisory concludes.