Key points
- Expert notes AI applications “have a lot of promise”
- Expresses concerns about potential harm to youth
- US Food and Drug Administration does not certify medical devices or AI apps
NEW YORK: Researchers at Dartmouth College believe artificial intelligence can deliver reliable psychotherapy, distinguishing their work from the unproven and sometimes dubious mental health apps flooding today’s market.
Their application, Therabot, addresses the critical shortage of mental health professionals.
According to Nick Jacobson, an assistant professor of data science and psychiatry at Dartmouth, even multiplying the current number of therapists tenfold would leave too few to meet demand.
“We need something different to meet this large need,” Jacobson told AFP.
The Dartmouth team recently published a clinical study demonstrating Therabot’s effectiveness in helping people with anxiety, depression and eating disorders.
A new trial
A new trial is planned to compare Therabot’s results with those of conventional therapies.
The medical establishment appears receptive to such innovation.
Vaile Wright, senior director of health care innovation at the American Psychological Association (APA), described “a future where you will have an AI-generated chatbot rooted in science that is co-created by experts and developed for the purpose of addressing mental health.”
Wright noted these applications “have a lot of promise, particularly if they are done responsibly and ethically,” though she expressed concerns about potential harm to younger users.
Jacobson’s team has dedicated nearly six years to developing Therabot, with safety and effectiveness as primary goals.
Michael Heinz, psychiatrist and project co-leader, believes rushing for profit would compromise safety.
The Dartmouth team is prioritizing understanding how their digital therapist works and establishing trust.
They are also contemplating the creation of a nonprofit entity linked to Therabot to make digital therapy accessible to those who cannot afford conventional in-person help.
Care or cash?
Given its developers’ cautious approach, Therabot could stand out in a marketplace of untested apps that claim to address loneliness, sadness and other issues.
According to Wright, many apps appear designed more to capture attention and generate revenue than to improve mental health.
Such models keep people engaged by telling them what they want to hear, but young users often lack the savvy to realise they are being manipulated.
Darlene King, chair of the American Psychiatric Association’s committee on mental health technology, acknowledged AI’s potential for addressing mental health challenges but emphasised the need for more information before determining true benefits and risks.
“A lot of questions”
“There are still a lot of questions,” King noted.
To minimise unexpected outcomes, the Therabot team went beyond mining therapy transcripts and training videos, manually creating simulated patient-caregiver conversations to train its AI app.
While the US Food and Drug Administration is theoretically responsible for regulating online mental health treatment, it does not certify medical devices or AI apps.
Instead, “the FDA may authorise their marketing after reviewing the appropriate pre-market submission,” according to an agency spokesperson.
The FDA acknowledged that “digital mental health therapies have the potential to improve patient access to behavioural therapies.”