In response to a growing mental health crisis and a shortage of professionals, researchers at Dartmouth College are working to establish artificial intelligence as a credible tool for delivering psychotherapy. Their AI-powered application, Therabot, has shown promise in treating conditions such as anxiety, depression, and eating disorders, according to a recent clinical study.
Assistant Professor Nick Jacobson, who specializes in data science and psychiatry, emphasized the urgent need for alternative solutions. “Even if we multiplied the number of therapists by ten, it still wouldn’t meet the demand,” he explained. “We need something fundamentally different.”
The Dartmouth team is planning a new clinical trial comparing Therabot's effectiveness to traditional therapy. Their cautious, research-driven approach stands in stark contrast to the influx of unregulated and often questionable mental health apps currently dominating the digital market.
Vaile Wright, director of healthcare innovation at the American Psychological Association, envisions a future in which scientifically grounded AI chatbots, developed in collaboration with mental health experts, play a vital role in emotional support. However, she warned about the risks to younger users, who may not recognize manipulative design elements in profit-driven apps.
Jacobson’s team has spent nearly six years refining Therabot, prioritizing safety and efficacy over quick commercialization. Co-leader Dr. Michael Heinz stressed that monetization should never come at the cost of safety. The researchers are even considering launching a nonprofit to ensure access for those unable to afford traditional therapy.
While many existing apps are engineered for user engagement and revenue, often telling users what they want to hear, Therabot aims to replicate evidence-based therapeutic interactions. Rather than relying solely on real therapy transcripts as training data, the team created simulated patient-therapist dialogues to improve the chatbot's reliability.
Although the US Food and Drug Administration (FDA) doesn’t formally certify mental health apps, it may authorize their marketing after reviewing pre-market submissions. The agency acknowledges the potential of digital mental health therapies to improve access to care.
Meanwhile, other startups are racing to provide AI-based support. Herbert Bay, CEO of Earkick, said his app Panda is designed with safety in mind, able to detect emotional crises or suicidal thoughts and trigger alerts. He contrasted Panda with more general-purpose chatbots like those from Character.AI, citing a tragic incident involving a teenager's interactions with an unregulated chatbot.
Bay noted that AI companions are best suited for ongoing emotional support rather than acute mental health crises. “You can’t call your therapist at 2 a.m.,” he said, “but a chatbot is always available.”
Users like Darren, who has PTSD, say general-purpose AI such as ChatGPT has already proven helpful. “It’s working for me,” he shared. “I’d recommend it to others dealing with anxiety and emotional distress.”

