Mental health disorders can be hard to treat because symptoms change quickly and patients' experiences vary widely. Two main disorders studied with AI at Dartmouth are major depressive disorder and opioid use disorder. These disorders involve changes in behavior as well as changes in the brain and thinking, which can be hard to notice but are important to catch early for better treatment.
Cognitive science studies how people think, learn, and understand information. It helps teach AI to notice these small but important changes. AI alone can look at lots of data, but without human thinking models, it might miss what really matters or misunderstand what patients need.
Steven Frankland, a cognitive scientist at Dartmouth, works on connecting human intelligence with AI. His research aims to make AI systems flexible, so they can adapt when they receive new information or when a patient's behavior or surroundings change. This flexibility helps AI understand patients in different situations and respond appropriately. For mental health workers and managers in the US, this flexibility could lead to better diagnoses and treatment plans tailored to each patient's specific condition.
The AI Research Institute on Interaction for AI Assistants (ARIA) is a national project focused on using AI for mental health care. It has $20 million in funding over five years from the National Science Foundation and is one of five major AI institutes in the country. Dartmouth's Center for Technology and Behavioral Health leads this research, combining AI with behavioral monitoring to better understand and help people with mental health disorders.
ARIA starts by working on real-time detection and support for major depressive disorder by tracking physical signals, thinking patterns, and behavior. In the second year, the focus shifts to opioid use disorders, looking at brain connections and outside factors that might cause someone to use drugs again. This step-by-step method uses what is learned in one area to help improve care in others.
Andrew Campbell, a Dartmouth researcher leading the AI mental health project, ran a four-year study called StudentLife. It used a phone app to track student mental health over time, the longest study of its kind. This shows how mobile phones and AI can track changes in mood and behavior. For health care managers and IT teams, this research points to using digital tools to monitor patients outside of regular doctor visits.
One way cognitive science helps AI at Dartmouth is through tools like MoodCapture and MindScape. MoodCapture uses AI and phone cameras to read facial expressions and spot early signs of depression. This works because it understands how faces show changes in mental health. MindScape goes further by combining behavior detection with ChatGPT to give mental health help based on what someone is doing and feeling right now.
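To make the idea concrete, here is a minimal sketch of how a phone-sensing app might combine daily behavioral signals into a single risk score. The feature names, weights, and thresholds below are hypothetical illustrations, not MoodCapture's or MindScape's actual models.

```python
from dataclasses import dataclass

# Hypothetical daily feature vector a phone-sensing app might collect.
# All names and thresholds are illustrative, not Dartmouth's real models.
@dataclass
class DailyFeatures:
    sleep_hours: float          # estimated from screen and motion data
    steps: int                  # physical activity
    social_app_minutes: float   # rough proxy for social engagement
    negative_face_score: float  # 0-1 score from an on-device expression model

def depression_risk_score(f: DailyFeatures) -> float:
    """Combine behavioral signals into a 0-1 risk score (illustrative weights)."""
    score = 0.0
    if f.sleep_hours < 5 or f.sleep_hours > 10:
        score += 0.25  # disrupted sleep
    if f.steps < 2000:
        score += 0.25  # low physical activity
    if f.social_app_minutes < 10:
        score += 0.2   # social withdrawal
    score += 0.3 * f.negative_face_score  # facial-affect signal
    return min(score, 1.0)

day = DailyFeatures(sleep_hours=4.5, steps=1200,
                    social_app_minutes=5, negative_face_score=0.8)
print(round(depression_risk_score(day), 2))  # 0.94
```

A real system would learn these weights from data and personalize them per patient, which is exactly where the cognitive-science modeling described above comes in.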
Nicholas Jacobson, who leads AI psychotherapy research at Dartmouth, says AI support must be safe, reliable, and truly personalized. This means AI should support people carefully and not give general advice that might not fit each person’s situation.
In hospitals or clinics, this kind of smart care can help patients stay involved with their treatment, follow plans better, and get better results. Managers can use these AI tools to support care teams with digital helpers that provide constant support, allowing clinicians to work on the hardest cases.
Dartmouth’s AIM HIGH Lab developed Therabot, the first AI psychotherapy chatbot tested in a clinical trial. Therabot is made to talk with patients and give helpful therapy replies based on what the patient says.
Cognitive science helps by explaining how people talk about feelings, problems, and thoughts during therapy. Therabot's success depends on the AI's ability to model natural conversation and respond well as the patient's feelings change.
For healthcare managers, using AI chatbots means clinics can offer support after normal hours. It helps keep in touch with patients, collect information about symptoms, and give quick help. This is especially useful for people living far from clinics or in areas with few services.
ARIA also focuses on training workers in AI and healthcare from middle school to postdoctoral levels. This training mixes cognitive science, computer science, and psychiatry to prepare people to build and use AI mental health tools well.
The program works with current mental health workers too, offering workshops, discussions, and certificates to help new AI tools become part of regular care. This helps close the gap between what research finds and what is practical in clinics.
Healthcare managers and IT leaders can benefit by working with these education programs and networks connected with ARIA. This helps their staff learn about AI and how to use it in healthcare.
One key area where cognitive science and AI help hospitals is by automating routine tasks. Things like booking appointments, answering patient questions, and following up take up a lot of staff time in medical offices. AI can help with these tasks to reduce the workload.
Companies like Simbo AI develop phone systems using AI to handle patient calls. They can schedule appointments, answer common questions, and remind patients about visits or medicine refills.
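As a rough illustration of the routing step in such a phone system, the sketch below classifies a call transcript by keyword matching. Real products like Simbo AI would use trained speech and language models; the intent names and keywords here are invented for the example.

```python
# Minimal sketch of intent routing for an AI front-office phone assistant.
# Keyword rules stand in for the speech/NLU models a real product would use.
INTENT_KEYWORDS = {
    "schedule": ["appointment", "book", "schedule", "reschedule"],
    "refill":   ["refill", "prescription", "medication"],
    "hours":    ["hours", "open", "closed"],
}

def route_call(transcript: str) -> str:
    """Return the first matching intent, or escalate to a human."""
    text = transcript.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(k in text for k in keywords):
            return intent
    return "front_desk"  # anything unrecognized goes to a person

print(route_call("Hi, I need to reschedule my appointment"))  # schedule
print(route_call("Can I get a refill on my medication?"))     # refill
```

The escalation default matters: for mental health callers especially, an uncertain AI should hand off to a human rather than guess.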
When these AI assistants understand patients and their situations, they can communicate more clearly and kindly. This matters a lot for mental health patients, who might feel nervous or confused when dealing with healthcare.
For IT managers, using AI phone systems means shorter wait times, fewer missed appointments, and smarter use of staff time. This helps clinics run better and keeps patients happier.
Also, AI tools that study data from devices or wearables can find patients who might have a mental health crisis coming. This lets front-office workers focus on care for those patients first. These AI tools work because they understand human behavior and body signals in the right context.
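A simplified version of this kind of early-warning check can be sketched as a baseline-deviation rule on wearable data. The z-score rule and the resting-heart-rate feature here are illustrative stand-ins for the trained models such systems actually use.

```python
from statistics import mean, stdev

def flag_risk_days(daily_hr: list, window: int = 7, z_threshold: float = 2.0) -> list:
    """Flag days whose resting heart rate deviates sharply from the recent baseline.

    A simple rolling z-score stands in for the learned models a real
    clinical system would use; it is an illustration, not a product.
    """
    flagged = []
    for i in range(window, len(daily_hr)):
        baseline = daily_hr[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(daily_hr[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

hr = [62, 63, 61, 62, 64, 63, 62, 75, 62, 61]  # day 7 spikes
print(flag_risk_days(hr))  # [7]
```

In practice such a flag would not trigger care by itself; it would surface the patient to front-office staff or a clinician for human follow-up, matching the "AI supports, clinicians decide" theme of the Dartmouth work.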
Scientists working with ARIA try to make AI think more like humans, but in a way computers can use. This helps AI tools for mental health be flexible and able to understand different behaviors, cultures, or environments that affect mental health.
This flexibility also helps correct problems that arise when AI relies too heavily on limited data or fixed rules. By drawing on ideas from cognitive and behavioral science, Dartmouth's research aims to ensure AI supports doctors without replacing them.
For leaders running mental health clinics, smart and aware AI systems mean fewer mistakes in diagnosis and timing of treatments. They also help make sure AI advice fits well with what doctors think is right.
Dartmouth’s work, supported by a $20 million NSF grant, shows that mental health care needs smart and adaptable technology. Their effort mixes wearable sensors, mobile apps, digital therapy tools, and education programs to change how mental health care is done for many people.
For medical practice owners and managers in the US, especially those running mental health centers, knowing about these advances is important. Using AI systems based on cognitive science helps healthcare places meet patient needs better, improve results, and run more smoothly.
IT managers supporting these tech changes need to find systems that handle complex behavior data while keeping patient privacy and following healthcare rules.
Adding cognitive science to AI at places like Dartmouth is making mental health tools that are smarter, more flexible, and aware of the situation. These tools can notice quick changes in patient behavior, give feedback fit to the person, and keep patients involved beyond normal clinic visits.
For healthcare leaders focused on mental health in the US, using AI built on cognitive science offers ways to improve care quality, make patients happier, and run clinics better. Working together with researchers, doctors, IT staff, and managers will be important to bring these tools into everyday mental health care safely and well.
Dartmouth serves as a leading partner in ARIA, a national AI research institute focused on AI assistants for mental and behavioral health. It leads ‘use-inspired’ research integrating AI with devices and wearable sensors to provide personalized mental health assessment and intervention.
ARIA aims to develop AI-powered assistants capable of trustworthy, sensitive, and context-aware interactions addressing mental health and substance-use needs, advancing best practices through scientific evidence and personalized real-time feedback.
Dartmouth initially targets major depressive disorder and substance use disorders, specifically opioids, by studying physiological, behavioral, and neural data to produce timely interventions preventing symptom onset and relapse.
Lisa Marsch leads behavioral and mental health research; Andrew Campbell oversees AI mental health testbed and technology integration; Nicholas Jacobson directs AI psychotherapy chatbot development and digital biomarkers; Steven Frankland coordinates cognitive science research and computational modeling for adaptive AI.
Dartmouth developed the first FDA-authorized digital addiction treatment, pioneered use of behavioral sensing in clinical trajectories, and conducted longitudinal studies like StudentLife evaluating mental health through mobile and wearable tech.
By combining sensor and neuroimaging data with AI and cognitive neuroscience, Dartmouth creates models that interpret physiological and environmental cues, enabling real-time personalized interventions and predictive analytics for mental health management.
AIM HIGH develops generative AI psychotherapy chatbots undergoing clinical trials and works on digital biomarkers to predict individual mental health changes, ensuring AI systems are personalized, safe, and clinically valid.
ARIA provides training and curricula for middle, high school, undergraduate, and graduate students, alongside mentoring and interdisciplinary collaboration, preparing the next generation of AI scientists and mental health professionals.
ARIA organizes workshops, roundtables, and certificate programs to educate mental health practitioners, facilitating efficient clinical implementation of AI innovations and bridging research with practice.
It bridges human intelligence and AI, aiming to build systems with flexible and reliable understanding of users and environments, which is vital to developing effective AI interventions for complex mental health disorders.