Depression Detection
Mobile app could detect mental health problems
by Chris Carroll | illustration by Jeanette J. Nelson

People suffering from depression often fail to realize when their situation has become desperate. A new app being designed by three UMD researchers could soon detect a worsening mental state and make the crucial link to care automatically.
Their smart mobile system will monitor a range of subtle signs and symptoms—vocal inflections, slight facial movements, language content—and integrate them into an objective mental health snapshot.
Though still in its formative stage, the system could one day gather all the data it needs from patients simply speaking to their smartphones for a few minutes while wearing a heart rate monitor like an athlete might use.
“If it works like we hope, [mental health professionals] would get advance warning, and they’d be able to call the patient in to adjust medication, for example, or provide counseling,” says Monifa Vaughn-Cooke, an assistant professor of mechanical engineering.
The sound of the user’s voice alone will indicate potential problems, says her research partner, Carol Espy-Wilson, a professor of electrical and computer engineering with a joint appointment in the Institute for Systems Research.
“We know people do talk differently depending on the mental state they’re in, and so we are looking for those markers,” she says.
Among the markers the system will track, she says, are shimmer, a measure of short-term variations in loudness of vowel sounds, and jitter, or short-term variation in pitch. Both increase among people experiencing depression, Espy-Wilson’s research shows. Breathier speech is another depression indicator, along with slower speech.
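The article doesn't show the team's code, but jitter and shimmer have standard textbook definitions. Here is a minimal Python sketch, assuming the pitch periods and per-cycle peak amplitudes of a vowel segment have already been extracted by a pitch tracker (that extraction step is omitted, and the sample numbers are invented):

```python
import numpy as np

def jitter(periods):
    """Local jitter: mean absolute difference between consecutive
    pitch periods, normalized by the mean period."""
    periods = np.asarray(periods, dtype=float)
    return np.mean(np.abs(np.diff(periods))) / np.mean(periods)

def shimmer(amplitudes):
    """Local shimmer: mean absolute difference between consecutive
    cycle peak amplitudes, normalized by the mean amplitude."""
    amplitudes = np.asarray(amplitudes, dtype=float)
    return np.mean(np.abs(np.diff(amplitudes))) / np.mean(amplitudes)

# Hypothetical per-cycle measurements from one vowel segment:
# pitch periods in seconds, and the peak amplitude of each cycle.
periods = [0.0080, 0.0082, 0.0079, 0.0083, 0.0081]
amps = [0.71, 0.69, 0.73, 0.68, 0.72]
print(f"jitter:  {jitter(periods):.2%}")   # pitch instability
print(f"shimmer: {shimmer(amps):.2%}")     # loudness instability
```

Higher values on both measures, per Espy-Wilson's findings, would count as evidence toward a depressive state.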
Philip Resnik’s contribution to the developing system isn’t concerned with how people sound but with what they say, though what people think they’re saying and what the system hears will likely be very different.
Just as an email system learns to look for certain words to filter out scam messages, systems can learn features of language that tend to be associated with depression. One example might be language indicating low energy levels or emotional stress, says Resnik, a professor of linguistics with a joint appointment in the Institute for Advanced Computer Studies. His research in a cutting-edge statistical language-analysis technique known as topic modeling could help the app go a step further and uncover telling patterns hidden deep in language.
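Resnik's actual models aren't described in the article, but the spam-filter analogy maps onto a standard text-classification pipeline: turn text into word-count features and train a classifier on labeled examples. A toy sketch using scikit-learn, with invented sentences and labels (a real screening system would be trained on clinical data, and would layer topic modeling on top):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented example texts and labels, for illustration only.
texts = [
    "I feel exhausted and can't get out of bed",
    "everything seems pointless lately",
    "had a great run and dinner with friends",
    "excited about the trip we planned this weekend",
]
labels = [1, 1, 0, 0]  # 1 = language patterns flagged as concerning

# Bag-of-words features feeding a linear classifier, spam-filter style.
model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict_proba(["so tired, no energy for anything"]))
```

The key property Resnik points to is that such a model scores what a person actually says, not what they report about themselves.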
“One of the values in creating a system like this eventually is that it does not depend on self-reporting, because people have biases and sometimes they lack the awareness to self-report accurately,” he says.
In the Hybrid System Integration and Simulation Lab, Vaughn-Cooke has been gathering physiological feedback from research subjects as they answer a series of questions like “What was the saddest part of your day?” Computer analysis of facial expressions for emotion, heart rate data, breathing rates—all of it will eventually feed into the system alongside the audiological and linguistic signals uncovered by Espy-Wilson’s and Resnik’s research.
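The article doesn't specify how these signals will be combined. One common approach to this kind of integration is "late fusion": reduce each modality to a few summary numbers, then merge them into a single score. A minimal sketch under that assumption, with entirely hypothetical feature names, values, and weights:

```python
import math

# Hypothetical summary features for one screening session. The names,
# values, and weights below are illustrative; the article does not
# describe the UMD system's actual feature set or fusion method.
session = {
    "jitter": 0.034,            # acoustic (Espy-Wilson)
    "shimmer": 0.051,           # acoustic
    "speech_rate_wps": 1.9,     # words per second (slower speech)
    "text_score": 0.63,         # language-analysis output (Resnik)
    "mean_heart_rate": 88.0,    # physiological (Vaughn-Cooke)
    "breathing_rate": 17.0,
    "facial_negativity": 0.41,  # from facial-expression analysis
}

# Late fusion: a weighted sum squashed to a 0-1 risk score. In practice
# these weights would be learned from labeled screening sessions.
weights = {
    "jitter": 8.0, "shimmer": 4.0, "speech_rate_wps": -0.3,
    "text_score": 1.5, "mean_heart_rate": 0.01,
    "breathing_rate": 0.02, "facial_negativity": 1.0,
}
bias = -2.5

z = bias + sum(weights[k] * v for k, v in session.items())
risk = 1 / (1 + math.exp(-z))
print(f"estimated risk score: {risk:.2f}")  # alert a clinician above some threshold
```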
This broad, integrated approach to using technology to assess mental health is the first of its kind, and could one day improve treatment for the roughly seven percent of U.S. adults who, the National Institute of Mental Health estimates, experience major depression each year.
“Ultimately, we’re beginning this collaboration to triangulate,” Resnik says. “We’re coming at this question from multiple directions.”