How A Machine Learned To Spot Depression
I'm in a booth with a computer program called Ellie. She's on a screen in front of me.
Ellie was designed to diagnose post-traumatic stress disorder and depression, and when I get into the booth she starts asking me questions — about my family, my feelings, my biggest regrets.
Emotions seem messy and hard for a machine to read. But Skip Rizzo, a psychologist who helped design Ellie, thought otherwise.
When I answer Ellie's questions, she listens. But she doesn't process the words I'm saying. She analyzes my tone. A camera tracks every detail of my facial expressions.
"Contrary to popular belief, depressed people smile as many times as non-depressed people," Rizzo says. "But their smiles are less robust and of less duration. It's almost like polite smiles rather than real, robust, coming from your inner-soul type of a smile."
Ellie compares my smile to a database of soldiers who have returned from combat. Is my smile genuine? Is it forced?
Ellie also listens for pauses. She watches to see whether I look off to the side or down. If I lean forward, she notices.
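The cues described here, smile strength and duration, pauses, averted gaze, can be imagined as features feeding a simple score. Below is a minimal, hypothetical sketch in Python; the feature names, weights, and values are invented for illustration and are not the actual model behind Ellie.

```python
# Hypothetical sketch: combining nonverbal cues into a single distress score.
# Feature names, weights, and threshold are illustrative only.

def distress_score(features: dict) -> float:
    """Weighted sum of nonverbal indicators; higher suggests more distress."""
    weights = {
        "smile_intensity": -0.4,  # weaker smiles raise the score
        "smile_duration": -0.3,   # shorter smiles raise the score
        "pause_ratio": 0.5,       # longer pauses raise the score
        "gaze_averted": 0.4,      # looking down or away raises the score
    }
    return sum(weights[name] * features.get(name, 0.0) for name in weights)

# Example subject with "polite" smiles, long pauses, and averted gaze
# (all values are made-up fractions between 0 and 1).
subject = {
    "smile_intensity": 0.2,
    "smile_duration": 0.3,
    "pause_ratio": 0.6,
    "gaze_averted": 0.7,
}

print(round(distress_score(subject), 2))  # prints 0.41
```

A real system would extract such features from video and audio and learn the weights from labeled interviews, such as the database of returning soldiers mentioned above; this toy version only shows the shape of the idea.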
All this analysis seems to work: In studies, Ellie could detect signs of PTSD and depression about as well as a large pool of psychologists.
Jody Mitic served with the Canadian Forces in Afghanistan. He lost both of his feet to a bomb. And Mitic remembers that Ellie's robot-ness helped him open up.
"Ellie seemed to just be listening," Mitic says. "A lot of therapists, you can see it in their eyes, when you start talking about some of the grislier details of stuff that you might have seen or done, they are having a reaction."
With Ellie, he says, he didn't have that problem.
Right now, Ellie is strictly for diagnosis. The idea is that once Ellie is out in the field, she'll identify soldiers who are having a problem, and a human will take it from there.
Copyright 2020 NPR. To see more, visit https://www.npr.org.