Behavioral Healthcare Executive

Patients share more with computers than clinicians, but that can be good

August 8, 2014
by Julie Miller
Virtual humans can measure verbal and nonverbal responses

Post-traumatic stress disorder (PTSD) is underdiagnosed and undertreated among service men and women, according to Patrick B. McGrath, PhD, director of the Alexian Brothers Center for Anxiety and Obsessive Compulsive Disorders in Hoffman Estates, Ill. Part of the reason is that many are hesitant to talk about their symptoms. They worry they will be seen as weak.

“In people who have been in service, there’s a reluctance to talk to people who have not been in service because there’s a belief that ‘you just can’t understand what I’ve been through,’” McGrath says. “And it’s not all that different from what we see with other diagnoses.”

But what if an unlikely tool could help military personnel begin the discussion that could lead to improved diagnosis and treatment? It would seem like a perfect solution. And what if that tool was a virtual human—a computer programmed to act like a person: speaking, sympathizing, prompting and reading body language?

It might sound chilling—a bit too much like the classic movie “2001: A Space Odyssey,” with the computer named “HAL” that had a mind of its own. Obviously the movie was pure fiction, but virtual humans built with today’s technology show potential to help clinicians and their PTSD patients.

Virtual human

According to a study by the University of Southern California (USC), patients are more willing to disclose their depression and PTSD symptoms when talking to computerized virtual humans than when talking to real humans. While the virtual human obviously can’t take the place of a clinician in diagnosis and treatment, it can be a tool to help patients start talking.

Participants in the USC study were interviewed by a virtual human able to interpret not just the content of what they said but also their tone of voice and nonverbal cues. They were asked questions about their sleeping habits, their mood and their mental health. In these intake interviews, people were more honest about their symptoms, no matter how potentially embarrassing, when they believed that a human observer wasn’t listening.

The study was funded by the Defense Advanced Research Projects Agency and the U.S. Army.

“All the research is suggesting that even though the information has to be released to the overseeing physician eventually, when the responses are unobserved in the moment where the person has responded, patients are still willing to share more information than if a human were watching them give the information,” says Gale Lucas, a social psychologist at USC’s Institute for Creative Technologies, who led the study.

Unlike previous research that might compare paper documents to live interviews, Lucas says, the USC study was able to isolate the impact of speaking anonymously and make a direct connection. The only variable was whether the subjects believed a human was watching them in real time. Those who believed no one was listening spoke up more about depression and PTSD symptoms.

“We really isolated that it’s the impact of being unobserved that’s leading to this outcome,” Lucas says.

Draw the line

There’s a limit to what humanlike computers can do, of course, and they’re certainly not going to replace trained clinicians. Virtual humans, for example, can’t make a judgment call on potential suicidal ideation, but they can give clinicians feedback based on specific, measurable thresholds. For example, they can use audio input to measure the level of distress in a patient’s voice and the content of what the patient says, and quantify those readings against a baseline.
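The article doesn’t describe USC’s actual software, but the baseline-comparison idea can be sketched in a few lines. In this hypothetical example, a numeric distress score (however it is derived from audio) is flagged for clinician review when it rises well above a patient’s own baseline; the function name, scores and threshold are all illustrative assumptions, not part of the study.

```python
# Hypothetical sketch: flag distress readings that exceed a patient's baseline.
# The scores, baseline statistics and 2-standard-deviation threshold are
# illustrative assumptions, not details from the USC system.
def flag_for_review(distress_scores, baseline_mean, baseline_std, threshold=2.0):
    """Return indices of readings more than `threshold` standard
    deviations above the patient's baseline distress level."""
    return [
        i for i, score in enumerate(distress_scores)
        if (score - baseline_mean) / baseline_std > threshold
    ]

# A patient with baseline mean 0.3 and std 0.1: only the second
# reading (0.62, i.e. 3.2 standard deviations above baseline) is flagged.
print(flag_for_review([0.35, 0.62], 0.3, 0.1))  # -> [1]
```

The point of such a design is the division of labor Lucas describes: the computer surfaces quantifiable deviations, and the clinician makes the judgment call.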

However, the nature of human interaction is incredibly complex—far too complex for a computer to interpret with 100% accuracy. In the USC study, patients were aware that the virtual humans could misinterpret the practical meaning of what was said, but found benefit in using the tool.

“In using these tools, we want to recognize their limitations and still reap the benefit,” Lucas says. “Let the virtual human do its job, and let the physicians do their jobs, and let each do what they’re good at.”

Other applications

Computer-based interaction is already in use in behavioral health facilities. The Alexian Brothers Center for Anxiety and Obsessive Compulsive Disorders offers a different type of virtual reality program, used for treating PTSD. In its program, individuals use headsets that offer a 360-degree view of a simulated scene—such as a Humvee driving through a desert landscape—which they control with a device of the same shape and weight as a standard military machine gun. Patients experience virtual situations that have the potential to stir up their anxiety. McGrath says the goal is for anxiety to decrease over time through the habituation process, and some patients notice relief after about 15 one-hour sessions.

“It’s awesome to me when someone takes off the headset and says, ‘I’m getting kind of bored with this,’” McGrath says.

He is familiar with the Institute for Creative Technologies at USC and says researchers are always trying to improve the virtual reality programs, driving toward the next useful innovation.