
Patients share more with computers than with clinicians, and that can be good

Post-traumatic stress disorder (PTSD) is underdiagnosed and undertreated among service men and women, according to Patrick B. McGrath, PhD, director of the Alexian Brothers Center for Anxiety and Obsessive Compulsive Disorders in Hoffman Estates, Ill. Part of the reason is that many are hesitant to talk about their symptoms; they worry they will be seen as weak.

“In people who have been in service, there’s a reluctance to talk to people who have not been in service because there’s a belief that ‘you just can’t understand what I’ve been through,’” McGrath says. “And it’s not all that different from what we see with other diagnoses.”

But what if an unlikely tool could help military personnel begin the discussion that could lead to improved diagnosis and treatment? It would seem like a perfect solution. But what if that tool were a virtual human—a computer programmed to act like a person, speaking, sympathizing, prompting and reading body language?

It might sound chilling—a bit too much like the classic movie “2001: A Space Odyssey,” with the computer named “HAL” that had a mind of its own. Obviously the movie was pure fiction, but virtual humans built with today’s technology show potential to help clinicians and their PTSD patients.

Virtual human

According to a study by the University of Southern California (USC), patients are more willing to disclose their depression and PTSD symptoms when talking to computerized virtual humans than when talking to real humans. While the virtual human obviously can’t take the place of a clinician in diagnosis and treatment, it can be a tool to help patients start talking.

Participants in the USC study were interviewed by a virtual human that could interpret not just the content of what the subjects said but also their tone of voice and nonverbal cues. In intake interviews, they were asked about their sleeping habits, their mood and their mental health, and they were more honest about their symptoms, no matter how potentially embarrassing, when they believed that a human observer wasn’t listening.

Source: Department of Defense

The study was funded by the Defense Advanced Research Projects Agency and the U.S. Army.

“All the research is suggesting that even though the information has to be released to the overseeing physician eventually, when the responses are unobserved in the moment where the person has responded, patients are still willing to share more information than if a human were watching them give the information,” says Gale Lucas, a social psychologist at USC’s Institute for Creative Technologies, who led the study.

Unlike previous research that might compare paper documents to live interviews, Lucas says, the USC study was able to isolate the impact of speaking anonymously and make a direct connection. The only variable was whether the subjects believed a human was watching them in real time. Those who believed no one was listening spoke up more about depression and PTSD symptoms.

“We really isolated that it’s the impact of being unobserved that’s leading to this outcome,” Lucas says.

Draw the line

There’s a limit to what humanlike computers can do, of course, and they’re certainly not going to replace trained clinicians. Virtual humans, for example, can’t make a judgment call on potential suicidal ideation, but they can give clinicians feedback based on specific, measurable thresholds. For example, they can use audio input to measure the level of distress in a patient’s voice and the content of what the patient says, and quantify those readings against a baseline.
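To make that concrete, here is a minimal sketch of that kind of threshold-based flagging. The feature names, weights and the two-standard-deviation cutoff are invented for illustration, not details of the USC system; it shows only how a session’s distress reading might be quantified against a baseline before a clinician is alerted.

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class VoiceFeatures:
    """Hypothetical per-session acoustic measurements."""
    pitch_variability: float  # e.g., std dev of pitch in semitones
    speech_rate: float        # syllables per second
    pause_ratio: float        # fraction of the session that is silence

def distress_score(f: VoiceFeatures) -> float:
    # Collapse the features into one number. The weights are
    # illustrative assumptions, not values from the USC system.
    return 0.5 * f.pitch_variability + 0.3 * f.pause_ratio - 0.2 * f.speech_rate

def flag_for_review(session: VoiceFeatures, baseline_scores: list[float],
                    z_threshold: float = 2.0) -> bool:
    # Flag the session if its score sits well above the baseline
    # distribution (here, more than two standard deviations).
    mu, sigma = mean(baseline_scores), stdev(baseline_scores)
    return (distress_score(session) - mu) / sigma > z_threshold

# Baseline scores from earlier interviews, then one new session.
baseline = [0.8, 1.1, 0.9, 1.0, 1.2, 0.7, 1.05]
session = VoiceFeatures(pitch_variability=4.2, speech_rate=2.1, pause_ratio=0.35)
print(flag_for_review(session, baseline))  # True: surface to a clinician
```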

However, the nature of human interaction is incredibly complex—far too complex for a computer to interpret with 100% accuracy. In the USC study, patients were aware that the virtual humans could misinterpret the practical meaning of what was said, but they still found benefit in using the tool.

“In using these tools, we want to recognize their limitations and still reap the benefit,” Lucas says. “Let the virtual human do its job, and let the physicians do their jobs, and let each do what they’re good at.”

Other applications

Computer interaction is currently in use in behavioral health facilities. The Alexian Brothers Center for Anxiety and Obsessive Compulsive Disorders offers a different type of virtual reality program, which is used for treating PTSD. In its program, individuals use headsets that offer a 360-degree view of a simulated scene—such as a Humvee driving through a desert landscape—which they control with a device that has the same shape and weight as a standard military machine gun. Patients experience virtual situations that have the potential to stir up their anxiety. McGrath says the goal is for anxiety to decrease over time through habituation, and some patients notice relief after about 15 one-hour sessions.

“It’s awesome to me when someone takes off the headset and says, ‘I’m getting kind of bored with this,’” McGrath says.

He is familiar with the Institute for Creative Technologies at USC and says researchers are always trying to improve the virtual reality programs, driving toward the next useful innovation.

State of the art

Computers that recognize natural, spoken language have long been used in industries that rely on telephone interactions, such as customer service centers. But when it comes to emotionally based responses, natural language carries far more nuance. In other words, the spoken response “yes” is much easier for a computer to discern than “I’m having vivid nightmares.”

Lucas says the virtual human technology of today far surpasses the older customer-service call technologies that understood only limited, preprogrammed responses. Innovations today include empathetic responses, for example: a patient might say he has nightmares, to which the virtual human might reply, “That must be very hard for you.”
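As a toy illustration of the gap between a rigid call-center menu and an empathetic response, even a simple rule-based responder can attach a sympathetic reply to free-form wording. The keywords and replies below are invented for illustration and are not the USC system, which uses far richer language and affect models than keyword matching.

```python
# Invented keyword-to-reply rules; purely illustrative.
EMPATHY_RULES = [
    ({"nightmare", "nightmares"}, "That must be very hard for you."),
    ({"sleep", "sleeping", "insomnia"}, "It sounds like rest has been difficult lately."),
    ({"anxious", "anxiety"}, "Thank you for telling me. Can you say more about that?"),
]

def respond(utterance: str) -> str:
    words = set(utterance.lower().split())
    for keywords, reply in EMPATHY_RULES:
        if words & keywords:  # any trigger word present in the utterance
            return reply
    return "I see. Please go on."  # neutral prompt keeps the patient talking

print(respond("I'm having vivid nightmares"))  # That must be very hard for you.
```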

Part of the reason patients are more willing to disclose to a computer, Lucas says, is that the patient is essentially safe from the additional anxiety of making a human listener feel sad or horrified when sharing heart-wrenching stories of traumatic events. There’s no such risk with a computer. Additionally, a computer doesn’t make a personal judgment about the patient’s character or weaknesses, as a human listener in the real world might.

After all, service men and women are trained to be resilient and not show emotion.

“When you’re in the military, that makes a lot of sense, but in civilian life, that can cause a lot of problems,” McGrath says.


Data collection

USC’s technology was designed to produce structured data. Data collected by the virtual interaction is processed into a report that helps clinicians estimate the likelihood of PTSD. For example, a patient who has a 90% likelihood of having or developing PTSD might be referred for diagnosis and treatment. USC has collected more than 500 interactions into the baseline data from which the reports are built, Lucas says.
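A hypothetical sketch of that referral step follows. The report fields, patient ID and triage wording are assumptions; only the 90% cutoff comes from the article, and the likelihood estimate itself would come from a model built on the baseline interviews, which is out of scope here.

```python
from dataclasses import dataclass

@dataclass
class ScreeningReport:
    patient_id: str
    ptsd_likelihood: float  # 0.0-1.0, estimated from the interview data

REFERRAL_THRESHOLD = 0.90  # the 90% figure cited above

def triage(report: ScreeningReport) -> str:
    # Route the structured report to the overseeing clinician.
    if report.ptsd_likelihood >= REFERRAL_THRESHOLD:
        return f"{report.patient_id}: refer for clinical diagnosis and treatment"
    return f"{report.patient_id}: below threshold; retain for routine follow-up"

print(triage(ScreeningReport("A-103", 0.93)))  # hypothetical patient ID
```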

The USC study also brought to light the possible broader implications of virtual human screening tools: Patients appeared to be more comfortable disclosing information overall, even after the interview session was over. Lucas says the virtual human allows patients to experience opening up about their difficulties, which might allow them to become more at ease with opening up in other situations as well.

While a study might provide promising results, the true test will be real-world use. The virtual human is currently being applied in Colorado, where service men and women were screened for depression and PTSD prior to deployment. When they return in several months, their assessments will include virtual human interviews to help screeners identify psychological distress.
