

Using AI to Build a Bridge Between Men and Mental Health Care

By Athena Robinson, PhD

Traditional conceptualizations of people who self-identify as men often paint a picture of individuals devoid of negative emotions, particularly sadness, loneliness, and grief. The reality, however, is that 6 million men in the United States are affected by depression every year, men are at increased risk for suicide and substance use disorder, and men are less likely to seek mental health treatment. The proof that this demographic needs mental health care is staring us all in the face.

But why are so many men not asking for help from those qualified to give it? In part, because the gender binary has been constructed such that, in many cultures, those identifying as male often feel pressure to conform to certain norms of what it means to “be a man.” Stigma and shame also play a role in thwarting treatment-seeking behaviors. Relatedly, recent research has linked some aspects of masculine norms to a greater risk of suicidal thoughts. Indeed, conforming to archetypal masculine norms can be extremely limiting when it comes to self-expression and self-care, 2 things that are essential to anyone’s mental and physical health, no matter their gender identity.

As we deepen our understanding of the growing emotional impact of the COVID-19 pandemic, men with this risk profile need a trusted, accessible outlet to support them as they navigate distressing emotions. Because artificial intelligence (AI)-driven solutions are inherently nonhuman and nonjudgmental, they have the potential, when evidence-based and empirically tested, to offer a new way to seek mental health care, especially for those facing stigma or feeling shame around their mental illness.

AI as a stepping stone for overcoming stigma

Working in tandem with a wide variety of mental health advocacy efforts, AI will be one of our most powerful tools for breaking down stigma-driven barriers. AI doesn’t judge. AI doesn’t feel or think anything. AI can be a tool to help people sort through their thoughts and emotions and can teach strategies for identifying and moving past distorted thought patterns. In fact, research has found that, when interacting with a computer, people report lower fear of self-disclosure and lower impression management, display their sadness more intensely, and are rated by observers as more willing to disclose.

In further support of this, a May 2021 study challenged the traditionally held notion that digital mental health interventions are, by definition, limited because they do not involve a human provider. The study reached the following 3 key conclusions:

  • A well-designed relational agent can develop a rapport with users similar to the kind of rapport seen in traditional human-delivered treatment.

  • Users feel this bond with the relational agent within as few as 3 days.

  • The therapeutic bond holds steady over time.

Considered a foundational aspect of all health care delivery and a necessary condition for change, the therapeutic bond within a mental health context is measured as an element of the Working Alliance Inventory-Short Revised (WAI-SR). The WAI-SR assesses 3 key aspects of the therapeutic alliance: agreement on the tasks of therapy; agreement on the goals of therapy; and, as explored in this study, development of an affective bond.

The formation of a bond between people and relational agents suggests that this form of digital therapeutic can be a viable gateway to mental health care support for men specifically. Relational agents can augment and enrich the therapeutic experience for both patient and therapist, break down accessibility barriers, relieve pressure on our overburdened mental health care system, and more.

Yet, the technology’s greatest promise may be as a bridge to a future where people of all gender identities can access mental health care, without stigma or shame. The ability to establish a bond, and to do so with millions of people simultaneously, is the secret to unlocking the potential of digital therapeutics like never before.


Athena Robinson, PhD, is Chief Clinical Officer at Woebot Health and Adjunct Clinical Associate Professor at Stanford’s School of Medicine. At Woebot Health, Athena oversees the company’s regulatory strategy and overall program of research, as well as the empirically supported psychotherapeutic underpinnings of the Woebot Health products.

The views expressed on this blog are solely those of the blog post author and do not necessarily reflect the views of the Psychiatry & Behavioral Health Learning Network or other Network authors. Blog entries are not medical advice.
