Mental Health

3 ways AI could help our mental health

Britain's Prince Harry has campaigned against the stigma of mental illness. Image: REUTERS/Ben Birchall

Charlotte Stix
Research Fellow, Eindhoven University of Technology

Around 1 in 5 adults in the United States experience mental health difficulties at some point in their lives. And it's not just the US: poor mental health is a global issue, with 83 million people affected in Europe alone. What if we could harness Artificial Intelligence (AI) to address this?

AI is increasingly hyped as a silver bullet applicable to almost every area, from boosting economic prosperity to solving complex global issues. While its overall impact remains to be seen, the case for using AI in mental health is surprisingly encouraging, backed by medical studies and pilot programmes.

Mental health in the US. Image: National Alliance on Mental Illness

In its current form, AI is still merely a support mechanism. But looking towards the future, its impact could be significant, provided that further research is supported and shortcomings such as unclear data usage, misdiagnosis and privacy concerns are addressed.

Let's look at three particularly noteworthy benefits of AI.

Early detection

Early detection of mental health difficulties is of crucial importance to the prompt and successful treatment of the patient. AI can already detect markers that indicate a high probability of cancer at very early stages. What if AI could flag up similar warning signs about your mental health, simply by listening to you?

Traditional practice in mental health largely relies on the individual to observe and self-report indicative changes, alongside the observations of mental health professionals. AI could notice relevant symptoms and act as an early detection mechanism, as demonstrated by two recent case studies.

Veterans are considered a typical high-risk group for developing mental health difficulties. To catch these developments early on, Cogito – a company funded by the Defense Advanced Research Projects Agency – teamed up with the US Department of Veterans Affairs to trial an app that monitors veterans' mental health.

The app itself, called Companion, passively monitored a veteran's phone 24/7, listening to the sound of the user's voice and tracking how often they used their phone. Changes in inflection, vocal energy and pitch, and the amount spoken, together with phone usage patterns, gave the app a variety of behavioural indicators. The AI system then used these indicators to detect significant changes in the user's mental health.
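To make the idea of behavioural indicators concrete, here is a minimal sketch of how daily voice and usage signals could be compared against a user's own baseline to flag meaningful changes. This is not Cogito's actual algorithm; the signal names and the two-standard-deviation rule are illustrative assumptions.

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class DailySignals:
    """Hypothetical daily indicators -- not Cogito's real data schema."""
    minutes_spoken: float        # total time the user spoke that day
    pitch_variability: float     # spread of vocal pitch/energy
    outgoing_interactions: int   # calls and messages the user initiated

def deviation_flags(history: list[DailySignals],
                    today: DailySignals,
                    threshold: float = 2.0) -> dict[str, bool]:
    """Flag any indicator that drifts more than `threshold` standard
    deviations from the user's own recent baseline (needs >= 2 days of history)."""
    flags = {}
    for field in ("minutes_spoken", "pitch_variability", "outgoing_interactions"):
        baseline = [float(getattr(day, field)) for day in history]
        mu, sigma = mean(baseline), stdev(baseline)
        value = float(getattr(today, field))
        flags[field] = sigma > 0 and abs(value - mu) > threshold * sigma
    return flags
```

In practice, a production system would combine signals like these in a learned model rather than a fixed threshold rule, but the principle of comparing a person against their own baseline is the same.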

Similar to Cogito, IBM also harnesses AI as an early detection mechanism for mental health. In two studies, IBM's Computational Psychiatry and Neuroimaging group, working alongside several universities, aimed to predict the onset of psychosis in patients. They built an AI system that detected differences in speech patterns between high-risk patients who went on to develop psychosis and those who did not. To do this, they used a method called Natural Language Processing (NLP): the system analysed patients' speech for indicators such as the coherence of speech and ideas, and then built a predictive model for the onset of psychosis. After training the system over two studies, IBM achieved a remarkable 83% retrospective accuracy of detection in the second study group. It was a quantifiable demonstration of the power of listening.
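One of the features such work measures is how semantically connected consecutive sentences are. The sketch below is a simplified stand-in for that idea: it scores adjacent sentences with a crude bag-of-words cosine similarity, whereas the published studies relied on richer semantic embeddings and a trained classifier.

```python
import math
import re
from collections import Counter

def _bag_of_words(sentence: str) -> Counter:
    """Crude word-count vector; the real studies used semantic embeddings."""
    return Counter(re.findall(r"[a-z']+", sentence.lower()))

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(count * b[word] for word, count in a.items())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def coherence_features(sentences: list[str]) -> dict[str, float]:
    """Minimum and mean similarity between consecutive sentences;
    low values suggest loosely connected, tangential speech."""
    similarities = [_cosine(_bag_of_words(a), _bag_of_words(b))
                    for a, b in zip(sentences, sentences[1:])]
    return {
        "min_coherence": min(similarities),
        "mean_coherence": sum(similarities) / len(similarities),
    }
```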

Easy accessibility

In 2014, approximately 45% of the world's population lived in a country with less than one psychiatrist per 100,000 people. It is clear that access to treatment is a luxury that many people around the globe do not have or cannot afford.

On top of improving access to mental health treatment, AI can play a big role in personalising it. Ginger.io, for example, does both: an online platform that uses AI and machine learning alongside a staffed clinical network, it tailors its suggestions to the needs of each user and provides access to a variety of treatments.

The algorithm might, for example, suggest that the most suitable course of action is cognitive behavioural therapy (CBT). CBT is a popular talking therapy that helps you reframe the way you think and behave, changing the way you approach problems. It usually requires several visits to a professional over an extended period, which may be out of reach depending on where the user lives. Depending on the severity of symptoms, the user might instead be referred to mindfulness training or resilience training, or escalated to a licensed therapist or board-certified psychiatrist.
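As a purely illustrative example of such a triage step, a simple rule could map a standard screening score, such as the PHQ-9 depression questionnaire, onto the pathways described above. This is not Ginger.io's actual logic and not clinical guidance.

```python
def suggest_pathway(phq9_score: int, crisis_flagged: bool) -> str:
    """Illustrative triage sketch only -- not Ginger.io's real algorithm
    and not clinical advice. PHQ-9 is a standard 0-27 depression screen."""
    if crisis_flagged or phq9_score >= 20:
        return "escalate to a licensed therapist or board-certified psychiatrist"
    if phq9_score >= 10:
        return "guided cognitive behavioural therapy (CBT) programme"
    if phq9_score >= 5:
        return "mindfulness or resilience training"
    return "self-guided well-being content and periodic check-ins"
```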

All signs indicate that AI is set to become a key driver in lowering barriers of access to advice, services and personalised treatment.

Lowered fear of stigma

Stigma surrounding mental health can act as a strong deterrent to seeking help and speaking out. Some of those affected may not wish to discuss their situation with others, including trained professionals, for fear of being stigmatised. In the long run, this can make their situation worse.

Unlike a fellow human, an AI does not necessarily form part of any wider social context with all the associated cultural norms and expectations. It is therefore likely to be perceived as non-judgemental, non-opinionated and broadly neutral.

The opportunity to confide in an AI system has been within reach for quite some time. ELIZA, a basic natural language processing program developed in 1966, re-enacted the behaviour and responses of a psychotherapist. An early predecessor of many subsequent chatbots, it pursued aims that can be seen as closely aligned with those of Woebot.
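ELIZA worked by matching simple textual patterns in what the user typed and reflecting them back as questions. A few lines of Python are enough to reproduce the flavour of that idea; this is a toy reconstruction, not Weizenbaum's original script.

```python
import random
import re

# A handful of ELIZA-style reflection rules (a toy subset, not the original DOCTOR script).
RULES = [
    (re.compile(r"\bi feel (.+)", re.I),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"\bi am (.+)", re.I),
     ["What makes you say you are {0}?"]),
    (re.compile(r"\bbecause (.+)", re.I),
     ["Is that the real reason?"]),
]

def respond(utterance: str) -> str:
    """Return the first matching reflection, or a neutral prompt."""
    for pattern, templates in RULES:
        match = pattern.search(utterance)
        if match:
            return random.choice(templates).format(match.group(1).rstrip(".!?"))
    return "Please tell me more."

print(respond("I feel anxious about work."))
# e.g. "Why do you feel anxious about work?"
```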

Woebot works in a similar way to an instant messaging app. Created by clinical research psychologist Dr Alison Darcy and offered through Facebook Messenger, Woebot aims to replicate the open ear of a trained professional. Through repeated conversations, it learns about the individual and tailors its questions to their situation.

Woebot doesn't tire of lengthy conversation, is always available to listen and, most importantly, is perceived as non-judgemental, no matter what thoughts and worries the user expresses. In this light, Woebot can contribute to increased well-being by reducing isolation, providing an instant channel of communication and allowing for anonymous self-expression.

AI may not yet be a silver bullet for mental health, but it shows every sign of making a significant contribution to the field.
