Wysa

AI detects 82% of mental health app users in crisis, finds Wysa's global study on the role of AI in detecting and managing distress


15 April 2024, Boston: Data released today from Wysa’s AI mental health platform reveals the crucial new role that AI plays in mental health crisis mitigation. Wysa studied anonymised data from 19,000 users across 99 countries to understand SOS instances and user behaviour after the escalation pathway feature was triggered. The research follows the company reaching a milestone of 400 Wysa users publicly stating that the app saved their lives.

The global study found that 1 in 20 Wysa users (5.2%) reported a crisis instance on the app within one year. In 82% of these instances, Wysa’s AI detected the crisis and the user confirmed that they were having thoughts of self-harm or suicide; this confirmation escalated the user to Wysa’s SOS interventions. The remaining 18% of crisis instances were self-reported by users, who were then able to access best-practice crisis resources on the app such as helplines, a safety plan and exercises that could help with grounding.

When a crisis instance was confirmed, just 2.4% of people chose to call helplines when repeatedly encouraged to do so by the app. This indicates that even in moments of distress, few people feel ready to reach out for professional help, highlighting the benefit of alternative means of support in times of crisis. 

The personal safety plan, co-created on the app between Wysa and the user and containing vital contacts, reasons to live, calming activities, and warning signs, emerged as the most-used SOS resource. It was selected by 49.2% of users in crisis, underlining its effectiveness in fostering resilience and self-reflection.

Grounding exercises were used by 46.6% of users in crisis. These exercises, which include breathing and mindfulness techniques, help users refocus on the present, increasing their awareness and putting their minds at ease.

One-third of crisis instances occurred during the working day. SOS triggers remained high after the typical workday ended, with 31% occurring between 6 pm and 12 am and 28.1% after midnight. The SOS feature was used least, at just 5%, in the morning between 7 am and 9 am, before the workday begins. The data shows that a sense of hopelessness and worthlessness is a key determinant of whether someone reaches out for help (in 59% of cases), and this rises as the day goes on.

Jo Aggarwal, CEO and founder of Wysa, said: “This research sheds light on the role AI can play in mitigating mental health crises. It’s evident that AI serves a unique purpose by providing a psychological safe space for people to express concerns. For many individuals, AI conversations serve as a crucial support when they’re not comfortable discussing their worries with another person.”

Wysa is dedicated to upholding the highest standards in data protection, privacy, and clinical safety in all operational aspects, especially in managing crises related to self-harm or suicidal ideation among users. Its support interventions, conversations, and clinical flows are meticulously managed by experienced clinicians and psychologists. 

One user said: “Wysa sets a safety plan and helps you cope. This app made me feel better in a matter of three minutes. It made sure to check in, it was helpful, and it helped me calm down more and sleep better.”

The World Health Organization emphasises the importance of recognising suicide as a public health concern. This responsibility extends to employers, as they play a vital role in providing accessible, quality mental health support, especially considering that over 60% of the global population is engaged in work. Deloitte estimates that poor employee mental health costs UK employers between £42bn and £45bn each year.

Read the full report at: https://www.wysa.com/role-of-ai-in-sos


Wysa’s SOS Protocol 

Wysa’s SOS feature is designed to provide immediate support to individuals experiencing distress, employing a multifaceted approach for crisis detection.

The AI system is capable of identifying potential crisis situations during conversations with users, especially when there is a mention of suicidal ideation, trauma, abuse, or self-harm. In such instances, the AI seeks to confirm the risk, ensuring an accurate and sensitive response. 

Additionally, Wysa incorporates clinical screenings such as the PHQ (Patient Health Questionnaire) scale for depressive symptoms and the GAD (Generalised Anxiety Disorder) standardised questionnaire for anxiety, where harm-risk detection is triggered by indications of suicidal thoughts from the user. Wysa also gives users a direct avenue for seeking help through a user-initiated SOS button, readily available on the app’s home screen, which enables them to actively signal for assistance during a crisis.
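The three trigger paths described above (in-conversation detection with user confirmation, clinical screening responses, and the user-initiated SOS button) can be sketched roughly as follows. This is an illustrative sketch only: all names are hypothetical, Wysa’s production AI assesses risk from whole conversations rather than a keyword list, and the PHQ-9’s ninth item (thoughts of self-harm) is assumed here as the screening trigger.

```python
# Illustrative sketch of the three SOS trigger paths; names are hypothetical.
CRISIS_CUES = {"suicide", "suicidal", "self-harm", "abuse", "trauma"}

def detect_crisis(message: str,
                  phq_item9_score: int = 0,
                  sos_button_pressed: bool = False):
    """Return (triggered, needs_confirmation) for a single user turn.

    - sos_button_pressed: user tapped the SOS button on the home screen.
    - phq_item9_score: response to the PHQ-9 self-harm item (> 0 flags risk).
    - message: free-text chat turn, scanned here with a toy keyword match.
    """
    if sos_button_pressed:
        return True, False   # user-initiated: escalate directly
    if phq_item9_score > 0:
        return True, False   # screening indicated suicidal thoughts
    if set(message.lower().split()) & CRISIS_CUES:
        return True, True    # AI-detected: confirm sensitively with the user
    return False, False
```

Note that only the AI-detected path requires confirmation, matching the release’s point that the AI "seeks to confirm the risk" before escalating.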

Once the SOS feature is triggered, Wysa responds by providing a list of local helpline resources. The app encourages individuals to contact these resources immediately, following up to check if the user has made contact. The feature further supports users by building a personal safety plan and guiding them through grounding exercises, offering a holistic approach to crisis intervention.
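Under the description above, the post-trigger flow can be sketched as an ordered sequence of interventions. This is a simplified illustration under stated assumptions; the function and step names are not Wysa’s API.

```python
def sos_flow(helplines, safety_plan, grounding_exercises):
    """Yield SOS interventions in the order the protocol describes."""
    # 1. Surface local helpline resources and encourage immediate contact.
    yield ("helplines", helplines)
    # 2. Follow up to check whether the user made contact.
    yield ("follow_up", "Were you able to reach out to a helpline?")
    # 3. Offer the co-created personal safety plan.
    yield ("safety_plan", safety_plan)
    # 4. Guide the user through grounding exercises.
    yield ("grounding", grounding_exercises)

steps = list(sos_flow(["Samaritans: 116 123"],
                      {"contacts": [], "reasons_to_live": []},
                      ["box breathing", "5-4-3-2-1 senses"]))
```

The ordering reflects the release’s data: helplines are offered first but taken up by few users (2.4%), while the safety plan (49.2%) and grounding exercises (46.6%) see most of the engagement.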

Contact

Francesca Baker

francesca@andsoshethinks.co.uk

www.andsoshethinks.co.uk

About Wysa 

Wysa is a global leader in AI-driven mental health support, available to individuals directly as well as through employer benefits programmes and healthcare services. We believe access to support should be available whenever people need it. Because stigma prevails, we take away the need for people to ask for help and to judge when they should seek professional support. Proven to improve depression and anxiety scores by an average of 31%, Wysa’s AI-first approach enables employees to improve their mental health before symptoms become severe, by understanding an individual’s needs and guiding them through interactive cognitive behavioural therapy (CBT) exercises. Wysa’s clinically safe AI encourages users to seek additional support whenever it’s needed, guiding them towards Wysa’s human coaching, employee assistance programmes (EAPs) or national crisis lines. Wysa has helped over 5 million people through 500 million AI conversations across 90 countries.