Challenges (& Solutions) to Scaling Mental Health Benefit Programs

How to successfully scale mental health benefit programs at your organization, without compromising on quality, cost, or employee privacy.

Innovations in digital health technology are challenging traditional bricks-and-mortar care on cost, accessibility, and quality through new data insights. Long-standing barriers to accessing care are at last being broken down by new virtual care approaches that provide a better experience for everyone involved. In mental health and well-being, however, organizations still face major obstacles when scaling programs that address the vastly varying levels of care employees need. This blog highlights how to successfully scale mental health benefit programs at your organization without compromising on quality, cost, or employee privacy.

Scalability

According to a recent report, there is a dangerous imbalance between provider resources and patient needs, with 77 percent of counties across the US facing severe shortages of behavioral health professionals. By 2025, expected demand will top 60,000 providers, leaving a shortfall of more than 15,000. The Health Resources and Services Administration reports that we “need to add 10,000 providers to each of seven separate mental healthcare professions by 2025 to meet the expected growth in demand.”

As new digital health companies emerge in the market, all showing promising ways to improve access to care, how can we meet consumer demand while ensuring the highest quality care and experience?  

Quality of Care

Theoretically speaking, conversational AI closes the gap in access to care for an entire employee population. For it to do so successfully, however, the AI must be powered by clinically validated protocols built on evidence-based psychological treatments such as Cognitive Behavioral Therapy (CBT). Our mental health ebbs and flows in response to the obstacles and challenges we face every day. Clinically driven AI that integrates with the resources your organization already has (EAPs, crisis lines, etc.) creates an intelligent pathway to guide users based on where they are in their journey.
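As an illustration of what that integration could look like, here is a minimal sketch of a resource registry an AI companion might draw on when deciding what to surface. The structure, field names, and availability notes are illustrative assumptions, not any vendor’s actual schema.

```python
# Illustrative only: how an organization's existing support resources might be
# registered so an AI well-being companion can surface the right one at the
# right moment. Contact details are placeholders, except 988, which is the
# real US Suicide & Crisis Lifeline number.
ORG_RESOURCES = {
    "self_help": {
        "label": "AI-guided self-help (meditations, CBT exercises)",
        "availability": "24/7, in-app",
    },
    "coaching": {
        "label": "Well-being coach sessions",
        "availability": "Bookable in-app",
    },
    "eap": {
        "label": "Employee Assistance Program (third party)",
        "contact": "+1-800-XXX-XXXX",  # placeholder
        "availability": "Callback within agreed SLA",
    },
    "crisis_line": {
        "label": "Emergency helpline",
        "contact": "988 (US Suicide & Crisis Lifeline)",
        "availability": "24/7",
    },
}
```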

Provide the right resources at the right time

Let’s take the example of an individual battling minor anxiety symptoms. They’ve struggled to fall asleep for the last two nights. Getting knots in their stomach. Tense shoulders. They don’t necessarily need a therapy appointment, but they might need to talk with their AI well-being companion and listen to a guided meditation to help reframe the causes of their stress. This is proactive mental healthcare. People with mild symptoms can build mental resilience skills that keep those symptoms, left unattended, from accumulating with other sources of anxiety and escalating into something more acute.

At the other end of the spectrum, consider an individual who is experiencing repeated and enduring insomnia, isolating tendencies, and struggling to manage day-to-day activities such as getting up in the morning for work. They clearly require a significantly higher level of intervention, but may not even be ready to acknowledge that need. In a traditional setting, this individual would need to recognize their needs, perhaps after a friend or coworker points out that it’s time to access support. They must then accept that need and request help through their benefits scheme or insurer to access the EAP. They may then wait days, or weeks, to speak to a therapist. As time passes, the risk of their symptoms deteriorating grows.

The new approach takes a whole-population view: well-being support for everyone. As well as removing the stigma of having to step forward and ask for help, the ‘digital front door’ approach offers automated triage, providing immediate access to suitable interventions through AI-guided self-help resources. For those who may not even realize their own level of need, mental health screening questions woven into the user’s conversations with the AI can automatically triage them towards human support as required: coach sessions, third-party EAP provision, or emergency helpline support should matters escalate quickly.
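To make the automated-triage idea concrete, here is a minimal sketch of how conversational screening could route a user to an appropriate level of support. The questionnaires named (GAD-7 and PHQ-9 are widely used anxiety and depression screeners), the score thresholds, and the resource labels are illustrative assumptions, not a description of any specific vendor’s triage logic.

```python
from dataclasses import dataclass
from enum import Enum

class CareLevel(Enum):
    SELF_HELP = "AI-guided self-help (meditations, CBT exercises)"
    COACH = "Human coach session"
    EAP = "Third-party EAP / therapist referral"
    CRISIS = "Emergency helpline"

@dataclass
class ScreeningResult:
    gad7_score: int    # 0-21, generalized anxiety screener
    phq9_score: int    # 0-27, depression screener
    crisis_flag: bool  # e.g. acute risk detected in conversation

def triage(result: ScreeningResult) -> CareLevel:
    """Route a user to a care level. Thresholds here are illustrative only;
    a real deployment would use clinically validated cut-offs and human review."""
    if result.crisis_flag:
        return CareLevel.CRISIS
    if result.phq9_score >= 20 or result.gad7_score >= 15:
        return CareLevel.EAP
    if result.phq9_score >= 10 or result.gad7_score >= 10:
        return CareLevel.COACH
    return CareLevel.SELF_HELP

# Example: the mild-anxiety case above stays in AI-guided self-help.
print(triage(ScreeningResult(gad7_score=6, phq9_score=4, crisis_flag=False)))
```

In practice, escalation paths such as the crisis flag would always be tied to clinically validated cut-offs and local emergency procedures rather than fixed score thresholds.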

These are just two examples, but each individual is unique and should have access to the right care, at the right time.

Access to care

There are many conversations happening throughout the mental healthcare industry around “access to care,” but what does this actually mean to you and your employees?  Here are three questions to ask yourself when evaluating if your people truly have access to the care they need:

  1. How quickly can employees connect to care? If employees are waiting weeks or even days to be connected with mental health resources, this is a barrier to care.
  2. Do they have 24/7 access to care? Can employees struggling with mental health in the middle of the night use their on-demand, integrated resources to get help? If they don’t have resources outside of standard business hours, this is a barrier to care.
  3. Do they have unlimited access to resources? Can they use any of these resources as much as they would like or need? If the number of sessions is capped under a per-employee-per-month (PEPM) contract, that’s a barrier to care.

Adoption and adherence

Research shows that digital mental health interventions are often associated with relatively poor adoption and adherence, for a multitude of reasons. One is poor engagement with digital tools, which can stem from an insufficient therapeutic alliance, bond, or trust. A growing number of studies support natural language understanding (NLU) based AI companions and interventions as an effective and feasible way to deliver mental well-being support to individuals with self-reported anxiety and depressive symptoms.

Adoption

In a peer-reviewed Wysa clinical study published in Frontiers, researchers examined users’ perspectives on the therapeutic bond with a conversational agent. The findings indicate that users can develop an emotional bond with a chatbot in much the same way people bond with a human therapist. The study supported the original theory that digital mental health interventions can establish therapeutic bonds, and that free-text conversations support a stronger bond, leading to higher adoption and adherence. This has enormous potential for scaling support to large populations, such as multinational companies, and even entire countries.

Privacy

Now, more than ever, it is critical to evaluate the privacy policies of mental health providers to protect your employees’ personal information. In a recent Mozilla report, Privacy Not Included, the team analyzed a range of popular mental health apps and found that a large percentage track, share, and capitalize on users’ most intimate personal information: moods, mental state, even biometric data. By design, these apps ask users to divulge sensitive issues such as mental health conditions, yet many collect large amounts of personal data under vague privacy policies.

Anonymity ensures true user privacy

In light of these findings, complete anonymity is the truest form of user privacy. When a provider does not collect personal information such as name or email address, conversations are secure, private, and completely anonymous; they cannot be tied back to an individual. Beyond privacy, anonymity creates a safe environment that diminishes stigma and encourages users to take the first step of their mental health journey, which is usually the hardest one.
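As a minimal sketch of anonymity by design, the snippet below keys a user record to a random token rather than to any personal information. The function name and record fields are assumptions for illustration, not any provider’s actual implementation.

```python
import secrets

def create_anonymous_user() -> dict:
    """Create a user record with no personally identifiable information.

    The only identifier is a random token, so conversations can be stored
    and resumed without ever being traceable back to a name, email, or
    employee ID.
    """
    return {
        "user_token": secrets.token_urlsafe(16),  # random, non-identifying handle
        # Deliberately absent: name, email, phone number, employee ID, IP address.
    }

user = create_anonymous_user()
print(user["user_token"])  # a random string, meaningless outside the app
```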

Considerations

New digital health technologies are challenging traditional care by reducing costs while improving access and quality. Mental health programs that support people with a wide range of needs are being scaled successfully through new virtual care approaches.
