While many people face mental health struggles, Gen Z seems to be experiencing them in greater numbers. According to McKinsey & Company’s 2022 American Opportunity Survey, Gen Z reported the highest incidence of mental health concerns.
Though more open than older generations about discussing topics like anxiety, depression, and trauma, Gen Z doesn’t necessarily have better access to care. Cultural, financial, and social barriers can still prevent many from seeking professional help. As demand for mental health services grows worldwide, a fresh alternative is emerging: AI-powered chatbot self-help.
AI-guided self-help could offer low-cost, accessible mental health support at scale, helping to fill gaps where traditional care falls short. The question remains, however, whether these tools can truly bridge the divide between the need for mental healthcare and access to it, and whether they can provide the same level of support as a human therapist. In this story, Wysa looks at where AI-powered self-help stands today, the impact it has on different communities, especially younger ones, and the limitations that still persist in the use of this technology.
The mental health landscape for Gen Z
Members of Gen Z (the generation born between the mid-1990s and early 2010s) often speak openly about mental health and use the internet to reach out to one another in times of crisis. Social media platforms are brimming with personal stories, DIY coping tips, and mental health advocacy from younger users. But while awareness of mental illness and the need to get help is high, access to human-driven care is not always available.
Unfortunately, one of the biggest roadblocks for people seeking treatment is the cost. Despite being an essential service, therapy often costs in excess of $100 per session in the U.S., even with insurance. For many in the Gen Z community, this expense is simply out of reach. According to the McKinsey report, 1 in 4 Gen Z respondents reported being unable to afford mental health care, making them the generation most likely to cite cost as a barrier.
This generation is also facing long waitlists for therapists due to a national shortage. Rural areas and underserved communities experience even less access to providers, and therapists who are culturally competent, trauma-informed, or gender-affirming are in short supply. Add in the stigma that still exists in some families and cultures, and the result is a large proportion of patients who are in need, but have nowhere to turn other than to their computers for help. That is where AI enters the equation.
The rise of AI chatbots
AI-powered mental health apps use conversational AI to provide engaging self-help support. These tools are becoming an appealing alternative to human help for people with manageable symptoms. They’re available anytime you need them, never get tired or impatient, and provide anonymity in a world where many feel they might be judged for their thoughts. Among people actively seeking therapy, Yahoo has reported that 21% are open to using AI platforms for their needs.
Why Gen Z is turning to AI for mental health
Over a third of Gen Z and millennials (36%) are interested in using AI for mental health support. Here are some of the reasons Gen Z is drawn to AI therapy bots:
- Financial accessibility. Traditional therapy is often not covered by insurance or comes with high out-of-pocket costs. Apps that offer free or low-cost plans make mental health support more affordable and accessible.
- Judgment-free conversations. AI doesn’t judge, interrupt, or stigmatize. This creates a safe space, especially valuable for users navigating shame, fear, or internalized stigma.
- Digital fluency. Having grown up with smartphones and smart assistants, Gen Z is naturally comfortable interacting with bots. Many even prefer text-based support over phone or in-person conversations.
- Loneliness and isolation. The COVID-19 pandemic intensified feelings of disconnection, grief, and anxiety. Although AI bots can’t replace human contact, the sense of companionship and routine they offer can ease emotional strain.
The benefits of AI chatbots for self-help
Beyond affordability, AI chatbots are available 24/7. No appointments, no time zones—just instant support, anytime. This kind of flexibility can be a lifesaver during nighttime anxiety attacks or sudden emotional dips. Other top benefits include:
- Personalization through data. Most platforms use user input and past interactions to create personalized support and guidance. The more you use the app, the more tailored the responses become.
- Clinically informed tools. Many apps use structured self-help modules based on CBT, DBT, ACT, or mindfulness.
- Anonymity. You don’t need to share your real name or story. This anonymity makes it easier for users to speak freely and honestly.
- Consistency. Unlike human therapists who take vacations or change practices, AI is always present and predictable.
The limitations of AI chatbots
But AI for mental health is not a silver bullet. There are important limitations and risks to consider, such as:
- No real empathy. While some AI can mimic empathy, it doesn’t truly understand human suffering. For someone experiencing deep trauma, a bot’s responses may feel hollow or inadequate.
- Limited in crisis situations. AI chatbots aren’t equipped to handle suicidal ideation, psychosis, or abuse. These situations require human intervention, because a delayed or inadequate response can be life-threatening. Any chatbot should redirect users in crisis to human support.
- Ethical and legal issues. Some platforms may store or share user data. It’s important that any mental health app comply with regulations like HIPAA in the U.S. or GDPR in Europe. A breach could expose sensitive information to parties who don’t have your best interests at heart, so consider how your data might be used before starting AI-guided therapy.
- Overreliance and false security. Users might decide that AI is “good enough” and put off seeking help from a human, even in times of crisis when professional care is essential.
- Risk of misdiagnosis or harmful advice. AI is not a licensed clinician and may misread symptoms or offer unsuitable suggestions. In some cases, this could lead to confusion, delayed care, or even harm.
What the science says
So, are AI chatbots actually effective for self-help? Emerging research suggests they can be, within limits. A study by Dartmouth College found that participants using an AI therapy chatbot showed significant reductions in depression and anxiety. Wysa has also published case studies demonstrating the effectiveness of its tools in reducing symptoms in users.
Who should (and shouldn’t) use AI for mental health?
AI support may be suitable for:
- Individuals with mild symptoms of stress and low mood
- Those exploring self-help strategies before seeking therapy
- Users seeking mood journaling or daily check-ins
- People looking for interim support while waiting for traditional therapy
AI support is not suitable for:
- Individuals with suicidal thoughts or self-harming behaviors
- People with complex trauma or PTSD
- Users in emotionally or physically abusive environments
- Those with psychosis or other severe psychiatric conditions
Tips for choosing the right AI-guided app
If you’re considering trying AI for your mental health, follow these best practices:
- Choose platforms that are designed for mental health: Don’t just use any AI chatbot. Ensure the app is built in consultation with licensed therapists or backed by peer-reviewed research.
- Review their privacy policies: Look for data encryption, user control over personal data, and compliance with privacy regulations.
- Check for crisis protocols: The app should clearly direct users to appropriate emergency services if needed.
- Look at user reviews: Positive reviews and regular updates indicate active development and a good user experience.
- Start small: Try a free version or a limited trial before committing to a paid plan. Make sure the tone and content resonate with you.
The future of AI in mental health
AI is not a replacement for human therapists, but it can help fill critical gaps in access. As the technology evolves, we can expect to see it take on a more integrated role in mental healthcare.
For example, one likely direction is the rise of hybrid models, which combine AI with human oversight. In these systems, chatbots could handle routine support and check-ins, while human therapists step in for more complex guidance. This kind of blended approach offers the constant availability of AI with the nuance and empathy only a person can provide.
AI tools could also get much better at picking up on cultural cues. Future versions might adapt to a user’s background, language, and values, making chatbots more inclusive and useful, especially for people whom traditional care has not served well.
AI support will likely become more embedded in everyday life. Expect to see these tools integrated into university counseling programs, employee wellness benefits, and even wearable health devices. With developments in natural language processing and emotional intelligence, future AI may one day be able to detect subtle shifts in tone, speech, or behavior. This could mean more timely interventions and customized support that adjusts in real time to the user’s emotional state.
The road ahead
Gen Z’s mental health challenges won’t be solved by quick fixes. Traditional care is often too expensive or too hard to reach. AI-guided self-help offers a scalable option that fits how this generation talks, copes with stress, and builds connections.
It isn’t perfect. AI can’t replace the human warmth of a good therapist, but it can provide immediate relief, build resilience skills, and reduce stigma—all critical first steps on the road to healing. For millions of young people, mental health AI chatbots could mean the difference between suffering in silence and finally being heard.