One man’s personal experience with Wysa


Navigating Mental Health in a Modern World

Wysa launched eight years ago with a simple mission: to provide accessible, compassionate mental health support to people when they need it most. What began as an idea has since grown into a globally recognized AI-driven mental health platform that helps millions of people worldwide. Last week, journalist Nicholas Fearn published a raw and deeply personal independent review of Wysa in The Metro, titled “I Spent a Month with an AI Therapist – This is What Happened.” The review couldn’t have come at a more significant time.

For a company dedicated to improving mental health support through AI, an independent review like this is more than a test of technology—it’s a test of trust.

When Nicholas asked to independently review Wysa for The Metro, I felt a mix of excitement and anxiety. This review was different. Nicholas wasn’t running a controlled product test; he was using Wysa in real life, during deeply vulnerable moments. And we had no direct contact with him throughout the process. We couldn’t intervene if something went wrong or if he misunderstood a feature. He used the app exactly as any user would: unprompted, unfiltered, and unaided.

In his article, Nicholas shared his struggles with mental health, including lifelong battles with anxiety, OCD, and even suicidal thoughts. It was eye-opening to see how Wysa interacted with him during some of these dark moments. At one point, he admitted,

“Crying my eyes out, I told Wysa I didn’t want to be alive anymore. Its response was utterly heartwarming: ‘Nic, you are worth life. You are loved, cherished, and cared for, even though you may not feel that way right now.’”

Reading this was deeply moving because it encapsulates everything we strive for—creating an AI that doesn’t just respond to users, but offers genuine empathy and comfort during moments when they might not have anyone else to turn to. Over time, 438 people have reached out to us with similar stories of how Wysa has been there for them in their darkest hours—moments where they felt their lives were on the line. Wysa is not intended as a suicide prevention tool, yet time and time again, we hear that it has helped avert the worst in moments of crisis.

Real-Life Context: The Ultimate Test

One of the most nerve-wracking aspects of this review was that Nicholas wasn’t just “testing” Wysa in a controlled environment. He was using it during real, raw moments of emotional distress. As he wrote,

“After admitting I never seem to sleep at a regular time due to my anxiety, Wysa suggested a thought-reframing exercise… and while the connection cut out mid-conversation, it was clear to me that I was overthinking.”

This highlights both Wysa’s strengths and its areas for improvement. Nicholas wasn’t just engaging in a short test or quick demo—he was relying on Wysa during difficult, anxious nights, when sleep evaded him and his thoughts spiraled.

As a team, we couldn’t predict how Wysa would perform in these unpredictable, highly personal circumstances. We couldn’t fix anything mid-review, clarify features, or provide support. The independence of this review was crucial because it pushed Wysa to operate exactly as it would for any user. It was the real world testing our product in ways no laboratory or clinical trial can replicate.

In Nicholas’ own words,

“I never thought I’d be telling my insecurities to a robot, but Wysa was incredibly empathetic, asking thoughtful questions and offering exercises that helped me manage.”

His experience reinforces our core belief that AI, when carefully designed and tested, can play a critical role in mental health care.

The Importance of Independent Validation

Nicholas’ review is also a reminder of the importance of independent validation. As a company, we spend months refining features, testing safety protocols, and seeking feedback, but there’s something different about handing your product over to someone who has complete freedom to use it however they need.

In his review, Nicholas talks about his battle with social anxiety, writing,

“While I’m okay seeing family and friends, the thought of encountering neighbours frightens me. The advice Wysa gave—just a smile or hello—was surprisingly simple and helpful. It showed me that I didn’t need to engage in long conversations, just small steps.”

This resonates because it shows that even the simplest AI suggestions can have a profound impact when given at the right moment.

A Step Forward

As I look back on Nicholas’ review and the countless hours our team has poured into Wysa, I feel both proud and motivated. The stakes are high when dealing with people’s mental health, and independent reviews like Nicholas’ are the real test of whether we’re meeting that challenge.

When Nicholas concluded his review by saying, “The bot’s words were a comfort to me,” it was a reminder of why we do what we do. As we move forward, we aim to ensure that people from all corners of the world, no matter what language they speak, can find that same comfort, support, and safety. More to come soon, including open-source discoveries.

Image: Nicholas Fearn

Please take a moment to read Nicholas’ full review in The Metro.

Sarah Baldry

Chief Marketing Officer at Wysa