Data not for sale – How Wysa is setting a benchmark for privacy in digital mental healthcare



By Ramakant Vempati, Co-founder, Wysa

Over the last few years, mental health apps have filled a significant gap left by traditional therapy, making care accessible from one's smartphone. Yet despite handling intimate information about sensitive issues such as depression, anxiety, suicidal thoughts and eating disorders, many mental health apps fall short when it comes to privacy and security.

Some mental health apps focus on serving young people, the demographic with the highest incidence of mental health issues and one of the most vulnerable to privacy shortcomings. Any personal information they share on these apps is often vulnerable to misuse and open to customised ad targeting, which may or may not be in their best interest.

To understand why health data has become so financially valuable, we need to look at the rise of digital healthcare, whose growth has been accompanied by an explosion in health data that is valuable for medical research, breakthrough treatments, new drugs and personalised care. In fact, the WHO encourages sharing this data in the public interest, but only with the patient's consent and in a manner that protects their privacy and prevents inappropriate use.

The Covid-19 pandemic has further accelerated this shift toward digital healthcare and demonstrated the importance of high-quality health data. The entry of big tech companies into digital healthcare has further raised alarms over the loss of privacy. In such a situation, it is of utmost importance that digital health platforms take responsibility for the rights of consumers and are transparent about the nature and scale of data sharing.

According to a recent report by the Mozilla Foundation, mental health apps are worse than any other app category when it comes to protecting people's privacy and security. Mozilla investigated 32 mental health and prayer apps and found that 28 of them failed to adequately protect consumer privacy, and 25 of them also failed to meet Mozilla's minimum security standards, such as requiring strong passwords and managing security updates and vulnerabilities. The report noted that Wysa was a rare exception that valued users' privacy.

Our approach to privacy is core to what we do, and covers four fundamental principles that build trust and protect our users: 

Principle 1: Help the user stay anonymous 

Principle 2: Be clear in how we communicate our policies

Principle 3: Minimise the data we need to store

Principle 4: Implement best-in-class standards to protect data 

1. Help the user stay anonymous

When we launched Wysa on the app stores in 2017, we made a key decision: users would be able to use the service without having to disclose their identity. Over time, this decision has helped in many ways.

People who use mental health apps are looking for a safe space where they can discuss their mental health and wellbeing and receive support. Anonymity is key in building trust during digital therapy to enable people to share personal information while feeling safe and in control. A study from the Education Policy Institute has revealed that anonymity is a major factor in young people choosing online counselling, with many saying that they wanted to discuss their issues without parents or anyone else knowing. 

We believe that anonymity in mental health and well-being creates a disinhibition effect – it helps users speak more freely about their concerns, without any stigma or fear of judgement.  

2. Be clear in how we communicate our policies

Research has also shown that young people are more likely to set personal growth goals during online sessions compared with face-to-face therapy. Clearly, this can only happen effectively if they have the confidence that their conversations will be private.

To ensure this, it is important for mental health apps to display their privacy policy upfront, rather than burying it deep inside the app. It should also be explained in simple, jargon-free language. Wysa focuses on transparency to enable the users to make an informed judgement about the app. 

Wysa believes that privacy in healthcare is not a single feature but a combination of factors. Since many users may not have the patience to read through the entire policy, we ensure that key messages are displayed upfront. These are also repeated in the terms of service. In addition, privacy notices are accessible from the app store and from in-app settings. We also have a detailed FAQ section answering important questions. Our privacy policy is rigorously tested for readability; at present, it can be read by users as young as 15 years old. Access the Wysa privacy policy.

3. Minimise the data we need to store 

Our privacy policy blends in elements of user safety and security, along with informing users about how we handle and use their data. If users inadvertently share any personal identifiers during their conversations, such as their name, email, location or mobile number, the app irreversibly hashes those identifiers within 24 hours. All user conversations are strongly encrypted, and access is highly restricted. No personal data is ever sold to advertisers.
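To illustrate what "irreversible hashing" of inadvertently shared identifiers can look like, here is a minimal sketch. This is not Wysa's actual implementation: the regex patterns, the keyed-hash (HMAC-SHA-256) construction, and the function names are all assumptions for illustration.

```python
import hmac
import hashlib
import re

# Hypothetical server-side secret; a real system would keep this in a
# secrets manager, never in source code.
PEPPER = b"example-secret-key"

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s-]{7,}\d")

def irreversible_hash(identifier: str) -> str:
    """One-way keyed hash: the original value cannot be recovered."""
    return hmac.new(PEPPER, identifier.lower().encode(), hashlib.sha256).hexdigest()

def redact_identifiers(message: str) -> str:
    """Replace detected phone numbers and emails with truncated hashes.

    Phone numbers are handled first so the digit pattern never matches
    inside an already-substituted hash token.
    """
    for pattern in (PHONE_RE, EMAIL_RE):
        message = pattern.sub(
            lambda m: f"<hashed:{irreversible_hash(m.group())[:12]}>", message
        )
    return message
```

Because the hash is deterministic, the same identifier always maps to the same token, so conversations remain analysable in aggregate while the original value is unrecoverable without the key.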

Most importantly, users can exercise the right to access and erase their data at any time. In fact, the app provides a "Reset my data" feature in its settings. Tapping it permanently deletes all of a user's conversations with the app and resets the app to treat them as a completely new user.

Users who look at our privacy policy will notice that Wysa only shares aggregated, de-identified data for analytics and to help improve its AI to provide safe, empathetic and secure conversations. In our experience, it is still possible to do a detailed analysis on efficacy without knowing the identity of the user.
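As a sketch of how efficacy analysis can work without knowing who any user is, consider records keyed only by coarse, de-identified fields. The schema and function below are hypothetical, not Wysa's actual analytics pipeline:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical de-identified records: no names, emails or device IDs,
# only a cohort label and pre/post wellbeing scores.
sessions = [
    {"cohort": "week1", "pre_score": 14, "post_score": 9},
    {"cohort": "week1", "pre_score": 18, "post_score": 12},
    {"cohort": "week2", "pre_score": 11, "post_score": 10},
]

def cohort_improvement(records):
    """Average pre-to-post score improvement per cohort.

    Works entirely on aggregated, de-identified data: no record can be
    traced back to an individual user.
    """
    by_cohort = defaultdict(list)
    for r in records:
        by_cohort[r["cohort"]].append(r["pre_score"] - r["post_score"])
    return {cohort: mean(deltas) for cohort, deltas in by_cohort.items()}
```

The point of the sketch is simply that outcome metrics can be computed from cohort-level aggregates, so identity is never needed for the analysis.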

4. Implement best-in-class standards to protect data 

To ensure risk-based data protection and governance, we've implemented an organisation-wide Privacy Information Management System (PIMS) aligned to the ISO 27701:2019 standard and an Information Security Management System (ISMS) aligned to the ISO 27001:2013 standard. HIPAA and GDPR privacy and security principles and controls are integral to Wysa's PIMS and ISMS, as well as to our software development lifecycle. Our ISMS and PIMS practices are audited every year and have been certified by the BSI Group. This has allowed us to achieve privacy and security by design and by default, an approach commended as industry-leading in the digital health space by the likes of the NHS and, as noted earlier, by independent consumer rights bodies such as the Mozilla Foundation.

With news of data harvesting and leaks on mental health apps raising concerns among consumers, digital mental healthcare providers need to be absolutely transparent about how their data is used. At Wysa, we believe that for digital mental healthcare to expand, it is more important than ever for companies to prioritise user privacy and security. This is core to our mission.
