By Ramakant Vempati, Co-founder, Wysa
Over the last few years, mental health apps have filled a significant void in traditional therapy, making care accessible from a smartphone. Yet despite handling intimate information about sensitive issues such as depression, anxiety, suicidal thoughts and eating disorders, many mental health apps fall short on privacy and security.
Some mental health apps focus on serving young people, the demographic with the highest incidence of mental health issues and also one of the most vulnerable to privacy shortcomings. Any personal information young people share on these apps is often vulnerable to misuse and open to customised ad targeting, which may or may not be in their best interest.
To understand why health data has become so financially valuable, we need to look at the rise of digital healthcare. Its growth has been accompanied by an explosion in health data, which is valuable for medical research, breakthrough treatments, new drugs and personalised care. In fact, the WHO encourages sharing this data in the public interest, but only with the patient's consent, and in a manner that protects their privacy and prevents inappropriate use.
The Covid-19 pandemic has further accelerated this shift toward digital healthcare and demonstrated the importance of high-quality health data. The entry of big tech companies into digital healthcare has further raised alarms over the loss of privacy. In such a situation, it is of utmost importance that digital health platforms take responsibility for the rights of consumers and are transparent about the nature and scale of data sharing.
According to a recent report by the Mozilla Foundation, mental health apps are worse than any other app category when it comes to protecting people's privacy and security. Mozilla investigated 32 mental health and prayer apps and found that 28 of them failed to adequately protect consumer privacy, and 25 of them also failed to meet Mozilla's minimum security standards, such as requiring strong passwords and managing security updates and vulnerabilities. The report noted that Wysa was a rare exception that valued users' privacy.
Our approach to privacy is core to what we do, and covers four fundamental principles that build trust and protect our users:
Principle 1: Help the user stay anonymous
Principle 2: Be clear in how we communicate our policies
Principle 3: Minimise the data we need to store
Principle 4: Implement best-in-class standards to protect data
1. Help the user stay anonymous
When we launched Wysa on the app stores in 2017, we made a big decision: users would be able to use the service without having to disclose their identity. Over time, this decision has helped in many ways.
People who use mental health apps are looking for a safe space where they can discuss their mental health and wellbeing and receive support. Anonymity is key in building trust during digital therapy to enable people to share personal information while feeling safe and in control. A study from the Education Policy Institute has revealed that anonymity is a major factor in young people choosing online counselling, with many saying that they wanted to discuss their issues without parents or anyone else knowing.
We believe that anonymity in mental health and well-being creates a disinhibition effect – it helps users speak more freely about their concerns, without any stigma or fear of judgement.
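The design decision above, letting someone use the service without ever disclosing who they are, can be sketched in a few lines. This is a minimal illustration, not Wysa's actual implementation: the function and field names are hypothetical, and the key point is simply that a user record is keyed by a random identifier with no name, email or phone number ever requested.

```python
import uuid

def create_anonymous_user():
    """Create a user record keyed by a random identifier.

    No name, email, or phone number is requested or stored, so the
    record cannot be linked back to a real-world identity.
    """
    return {
        "user_id": uuid.uuid4().hex,  # random, unlinkable identifier
        "conversations": [],          # chat history, tied only to the random ID
    }

user = create_anonymous_user()
```

Because the identifier is random rather than derived from personal details, even a full copy of the record reveals nothing about who the person is.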
2. Be clear in how we communicate our policies
Research has also shown that young people are more likely to set personal growth goals during online sessions as compared to face-to-face therapy. Clearly, this can only happen effectively if they have the confidence that their conversations will be private.
3. Minimise the data we need to store
Most importantly, users can exercise the right to access and erase their data at any time. In fact, the app provides a "Reset my data" feature in its settings: tapping it permanently deletes all of the user's conversations with the app and resets the app to treat them as a completely new user.
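The erasure behaviour described above can be sketched as follows. This is an illustrative sketch assuming a simple in-memory store keyed by anonymous user IDs; the function name and data shapes are hypothetical, not Wysa's actual code.

```python
import uuid

def reset_my_data(store, user_id):
    """Permanently delete everything held for this user and return a
    fresh random ID, so the app treats them as a completely new user."""
    store.pop(user_id, None)           # erase all stored conversations
    new_id = uuid.uuid4().hex          # new identity, unlinkable to the old one
    store[new_id] = {"conversations": []}
    return new_id

# Usage: a user with existing conversations resets their data.
store = {"old123": {"conversations": ["hi", "hello"]}}
new_id = reset_my_data(store, "old123")
```

The important property is that the old identifier and its data are gone, and the new identifier cannot be linked back to the old one.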
4. Implement best-in-class standards to protect data
To ensure risk-based data protection and governance, we've implemented an organisation-wide Privacy Information Management System (PIMS) aligned to the ISO 27701:2019 standard and an Information Security Management System (ISMS) aligned to the ISO 27001:2013 standard. HIPAA and GDPR privacy and security principles and controls are integral to Wysa's PIMS and ISMS, as well as to our software development lifecycle. Our ISMS and PIMS practices are audited every year and have been certified by the BSI Group. This has allowed us to operationally achieve privacy and security by design and by default, an approach commended as industry-leading in the digital health space by the likes of the NHS and, as noted earlier, by independent consumer rights bodies such as the Mozilla Foundation.
With news of data harvesting and leaks on mental health apps raising concerns among consumers, digital mental healthcare providers need to be absolutely transparent about how their data is used. At Wysa, we believe that for digital mental healthcare to expand, it is more important than ever for companies to prioritise user privacy and security. This is core to our mission.