Mental health apps are under scrutiny – and it's about time
Dr. Liza Jachens
15 February 2024
Contents
- Digital mental health is under fire
- How Unmind addresses ethical concerns
- 1. Do no harm: How do we protect our users from harm?
- 2. Confidentiality: How do we keep your data and privacy safe?
- 3. Autonomy: How do our users make informed decisions?
- 4. Helpfulness: How do we know the platform is effective?
- 5. Accessibility: How do we make our platform equitable?
- Conclusion
The shift towards digital mental health care has benefited many, but it has also raised important ethical questions. Unmind is dedicated to leading the way in ethical practice, ensuring our services protect privacy, demonstrate efficacy, and actively contribute to users' wellbeing.
Digital mental health is under fire
There are growing concerns surrounding digital mental health platforms like BetterHelp. These platforms, which became increasingly relevant during the pandemic, aim to provide easier access to therapy and mental health services. However, they have been scrutinized for how they handle sensitive user data.
BetterHelp, specifically, was fined $7.8 million by the US Federal Trade Commission for misleading consumers about the privacy of their data. This incident reflects broader concerns in the mental health app industry, where privacy violations are common. The Mozilla Foundation found that many of these apps fail to protect user privacy, often monetizing personal mental health struggles.
Critics also highlight the limitations of current regulations. In the US, for instance, HIPAA does not always protect communications on digital platforms. This lack of regulation has led to a proliferation of apps with unproven therapeutic benefits, risking harm to users. For example, one study of a digital mental health tool for at-risk patients found increased self-harm among users.
The US and UK have begun exploring ways to regulate these tools, focusing on safety and efficacy. There's a call for more transparency in how these apps operate and market themselves. The overarching concern is that while these apps could significantly aid mental healthcare, they require strict regulation and oversight to ensure they are safe and effective.
How Unmind addresses ethical concerns
At Unmind, we're acutely aware of the ethical implications of digital mental health services. Our approach is grounded in five core ethical principles: do no harm, confidentiality, autonomy, helpfulness, and accessibility. Below is a snapshot of how we apply each one. These principles align with the World Health Organization's guidance on the Ethics & Governance of Artificial Intelligence for Health, a set of guidelines designed to ensure that AI in healthcare realizes its potential while holding those who build it accountable.
1. Do no harm: How do we protect our users from harm?
Mental health apps must be designed and tested to ensure they do no harm. This involves rigorous validation of the app’s therapeutic content, safeguarding against inaccurate information or advice, and preventing features that could inadvertently exacerbate users’ mental health issues.
Scientific reviews of apps for mental health have called for further validation studies to help improve the quality and reliability of interventions. We know that sometimes well-intentioned interventions or apps can be ineffective or even harmful for some people.
WHAT WE DO
✔ In our research we report the percentage of users whose scores worsened, so we capture negative effects and develop our products responsibly.
✔ We have implemented a robust reporting system within the app where users can report concerns or adverse effects directly to developers for quick action.
✔ We seek to identify and mitigate any features that could potentially cause harm, such as triggering content, and ensure we follow the evidence base.
✔ We also provide extended access to the platform after an employee leaves their organization. Transitioning out of a business can be a difficult time, so we ensure a degree of continuity for them through that period.
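The deterioration rate mentioned above is, at its simplest, a paired pre/post comparison. Here is a minimal sketch of that idea; the scores, the 5-point threshold, and the function name are illustrative assumptions, not Unmind's actual methodology:

```python
# Compute the percentage of users whose wellbeing worsened between a
# baseline and a follow-up assessment (higher score = better wellbeing).
# The threshold for a meaningful drop is a hypothetical placeholder.

def deterioration_rate(baseline, follow_up, threshold=5):
    """Share of users whose score dropped by more than `threshold` points."""
    if len(baseline) != len(follow_up):
        raise ValueError("paired pre/post samples required")
    worsened = sum(1 for pre, post in zip(baseline, follow_up)
                   if pre - post > threshold)
    return 100.0 * worsened / len(baseline)

# Illustrative data: two of six users drop by more than 5 points.
pre_scores  = [72, 65, 80, 58, 90, 61]
post_scores = [75, 52, 82, 60, 70, 63]
print(f"{deterioration_rate(pre_scores, post_scores):.1f}% of users got worse")
```

Reporting this figure alongside improvement rates is what keeps "do no harm" measurable rather than aspirational.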
2. Confidentiality: How do we keep your data and privacy safe?
Mental health platforms or apps often handle sensitive personal information, making confidentiality paramount. Users’ data should be encrypted, securely stored, and shared only with explicit consent for specified purposes, such as improving the app or direct care. Providers must be transparent about data handling practices and comply with relevant data protection laws.
WHAT WE DO
✔ We employ industry-standard encryption methods for storing and transmitting user data to protect against unauthorized access.
✔ We are transparent with users about data handling practices by clearly explaining what data is collected, why, and who has access to it, in a straightforward and understandable way.
✔ We offer users the option to opt out of research and marketing notifications without losing access to the app's main features.
3. Autonomy: How do our users make informed decisions?
Mental health apps should support user autonomy by providing clear, accessible information about the app’s functions, data usage, and the nature of any therapeutic interventions offered. Users should be able to make informed decisions about whether to use the app, select preferences for how their data is used, and opt in or out of specific features without coercion.
WHAT WE DO
✔ We have implemented interactive consent processes where users can easily understand the scope of the app’s functions, the nature of the data collected, and how it will be used.
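A consent process like the one described above typically stores each decision as a scoped, independently revocable record. The sketch below illustrates the pattern under stated assumptions: the scope names, class, and fields are hypothetical, not Unmind's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical consent scopes a user can grant or revoke independently.
SCOPES = {"core_features", "research", "marketing"}

@dataclass
class ConsentRecord:
    """One user's current consent decisions; optional scopes are off by default."""
    user_id: str
    granted: set = field(default_factory=set)
    updated_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def grant(self, scope: str) -> None:
        if scope not in SCOPES:
            raise ValueError(f"unknown scope: {scope}")
        self.granted.add(scope)
        self.updated_at = datetime.now(timezone.utc)

    def revoke(self, scope: str) -> None:
        # Revoking "research" or "marketing" never removes core access.
        self.granted.discard(scope)
        self.updated_at = datetime.now(timezone.utc)

    def allows(self, scope: str) -> bool:
        return scope in self.granted

record = ConsentRecord(user_id="u-123", granted={"core_features"})
record.grant("research")
record.revoke("research")
print(record.allows("research"), record.allows("core_features"))  # False True
```

Keeping each scope separate is what lets a user withdraw from research without coercion, since core access never depends on the optional scopes.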
4. Helpfulness: How do we know the platform is effective?
The development of digital health platforms must prioritize user benefit, aiming to enhance wellbeing, provide effective support or treatment options, and improve users' quality of life. Developers should use evidence-based practices where possible and ensure the content is updated according to the latest mental health research to maximize user benefits.
WHAT WE DO
✔ We conduct research that is peer reviewed and published in academic journals to assess the efficacy and validity of our tools. See our publications.
✔ We use measures that track both wellbeing and helpfulness to evaluate the impact of our interventions. We share these results with our users and use them to recommend appropriate content or services.
✔ We incorporate evidence-based practices into the app’s design, features and content, ensuring it has a strong research foundation.
✔ We regularly update the app based on the latest mental health research and user feedback to improve effectiveness and usefulness.
✔ We have an in-house science team that consists of clinical, organizational, and counseling psychologists who develop content, interventions and safeguards in line with best evidence-based practices.
5. Accessibility: How do we make our platform equitable?
Mental health apps should have content that is appropriate and accessible for diverse populations. This involves considering the specific needs of different groups to provide appropriate support or resources.
WHAT WE DO
✔ We conduct user research to better understand our global audience. This feedback informs our content, ensuring it inclusively reflects our diverse users. We recently published an in-depth scientific study of the international validity of our Wellbeing Tracker for UK/ANZ/US territories.
✔ Our content and events, emphasizing diversity and inclusivity, cover everything from intersectionality and neurodiversity to life stages and events like menopause and parenting.
✔ We aim for at least 50% BIPOC representation in new releases and feature inspirational quotes from diverse voices, ensuring all users feel seen and supported. Our Help Center features a variety of resources specifically for BIPOC populations in the UK and US.
✔ Our platform is available in many of the world's most widely spoken languages, including English, Chinese, and Latin American Spanish.
Conclusion
Criticism of mental health platforms and apps mostly targets ethical shortcomings. We believe in the power of technology in this space when it's done right, and we therefore welcome closer examination of these platforms.
As Unmind grows, our commitment to ethical integrity remains unwavering. By embedding the principles of non-maleficence, confidentiality, autonomy, helpfulness, and accessibility into our operations, we aim not only to lead by example but to elevate the standards of digital mental health care.
As we move forward, Unmind invites feedback and dialogue from our community, reinforcing our dedication to improving mental health support in an ethical, effective, and inclusive way.