Between you, your therapist, and third-party apps
Imagine that you decide to try therapy. Your job is stressful, your family life unsatisfying, and you want to brainstorm with an impartial listener. The convenience of remote therapy is appealing, so you register for a therapy website. Within an hour of registration, you get a message: "Hi—my name is So-and-so. I'm your therapist and I'm here to help you. Let me know when you want to talk!" An hour later So-and-so follows up: "Just checking in. I'm here for you if you want to talk!"
Except they aren’t. Both messages come from the therapist’s profile and carry his signature, but they are automated messages pushed out by the website. The real therapist may not be available for hours. Unaware of the deception, you use the app’s chat feature to write about your work problems and depression, finishing the message with an exasperated remark that you might just quit. Two days later you log into Facebook and see an ad for a job search platform. Is this what you expected?
Teletherapy is growing in popularity, driven by the COVID-19 pandemic and by the convenience of therapy in your pocket, on your schedule, at a fraction of the cost [1]. The number of mental health apps in app stores is steadily growing, with nearly 20,000 available in 2020 [1]. The pandemic also brought a dramatic influx of first-time users: the top 20 mental wellness apps saw four million first-time downloads in April 2020 [2], and the popular therapy app Talkspace saw its user base double between mid-March and May 1, 2020 [2]. On its face, this is a positive trend, a sign of growing interest in taking care of one’s mental health. Research indicates that remote therapy can be as effective as in-person sessions, but little is known about the efficacy and impact of mental health apps like Talkspace and BetterHelp [3]. Little research has examined their impact specifically, and what has been published was conducted by the apps’ own developers [4]. One thing is clear: app-based therapy combines the challenges of both teletherapy and mobile applications while being subject to few of the regulations of either.
Mental health and therapy apps currently fall outside the declared jurisdictions of both the FDA and HIPAA. The FDA does not regulate therapy apps that simply allow patients and healthcare providers to interact, because such apps are not considered medical devices [5]. HIPAA protects sensitive patient health information from being disclosed without the patient’s consent or knowledge, but because mental health and therapy apps are developed outside traditional healthcare providers, they fall outside HIPAA’s scope of authority [8]. The Federal Trade Commission (FTC) is the only entity that may currently have jurisdiction over mental health and therapy apps, under its mandate to protect consumers from “unfair or deceptive acts or practices” [9]. But any FTC action would be a response to existing deceptive apps rather than a proactive establishment of guidelines. Because the technology has advanced more rapidly than the corresponding regulatory standards [10], developers of mental health and therapy apps are left to self-regulate.
The opening story is not far-fetched. Mobile therapy app developers bring a different set of priorities that do not always align with the interests of the client or with best therapeutic practices. Developers are not bound by the ethical standards of the mental health profession, so they build applications with auto-scheduled messages that impersonate the therapist or share client information with third parties. They rarely follow credible treatment processes: fewer than 5 percent of mental health apps are research-based [11]. A survey of 136 websites offering counseling through computer chat rooms and e-mail revealed low levels of compliance with ethical standards [12]. These apps serve clients experiencing anxiety, depression, and suicidal thoughts, yet often omit information essential to those clients’ safety. A recent study found that mental health and therapy apps downloaded more than two million times in total either failed to display suicide hotline numbers or displayed incorrect ones [13]. For a user experiencing suicidal ideation or self-injurious behavior, the lack of accurate information, or of a reminder to seek help from a mental health professional or call 911, can have devastating consequences in an emergency. Under current regulatory regimes, the app developers would not be liable [14].
The lack of regulation lets therapy platform developers set their own standards for handling client care, and many default to the tech industry’s standard operating procedure: selling data. A therapist who sold sensitive client health data would immediately lose their license; no such rules exist for mental health platforms like BetterHelp [16]. BetterHelp, the #1 therapy app in the Apple App Store [17], promises to make professional therapy “accessible, affordable, and convenient” [18]. But that affordability is subsidized by sharing client data with “vendors and service providers, … analytics and advertising providers, and vendors providing technology services and support, payment processing, and data security” [32].
To understand the scale of this data sharing, two journalists from Jezebel.com used BetterHelp and tracked their data’s movement. They found that Facebook knew “how often [the authors] were going to a session and when we booked our appointments” as well as “what time of day we were going to therapy, our approximate location, and how long we were chatting on the app” [20]. The actual content of the sessions was not available to Facebook, but data on how often and when a user logs into BetterHelp, how many therapy sessions they attend, and how often they message the therapist was all handed over [20]. Sharing this metadata still breaks client confidentiality.
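To make the mechanism concrete, here is a minimal sketch of the kind of analytics event such an app could transmit. Everything in it is hypothetical: the endpoint, field names, and helper function are illustrative, not BetterHelp’s or Facebook’s actual code. The point is that the payload carries exactly the metadata the journalists observed (timing, frequency, approximate location) while the therapy messages themselves never appear in it.

```python
import json
import time

# Hypothetical analytics endpoint -- a stand-in for any embedded
# third-party advertising/analytics SDK, not BetterHelp's or
# Facebook's actual API.
ANALYTICS_URL = "https://analytics.example.com/events"

def build_event(event_name, user_id, city):
    """Build the metadata payload a tracking SDK would POST to ANALYTICS_URL."""
    return {
        "event": event_name,            # e.g. "session_booked", "chat_opened"
        "user_id": user_id,             # pseudonymous, yet linkable to an ad profile
        "timestamp": int(time.time()),  # reveals *when* the user goes to therapy
        "approx_location": city,        # coarse location, often inferred from IP
        # Note what is absent: the text of any therapy message or session.
    }

# Two events are enough for an advertiser to infer that user "abc123"
# attends therapy, and when -- confidentiality leaks through metadata alone.
for name in ("session_booked", "chat_opened"):
    print(json.dumps(build_event(name, user_id="abc123", city="Boston")))
```

Even this toy payload shows why “we never share session content” is cold comfort: the pattern of events alone identifies a person as a therapy client.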
BetterHelp is not the only app selling client data. App developers provide minimal to no information about how personal data is used and shared with third parties [2]. Yet a 2019 study found that more than 80% of top-ranked apps for depression and smoking cessation collected and sent data to Google and Facebook, “often without disclosing it in their privacy policies” [2], [21]. Companies like BetterHelp that do disclose the possibility of sharing data with third parties often hide that information deep in the fine print; in BetterHelp’s case, “the disclosures are buried in the seventh section of the app’s privacy policy” [21]. The disclosures are also typically written in legalese, at a reading level too high for most users to follow [3]. And most users do not read the disclaimers at all [22]; even someone who could understand the policies would need an estimated 201 hours to read all the privacy policies encountered on websites in a single year [16].
Mental health apps can sell personal health information without repercussions only because no regulatory body oversees the field. The fact that platforms like BetterHelp can build revenue streams on personal data should push regulators to catch up with this fast-moving marketplace [21]. We as consumers should petition the FDA to take three immediate steps toward regulating mental health and therapy apps.
1. Update regulations to include apps and platforms that act as liaisons between therapists and clients. This shift in definitions would ensure that mental health platforms are required to keep sensitive information confidential and adhere to HIPAA standards. This would also require most apps to be reviewed and approved by licensed mental health professionals (Whitcomb, 2021).
2. Create a precedent that requires mental health app developers to be accountable for the impact of their products. Companies creating mental health apps should provide resources outlining the evaluation methods or efficacy testing of the apps as well as information about the developers’ credentials and professional affiliations.
3. Mandate transparency about third-party disclosures. Mental health and therapy apps should be required to alert users to the potential collection of private information and resale of data to third parties. Disclaimers should be clear, concise, and in a language the user can understand [23].
The growing popularity of mental health and therapy apps is a potentially encouraging sign that the public is starting to take mental health seriously. These services could radically improve many lives by democratizing therapy and increasing its availability [16], which makes it all the more important to protect the clients who use them. The lack of oversight becomes more worrying as the apps’ popularity grows. The suggestions outlined above are intended as a starting point for the conversation. Increased regulation will likely shut down or push out some apps that cannot meet the new standards. But in an industry where fewer than five percent of mental health apps are research-based [11], a bit of trimming could be a good thing.
References
[1] S. Pappas, "Providing care in innovative ways," 1 January 2020. [Online]. Available: https://www.apa.org/monitor/2020/01/cover-trends-innovative-ways.
[2] K. Herzog, "Mental health apps draw wave of new users as experts call for more oversight," 24 May 2020. [Online]. Available: https://www.cnbc.com/2020/05/24/mental-health-apps-draw-wave-of-users-as-experts-call-for-oversight.html.
[3] K. Huckvale, J. Nicholas, J. Torous and M. E. Larsen, "Smartphone apps for the treatment of mental health conditions: status and considerations," Current Opinion in Psychology, pp. 65-70, 2020.
[4] J. M. Marshall, D. A. Dunstan and W. Bartik, "Clinical or gimmickal: The use and effectiveness of mobile mental health apps for treating anxiety and depression," Australian & New Zealand Journal of Psychiatry, 2019.
[5] FDA, "Examples of Mobile Apps That Are NOT Medical Devices," 24 July 2018. [Online]. Available: https://www.fda.gov/medical-devices/device-software-functions-including-mobile-medical-applications/examples-mobile-apps-are-not-medical-devices.
[6] US Department of Health and Human Services, "Health Insurance Portability and Accountability Act of 1996 (HIPAA)," [Online]. Available: https://www.cdc.gov/phlp/publications/topic/hipaa.html.
[7] U.S. Department of Health and Human Services, "Examining oversight of the privacy & security of health data collected by entities not regulated by HIPAA," 2016. [Online]. Available: https://www.healthit.gov/sites/default/files/non-covered_entities_report_june_17_2016.pdf.
[8] N. Terry and T. Gunter, "Regulating mobile mental health apps," Behavioral Sciences & the Law, 2018.
[9] Federal Trade Commission, ".com Disclosures: How to Make Effective Disclosures in Digital Advertising," Federal Trade Commission, 2013.
[10] J. Comer, "Introduction to the Special Series: Applying New Technologies to Extend the Scope and Accessibility of Mental Health Care," Cognitive and Behavioral Practice, vol. 22, no. 3, pp. 253-257, August 2015.
[11] R. Clay, "Using apps with your patients," April 2020. [Online]. Available: https://www.apa.org/monitor/2020/04/career-using-apps.
[12] K. T. Heinlen, E. R. Welfel, E. N. Richmond and C. F. Rak, "The Scope of WebCounseling: A Survey of Services and Compliance with NBCC Standards for the Ethical Practice of WebCounseling," 23 December 2011. [Online]. Available: https://doi.org/10.1002/j.1556-6678.2003.tb00226.x.
[13] L. Martinengo et al., "Suicide prevention and depression apps' suicide risk assessment and management: a systematic assessment of adherence to clinical guidelines," BMC Medicine, 2019.
[14] J. Davis, "HHS Clarifies HIPAA Liability Around Third-Party Health Apps," 29 April 2019. [Online]. Available: https://healthitsecurity.com/news/hhs-clarifies-hipaa-liability-around-third-party-health-apps.
[15] CDC, "HIPAA," 14 September 2018. [Online]. Available: https://www.cdc.gov/phlp/publications/topic/hipaa.html.
[16] K. M. Palmer and V. Burrows, "Ethical and Safety Concerns Regarding the Use of Mental Health–Related Apps in Counseling: Considerations for Counselors," Journal of Technology in Behavioral Science, 31 August 2020.
[17] Apple App Store, "BetterHelp Therapy," [Online]. Available: https://apps.apple.com/us/app/betterhelp-therapy/id995252384.
[18] BetterHelp, "Home," [Online]. Available: https://www.betterhelp.com/.
[19] "How much does therapy cost," [Online]. Available: https://www.goodtherapy.org/blog/faq/how-much-does-therapy-cost.
[20] M. Osberg and D. Mehrotra, "The Spooky, Loosely Regulated World of Online Therapy," Jezebel, 19 February 2020. [Online]. Available: https://jezebel.com/the-spooky-loosely-regulated-world-of-online-therapy-1841791137.
[21] P. Hess, "Privacy concerns about mental health apps highlight need for regulation," 2019. [Online]. Available: https://www.spectrumnews.org/news/mental-health-apps-highlight-need-for-regulation.
[22] R. Böhme and S. Köpsell, "Trained to accept? A field experiment on consent dialogs," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 2403-2406, 2010.
[23] K. V. Kreitmair, M. K. Cho and D. C. Magnus, "Consent and engagement, security, and authentic living using wearable and mobile health technology," Nature Biotechnology, vol. 35, no. 7, pp. 617-620, 2017.
[24] Veterans Administration, "PTSD: National Center for PTSD," [Online]. Available: https://www.ptsd.va.gov/appvid/mobile/#privacy.
[25] P. E. Ruskin, M. Silver-Aylaian, M. A. Kling, S. A. Reed, D. D. Bradham, J. R. Hebel, D. Barrett, F. Knowles III and P. Hauser, "Treatment Outcomes in Depression: Comparison of Remote Treatment Through Telepsychiatry to In-Person Treatment," American Journal of Psychiatry, vol. 161, no. 8, pp. 1471-1476, August 2004.
[26] E. Silver, "Therapy Apps," 12 July 2021. [Online]. Available: https://www.e-counseling.com/therapy-apps/.
[27] BetterHelp, "About Page," [Online]. Available: https://www.betterhelp.com/about/.
[28] TikTok, "TherapyDen," 2021. [Online]. Available: https://www.tiktok.com/@therapyden?referer_url=https%3A%2F%2Fwww.dailydot.com%2F&referer_video_id=7007091547917372677&refer=embed.
[29] C. Wong, "‘We don’t think this is a healthy therapeutic relationship’: Therapist exposes BetterHelp’s problems in viral TikTok," 22 September 2021. [Online]. Available: https://www.dailydot.com/irl/therapist-explains-problems-betterhelp-tiktok/.
[30] BetterHelp, "FAQs," [Online]. Available: https://www.betterhelp.com/faq/.
[31] OptimistMinds, "How Much Does Betterhelp Cost Monthly? (A Complete Guide)," 19 July 2021. [Online]. Available: https://optimistminds.com/how-much-does-betterhelp-cost-monthly/.
[32] BetterHelp, "Privacy Policy," [Online]. Available: https://www.betterhelp.com/privacy/.
[33] A. M. McDonald and L. F. Cranor, "The cost of reading privacy policies," I/S: A Journal of Law and Policy for the Information Society, vol. 4, no. 3, pp. 1-21, 2008.
[34] Windle, "Understanding informed consent: significant and valuable information," Journal of PeriAnesthesia Nursing, vol. 23, no. 6, pp. 430-433, 2008.
[35] N. Martinez-Martin and K. Kreitmair, "Ethical Issues for Direct-to-Consumer Digital Psychotherapy Apps: Addressing Accountability, Data Protection, and Consent," JMIR Mental Health, vol. 5, no. 2, p. e32, 2018.