From individuals to employers, there’s growing interest in using digital services to help people work through mental health issues. It’s a solid approach, given how difficult and costly it can be to find and work with an in-person therapist. Yet digital therapeutics currently represent a kind of “Wild West.” Gatekeepers need to be more methodical in how they assess these products.
Amid a strong backlash against tech, with congressional scrutiny on topics ranging from privacy to online content moderation, many tech companies are now asking people to trust them in new and more intimate ways as they move into health care. Some in digital health have called for change from the inside, but little is being said about how those outside the industry can help ensure that users of these new technologies are protected from the risks digital health poses to them.
Without outside oversight, some companies will take shortcuts. I saw the risks and dangers of this at the company I co-founded, Modern Health, which were recently reported by The Information. Some startups seem willing to risk patient safety and provide inadequate care in hopes of getting rich quick.
I believe these problems can be solved, but not by trusting tech companies to police themselves. Here are key solutions, rarely discussed, that put oversight in outside hands.
App stores must enforce standards
At the highest level, review processes in the major app stores need to be strengthened. Huge numbers of apps in the Google and Apple stores have dodged Food and Drug Administration regulations that aim to ensure patient safety by listing themselves in the “health and fitness” category of app stores, which does not require FDA review, instead of identifying themselves as “medical” apps, which the FDA does oversee.
By avoiding more rigorous review, health and fitness apps can often cause problems such as health privacy breaches — or worse. A 2019 study of apps for suicide prevention, for example, found that suicide helpline phone numbers were often inaccurately listed, or weren’t listed at all. Those apps had already been downloaded more than 2 million times. It’s not hard to imagine that lives might have been lost.
The Apple and Google business models for mobile devices let these companies earn substantial commissions from each app purchased. So these two gigantic vendors should also bear the responsibility to categorize apps correctly. Startups will never volunteer for more rigorous app store review processes, so the gatekeepers — Apple and Google — must perform that job in addition to profiting from it.
Business, HR leaders face serious new risks
Oversight of digital health apps should also come from the human resources departments of companies that make decisions about what kind of digital health services to purchase and provide as benefits to their employees. Many digital health apps, while aimed at individual consumers as end users, have business models built around large and small companies buying apps for their workers. This market is currently estimated at an eye-popping $20 billion.
When I worked at Modern Health, companies’ human resources leaders would buy the service for their employees, and I saw many of those employees become patients of ours. In the early days, I was often the first person patients would meet, helping them connect with care; later, I built systems to route them to appropriate treatment. Many were facing difficult challenges; some were suicidal. Because some workers have serious mental health problems, human resources teams need to up their game and apply rigorous scrutiny when selecting digital health services.
Until now, choosing health insurance has been a relatively low-risk choice because the offline health care system is heavily regulated by the government to ensure safety. Medical treatments begin in research labs that are vetted by ethics boards, tested in clinical trials, and finally approved by the FDA. In the digital world, a new health platform might be cooked up in someone’s garage and put online with no external review at all. To be sure, some of these types of apps throw around terms like “evidence based,” but this is often meaningless.
It is essential, though challenging, for HR teams to make smart choices, because old methods of due diligence don’t work. Many companies simply want to know whether a digital health service will be widely adopted and heavily used by employees, but that’s like judging a doctor by how often patients have to keep coming back. How often employees use a service isn’t the same thing as getting healthy, and high usage is often driven by light-touch offerings, such as emailing users self-help articles or offering meditation courses, that don’t address core health issues. Worse, these light but high-engagement treatments can hinder the real care people need to recover, and may even discourage them from seeking proper treatment.
While HR departments’ existing methods of evaluation, where they exist, don’t work well for digital health, the scientific and medical communities have a solution: peer-reviewed research, published in established scientific journals, that tests the app in question. Even a quick look at such papers, which quality companies can provide to potential buyers, offers some assurance that the service being bought does what it claims.
Corporate buyers should reach out to doctors or other health experts, ones not affiliated with the app being evaluated, and ask for help interpreting independent analyses. An app’s performance once it has been deployed into the workforce should be judged with the same rigor: the main metric shouldn’t be how many people use the app but how well it works.
Just as doctors alter their practices as medicine changes, HR departments need to evaluate new research and respond accordingly when deciding to keep a digital health service or make a switch.
Give patients a protected voice
It can be scary for some individuals to post reviews about an app that essentially acknowledge they have a serious mental health problem that the app has or hasn’t made better — or made worse. The same holds true for those with diabetes, or cancer, or most any other condition.
But this feedback is sorely needed. That’s why Apple and Google, as app store owners, along with digital health app developers and HR teams, need to build better feedback mechanisms that protect users’ privacy while still giving them a voice.
Reviews and complaints also need to be permanent. Apple’s App Store currently allows companies to wipe out all reviews each time a new version of an app is released. That may make sense when bug fixes or new features are added to simple apps. But when an app is connected to real-world care, digital therapeutic companies should not be allowed to erase records of serious medical failures just because they changed the color of a couple of buttons.
It should be a bright red flag if an app has far fewer ratings than one of its similar-sized competitors.
We must work to get it right
Health care, and mental health care even more so, is challenging to get right.
It’s difficult to be critical of mission-driven digital therapeutic companies, especially when so many people are suffering. That said, it’s exactly because of the need for better physical and mental health care that we as a society must collectively work to get it right. Ultimately, the public and tech are all on the same side.
Most tech companies want to do the right thing, just as much as everyone else hopes they will. Society stands to benefit tremendously from innovation in tech, especially health tech. But the stakes are rising, and these companies need to act like it.
Erica Johnson is an advisor to several digital health and wellness companies and a co-founder of Modern Health.