On the surface, Facebook’s recent settlement with the Federal Trade Commission says a lot about privacy. But if you use Facebook to share information about health, be warned: The settlement does not include health information as part of its carefully defined “Covered Information,” and releases the company from liability for mistakes it made with patient data.

For patient groups that have submitted complaints about Facebook privacy issues, this adds insult to injury.

The FTC first sued Facebook in 2011, alleging that the company misled consumers about its privacy practices. The resulting FTC ruling, finalized in 2012, required Facebook to obtain consumers’ express consent before sharing their information with third parties, along with other requirements intended to reinforce consumer privacy protections. In 2018, in the wake of the Facebook-Cambridge Analytica breach, the FTC began investigating Facebook again for alleged violations of the 2012 ruling. Last week, the FTC settled that investigation with a $5 billion fine against Facebook and an order intended to establish stricter privacy policies surrounding the company’s handling of sensitive “Covered Information.”


In the settlement, “Covered Information” includes 14 types of protected information, such as name, street address, telephone number, date of birth, and Social Security number. These overlap to a remarkable degree with the “Protected Health Information” defined in the HIPAA Privacy Rule (2002) to protect patients’ health data, with one key exception: “Health information” is conspicuously absent from the Facebook-FTC settlement. Health data is mentioned just twice, and only in passing as an example (once in the settlement order and once in the Department of Justice complaint). This omission leaves patient groups on Facebook vulnerable to ongoing surveillance and privacy violations by the company.

That’s a problem because Facebook’s interest in moving into the digital health market is no secret. While some patient groups formed organically on the site, many others were strategically cultivated through the company’s marketing strategies. As recently as the 2018 Facebook Developer Conference, Mark Zuckerberg described the Groups feature as a place to connect with others about “a disease you might have.”


Facebook actively promoted this feature through public statements by company leadership on the value of groups for managing opioid abuse, caring for transgender children, drug and alcohol addiction, and more. Patient groups have found these online communities to be powerful resources, but now feel trapped: They fear giving up the support and information they are able to exchange on Facebook but know they are being exploited as they do it.

The new FTC-mandated oversight does not prevent Facebook from collecting health information — it only requires the company to report what it plans to do with the information. There is no enforcement mechanism to prevent Facebook from gathering personal health information or even releasing it.

In theory, the added oversight could mean greater protections for health groups — but only if the company’s board of directors collectively decides to prohibit the company’s collection and monetization of users’ health data. As a dissenting statement from FTC Commissioner Rohit Chopra noted, the proposed settlement “imposes no meaningful changes to the company’s structure or financial incentives,” which led to the privacy violations in the first place.

The special case of patient groups and privacy on Facebook had been brought to the FTC’s attention in December 2018, when one of us (F.T.), health privacy lawyer David Harlow, and patient moderators of several Facebook groups submitted a complaint about privacy breaches in Facebook’s health groups. The complaint alleged that Facebook deceptively solicited patients to use groups to share personal information about their health issues and then marketed the resulting health data goldmine, exposing highly sensitive information to third-party data brokers. The complaint also provided detailed documentation of the privacy loophole that allowed third-party scraping of user data from closed cancer patient groups on Facebook.

The FTC apparently ignored this complaint; it is not referenced anywhere in last week’s ruling.

The new settlement says little about how Facebook will protect the privacy of health-related groups on the site, though the company’s role in collecting health information was acknowledged in two statements. In the Department of Justice’s complaint against Facebook, filed on behalf of the FTC, “membership in health-related and other support groups” was listed as an example of the types of personal information that are shared by users on Facebook. In the FTC’s 20-year settlement order, “health” was listed alongside “financial, biometric, or other similarly sensitive information” as an example of information that, when collected by Facebook, “presents a material risk to the privacy, confidentiality, or Integrity of the Covered Information,” and will therefore be subject to additional review under the new terms.

The FTC seems aware that Facebook continues to invest in health care data, but provides no special rules for how these data should be handled, the way the FTC specified for passwords, facial recognition, and telephone number data.

Like the problems with Facebook’s API and “Apps” that have been documented in relation to the Cambridge Analytica breach, the heart of the problem with groups is that they allow one Facebook user to make decisions about privacy for other users. For example, a group administrator can install third-party apps that access posts and comments without the other participants’ knowledge. In addition, the complaint documented a security vulnerability that allowed any Facebook user to download the membership lists of closed Facebook groups, including patient groups on sensitive topics that were attempting to protect their privacy through strict inclusion requirements. And under a previous loophole that has now been closed, a user who was not a group administrator could add other users to a group without their permission.

Before filing their complaint with the FTC in December, Trotter, Harlow, and the patient groups voiced their concerns directly to Facebook in an email. The Facebook administrator who responded told the group there was no problem and that the system, in Facebook’s view, “worked as intended.” When the group brought its concerns to the FTC, it was told that the FTC could not provide any information about its enforcement activities until the investigation was complete.

The investigation is now complete, and the FTC-Facebook settlement “resolves all consumer-protection claims known by the FTC prior to June 12, 2019” — including the December 2018 complaint stemming from the patient groups. In other words, despite the FTC’s failure to address health data in its ruling, Facebook has been released from liability for those known breaches.

In another dissenting statement, FTC Commissioner Rebecca Kelly Slaughter noted the gravity of this failure and expressed her skepticism that the terms of the settlement would result in meaningful changes to Facebook’s approach to data and privacy. “I cannot view the order as adequately deterrent without both meaningful limitations on how Facebook collects, uses, and shares data and public transparency regarding Facebook’s data use and order compliance,” she wrote, concluding that her “deepest concern with this order is that its release of Facebook and its officers from legal liability is far too broad.”

It is difficult to see Facebook’s settlement with the FTC as providing any new protection for the privacy of patient groups on the site. Instead, despite detailed documentation of previous and ongoing security breaches, Facebook refused to acknowledge the problem and the FTC ignored patients’ complaints.

Patients do, however, hold some power here. Since patient groups represent Facebook’s primary source of high-value health data, it may be time for a mass migration off the platform. Until tech companies and federal regulators can provide genuine, transparent protections and give patients the voice they deserve in setting privacy policies, sites like Facebook don’t deserve to hear patient voices at all.

Kirsten Ostherr is a media scholar and digital health technology researcher at Rice University. Fred Trotter is a health care data journalist, chief technology officer at CareSet Systems, and an unpaid volunteer for the Light Collective, a nonprofit organization formed by the online patient community to deal with Facebook privacy issues.