New falsehoods have emerged to match every stage of the pandemic response. That misinformation has unquestionably been amplified on social media platforms as the world locked down in waves and citizens looked online for answers.

On Thursday, leaders from Facebook and YouTube joined a panel at the global conference of the Health Information Management Systems Society (HIMSS) to discuss the threat misleading information poses to the pandemic response. But even as vaccine misinformation continues to hamper vaccination efforts in the United States, the conversation failed to address the impact of falsehoods on these platforms head-on, focusing instead on the platforms' efforts to proactively share accurate, trusted Covid-19 information.

During the talk, Garth Graham, YouTube's director and global head of health care and public health, said the platform has removed more than 800,000 pieces of content "that were misaligned in terms of science." Similarly, Kang-Xing Jin, head of health at Facebook, said the company had removed over 18 million pieces of content on Facebook and Instagram that had been "debunked by public health experts and could also cause harm."

But when asked about their efforts to quell misinformation, both Graham and Jin put a far stronger emphasis on promoting reliable information. Graham used the metaphor of a garden, likening content removal to pulling out the weeds. “You’ve got to replace it with engaging things that people are looking for, because a lot of the time, people are searching for answers, and may be gravitating to the wrong things. You’ve got to make sure you have the garden supplied with engaging kinds of information.” 

Jin echoed the message, saying that “at the end of the day, people fundamentally want what’s best for themselves and their families, and it’s understandable that you might have some questions. And one of the best ways of addressing these issues is actually just helping people get those questions answered directly. Just removing this information alone isn’t going to address that need.” 

Jin pointed to efforts on Facebook and WhatsApp to promote vaccination through location-finding services, informational campaigns led by the World Health Organization, and features that let users easily express their support of vaccination. Graham referenced YouTube’s collaboration with influencers to spread the messages of trusted health experts.  

The issues skirted during the panel mirror the challenges the White House has reportedly encountered with Facebook as it called on social media platforms to reduce misinformation. 

This week, the New York Times described a series of meetings throughout the first half of the year in which the White House became increasingly frustrated with Facebook’s approach to addressing misinformation on its platforms, culminating in President Biden’s assertion, later walked back, that Facebook was “killing people.” 

A central tension in those meetings, many of which were reportedly attended by Jin, was that Facebook claimed it was unable to share detailed information on the ways misinformation is viewed and spread on the platform, the Times reported. 

Aggressively policing misinformation — information that is inaccurate, regardless of the intent when it’s shared — would require platforms to crack down on people who don’t know that what they’re sharing is wrong. In a rapidly evolving pandemic, they could be sharing old information, a fact they’ve misunderstood, or something that’s been colored by intense fear. Disinformation, which is intentionally shared to mislead, is somewhat easier to patrol. But both can have a dramatic impact on decision-making. 

Platforms have made attempts to define and cut down on potentially harmful content. Facebook’s first policies addressing coronavirus misinformation came at the end of January 2020. But it was forced to issue multiple updates throughout the year, reacting as misinformation spread not just through users’ posts, but in ads, pages, and groups. Also on the panel with Graham and Jin was Darius Walker, senior executive producer for CBSN, CBS News’ 24-hour streaming network, who described the network’s reporting on those policies as they evolved.

But misinformation continues to sprout more quickly than it can be weeded out. More proactive efforts, not just to promote accurate information but to anticipate the spread of falsehoods, should be a goal, said Hans Kluge, the WHO's regional director for Europe, in an interview with STAT's Casey Ross at HIMSS this week.

"My experience is that once people get emotional, evidence doesn't help anymore," said Kluge. "So we have to be a step ahead." He pointed to the WHO's Early AI-supported Response with Social listening (EARS) tool, which mines public information to get ahead of misinformation before it spreads widely online and elsewhere.

“Gone are the days of people expecting to get information from a flyer or a billboard,” said Graham. “These are vehicles now people are using to make life or death decisions.”  
