The most powerful forms of deception rely more on emotional manipulation and misdirection than outright lies. That’s what I’ve observed in nearly a year of research into the murky world of medical misinformation.

Take the episode of “The Joe Rogan Experience” podcast that prompted music legends like Joni Mitchell and Neil Young to remove their music from Spotify, where Rogan is the most popular podcaster. The guest on that episode, medical researcher Robert Malone, created a distorted picture of alleged vaccination dangers with a combination of anecdote, cherry-picking, innuendo, and wildly improbable speculation — not deliberate lies.


Whether Malone’s juggling is called misinformation or something else, the resulting confusion can lead people to make the fatal decision to remain unvaccinated. In media interviews and social media posts, unvaccinated people who’d been infected with SARS-CoV-2, the virus that causes Covid-19, and fought for their lives in intensive care urged others not to repeat their mistake. Some explained to interviewers that they’d been so distracted by hype about side effects that they forgot to think about the danger of the disease itself.

I began wrestling with the concept of medical misinformation when I proposed it as the topic of a fellowship offered by the Society for Professional Journalists. I’ve talked to various infectious disease experts about confusing rules, fuzzy forecasts, breakthrough cases, and the implications of gain-of-function research. I’ve talked to psychologists and computer scientists examining why people spread misinformation on social media, to historians looking at how we decide what constitutes legitimate science and what constitutes pseudoscience. Going forward, I hope to interview more regular people in different regions about their experiences and how they’ve formed their views.

I’d previously thought of medical misinformation as pseudoscience promoted by those selling alternative remedies, fad diets, or the like. That still exists. But it’s comingled with a larger body of politically motivated misinformation.


Political scientist and evolutionary psychologist Michael Bang Petersen told me that spreading malicious rumors about one’s enemies is deeply rooted in humanity’s affinity for groups. Social media has not only polarized people on matters related to the pandemic, it’s provided an easy conduit to spread confusion.

But the public confusion is driven mostly by manipulation, slant, spin, and low-quality information rather than outright lies. Changes in the media landscape encourage such deceptions to flourish at the expense of honest reporting.

While traditional media isn’t perfect, most publications maintain accountability to readers and those written about. Social media works through algorithms set to amplify whatever captures attention, and new policies to fact-check and censor content without any transparency only increase social media companies’ power. But better alternatives exist to fact-checking and censorship in dealing with the array of problems labelled misinformation.

Fact-checking doesn’t lend itself to scientific ideas, since science isn’t a set of facts. Instead, it’s a system of investigation, a dynamic interplay of data and theories. And it’s not always easy to draw a distinct line between a legitimate minority opinion and fringe science or pseudoscience. Fact-checkers could easily introduce their own political biases and extinguish innovative concepts and diversity of thought. During the pandemic, the word “misinformation” has been hurled around to describe ideas people disagree with for political reasons.

Rogan’s three-hour-long conversation with Malone — which I listened to from start to finish — provided a textbook case of the complexity of misinformation. It also poses a challenge for those urging more public trust in scientists.

Malone is a scientist. His research in the 1980s and 1990s is recognized as legitimate and important. But on the podcast, he deployed the deceptive and manipulative tactics common to those promoting pseudoscience. He’s not the first scientist to do so.

Malone repeatedly cited connections and alleged insider status with the U.S. Department of Defense and the FDA. He claims to have invented mRNA vaccine technology, though in reality he is acknowledged as one of many who contributed to it.

For all the connections and accolades Malone alleged, I was surprised how few of the well-known members of the infectious disease research community had heard of him, even after all the controversy. Last week at a Covid-19 press conference hosted by Harvard Medical School, the University of Massachusetts, and other collaborators, I asked what people thought of Malone. The response was blank looks and some hurried Google searching.

Rogan’s listeners, however, were led to think they were seeing science’s brightest star. Some may have thought they weren’t quick enough or smart enough to follow Malone’s logic, when in reality the logic was full of holes.

When he did use data, it didn’t clearly support his claims — which included the notion that Covid-19 vaccines had caused premature menopause, and that they might make the disease worse through a mechanism called antibody-dependent enhancement.

There are real data showing that some women missed periods after getting the vaccine, and I’ve talked with scientists who say that in animal studies some of the spike proteins generated by the vaccine can diffuse to different parts of the body — but there’s no evidence they do any harm there. Malone also belabored what he saw as an effort to cover up the importance of the anti-parasitic drug ivermectin, which has been the subject of dozens of studies and hasn’t been shown to be effective against Covid-19.

I’ve covered almost every topic Malone brought up in my own podcast, “Follow the Science.” But the scientists I’ve interviewed have given more detailed arguments and drawn different conclusions. In general, scientists making a legitimate effort to promote understanding will show how they’ve drawn a certain inference from sets of data, and will often bring in other lines of evidence such as basic biology or chemistry and plausible mechanisms.

One of my favorite podcast guests is medicinal chemist Derek Lowe, who has used chemistry to help explain why ivermectin probably doesn’t work against viruses, and why vaccines are very unlikely to make Covid-19 worse, even though that has happened in some rare cases with other vaccines.

Twitter “de-platformed” Malone in December. If anything, that attempt at censorship only increased his mystique. And Malone likely wouldn’t have become famous — or infamous, depending on your point of view — if social media hadn’t amplified his most provocative anti-vaccine claims, ones that hadn’t sufficiently impressed mainstream colleagues or science journalists.

Comedian Jon Stewart blamed social media algorithms for our misinformation problem — and a number of researchers agree. Studies by computer scientists have shown that experimental automated accounts get buffeted toward extreme content and polarized bubbles and that the algorithms in charge amplify news items without regard to real importance or accuracy.

People do care about accuracy and have not given in to a post-truth world, social scientists David Rand of MIT and Gordon Pennycook of Canada’s University of Regina told me in an interview. They collaborated with colleagues on a study published in Nature showing that people really want to share accurate information but give in to the temptation to share juicy bits of gossip they think will please their friends or make them look good.

Rand himself admitted he gave in to the temptation, sharing a tweet claiming Ted Cruz had declared he’d believe in global warming when Texas froze over. That meme went around when Texas was having unprecedented snowstorms and freezing weather. “It was just too delicious,” Rand told me.

In experiments, he and Pennycook showed that if they simply asked volunteers to rate a headline for accuracy, the prompt improved the accuracy of their social media sharing for the rest of the day.

Even more intriguing was their finding that asking 10 or 12 people to independently rate the accuracy of tweets produced results that agreed with professional fact-checkers’ assessments about as well as the fact-checkers agreed with one another. Crowdsourcing can work as fact-checking, but it’s known to work only if individuals within the crowd think independently. Social media encourages the opposite: forming opinions based on what other people are saying.

Instead of using crowdsourcing to flag content for censorship, platforms could deploy it to elevate the posts most likely to be accurate. There’s nothing organic today about the way algorithms decide who and what will be popular. Why not change them in a way that rewards accuracy and puts power back into the hands of the people?
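To make the idea concrete, here is a minimal sketch of how such a ranking might work, averaging independent crowd ratings and blending them with engagement so that widely shared but low-rated posts get demoted. All the data, field names, and the scoring formula are invented for illustration; real platforms would need far more safeguards (rater independence, sample sizes, manipulation resistance).

```python
# Hypothetical sketch: rank posts by engagement weighted by
# crowdsourced accuracy ratings, rather than engagement alone.
from statistics import mean

def accuracy_score(ratings):
    """Average of independent accuracy ratings on a 0-1 scale,
    e.g. from 10-12 randomly chosen raters."""
    return mean(ratings) if ratings else 0.0

def rank_posts(posts):
    """Sort posts so that high engagement alone can't outrank
    content the crowd independently judges inaccurate."""
    return sorted(
        posts,
        key=lambda p: p["engagement"] * accuracy_score(p["ratings"]),
        reverse=True,
    )

posts = [
    # A viral rumor: lots of engagement, low crowd-rated accuracy.
    {"id": "rumor",  "engagement": 900, "ratings": [0.1, 0.2, 0.0, 0.1]},
    # A careful report: modest engagement, high crowd-rated accuracy.
    {"id": "report", "engagement": 400, "ratings": [0.9, 0.8, 1.0, 0.9]},
]

ranked = rank_posts(posts)
print([p["id"] for p in ranked])  # the accurate post outranks the viral rumor
```

The design choice here mirrors Rand and Pennycook’s finding: the accuracy signal only means something if the raters judge independently, so a real system would sample raters who haven’t seen each other’s scores.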

The message from Malone’s conversation with Rogan that seems to have gotten the most attention is that people were hypnotized by what he called mass formation psychosis. That resonated — even if nobody knew quite what it was. The world has become so polarized that it’s easy to imagine those on the other side of the left-right divide are experiencing some alternative reality.

People are being manipulated by the sources they count on for information — social media feeds. But we don’t have to shut them down to regain control.

Faye Flam is a science journalist and columnist for Bloomberg Opinion and host of the “Follow the Science” podcast.
