Misinformation experts say Facebook’s efforts to fight vaccine falsehoods have been too little, too late. Photograph: Dado Ruvić/Reuters

Misinformation 'superspreaders': Covid vaccine falsehoods still thriving on Facebook and Instagram


Researchers say big Facebook accounts are still churning out anti-vaccine posts, while anti-vaxxers banned from Facebook have fled to Instagram

Conspiracy theories and misinformation about the coronavirus vaccine are still spreading on Facebook and Instagram, more than a month after Facebook pledged it would take them down.

Under pressure to contain an avalanche of falsehoods, Facebook announced on 3 December that it would ban debunked claims about the safety and efficacy of vaccines now being distributed worldwide. The company said it removed more than 12m pieces of content from Facebook and Instagram between March and October, and that it worked with factcheckers to place labels on 167m more pieces of content over the same period.

But researchers say that big Facebook accounts, some with more than half a million followers and long histories of promoting falsehoods, are still openly churning out new posts questioning the vaccine. Meanwhile, prominent anti-vaxxers who have been banned from Facebook are continuing to spread misinformation to hundreds of thousands of people on Instagram, which Facebook owns.

The social network says it has limited the reach of some prominent anti-vaxx Facebook pages, and that few people are seeing some of the latest coronavirus misinformation. But misinformation experts say the platform’s actions amount to far too little, too late.

In a December report, the Center for Countering Digital Hate (CCDH), which has tracked the rapid growth of the anti-vaccine movement during the pandemic, argued that it was past time for tech platforms to take more aggressive action.

“Anything less than the dismantling of these individuals’ profiles, pages and groups and permanent denial of service, now they know what is happening, is willing acquiescence,” the report said.

Same misinformation, different platform

Even before the coronavirus pandemic, the World Health Organization had labeled “vaccine hesitancy” – the reluctance to get vaccines even when they are available – as one of the top 10 threats to global health.

Experts say the past year has brought a troubling escalation of an anti-vaccine movement that had already flourished on social media, where anti-vaxx activists had used private Facebook groups to convince mothers not to vaccinate their children, and to coordinate social media harassment campaigns against doctors who explained the medical benefits of vaccines.

Major anti-vaccine accounts on social media platforms have gained more than 10 million new followers since 2019, including 4 million additional followers on Instagram and 1 million on Facebook, according to an analysis by the CCDH.

The Pfizer and Moderna vaccines authorized for distribution in the United States in December each went through a series of rigorous clinical trials. More than 15,000 people received each vaccine in the final phase of those trials, and both vaccines were found to be more than 90% effective at preventing Covid-19, with no serious safety concerns identified. Since distribution began, a few recipients have had allergic reactions to the vaccines, but those incidents were not serious. All potential adverse reactions are being closely monitored as part of an intense, ongoing protocol for making sure new vaccines are safe.

But the fast timeline and intense political pressure to produce a coronavirus vaccine have left people around the world questioning whether they should trust the new vaccines and looking for honest answers – a situation anti-vaxx groups were well-prepared to exploit.

A nurse administers a Pfizer/BioNTech Covid-19 vaccine in Belgium. Photograph: Valentin Bianchi/AP

In October, leading anti-vaccine activists held a private online conference to strategize on how to use the public’s fears during the coronavirus pandemic to spread skepticism about vaccines, according to CCDH, which documented the conference speeches and conversations in a December report. At the conference, Del Bigtree, a prominent US anti-vaxx activist, summarized a three-point strategy for undermining public faith: “It’s dangerous. You don’t need it. And herd immunity is your friend,” he said, according to the report.

In July, YouTube removed Bigtree’s channel, which reportedly had more than 15m views, and in November, Facebook took down Bigtree’s Facebook page, which had more than 350,000 followers, for repeatedly posting Covid misinformation, according to a Facebook spokesperson.

But Bigtree is still operating an Instagram account with more than 212,000 followers, where he posts videos that regularly receive between 30,000 and 150,000 views.

One video on Bigtree’s account compares the CDC’s suggestion that people might wear “vaccinated for Covid-19” stickers to Nazi Germany, while others cherry-pick headlines and anecdotes to cast doubt on the vaccine’s efficacy and the speed of its rollout.

“We don’t know what kind of mutated viral experience is happening inside the person that’s gone and gotten the vaccine,” Bigtree says in one Instagram video, suggesting that if he saw someone in public who he knew had received the vaccine, he would “cross to the other side of the street”.

None of Bigtree’s recent Instagram videos have factchecking labels.

Researchers say that some of the most powerful anti-vaccine messaging operates by selectively presenting real data and anecdotes that foster doubt, rather than sharing explicitly false claims.

“The trick with vaccine hesitancy: it’s not always misinformation. It’s not always things that are demonstrably untrue. It’s stuff that makes you question and doubt,” said Kolina Koltai, a postdoctoral fellow at the University of Washington’s Center for an Informed Public who has studied anti-vaccine activists on social media since 2015.

Other activists identified by researchers as “super-spreaders” of coronavirus vaccine misinformation also had their Facebook accounts removed this year, but continue to operate Instagram accounts with hundreds of thousands of followers, including the British conspiracy theorist David Icke and American anti-vaxx activist Sherri Tenpenny.

“A lot of the accounts that were removed from the Facebook platform remain active on Instagram, with enormous follower counts,” said Anna-Sophia Harling, head of Europe for NewsGuard, a company that rates the accuracy and trustworthiness of news websites and produces public reports on social media vaccine misinformation. “Instagram has a huge Covid-19 vaccine problem.”

Asked why Bigtree, Tenpenny and Icke were still using Instagram after violating Facebook’s policies, a Facebook spokesperson said that their Instagram accounts had not yet violated Instagram’s policies enough times to be taken down, and that Facebook violations do not count towards the removal of Instagram accounts, even when the same people operate both.

‘Super-spreaders’ of misinformation

While platforms like Pinterest have implemented strict zero-tolerance policies for anti-vaxx propaganda, Facebook has long refused to ban all forms of anti-vaccine activism. The company has argued, as it did previously with issues like Holocaust denial, that banning false claims that exist elsewhere on the internet is fruitless, and that claims about vaccines should remain on the platform to be debated and factchecked.

But as the death toll from coronavirus soared in 2020, and Facebook’s platform became a recruiting and organizing tool for protests against public health measures and even for new US domestic terrorist groups, the company began to take more aggressive action against misinformation linked to real-world harm.

Facebook said in early December that its new policy was born out of concern that Covid vaccine misinformation could lead to “imminent physical harm”, and pledged to remove claims from Facebook and Instagram that experts had identified as false.

The company says it is also continuing to limit the reach of groups and pages that spread anti-vaxx misinformation, so that fewer people encounter the content. It also touts its efforts to connect users with authoritative information on Covid-19 from health officials, saying that more than 600 million people have clicked on pop-ups on Facebook and Instagram to learn more from official sources.

Researchers say false claims are still easy to find on Facebook. Photograph: Geoff Smith/Alamy Stock Photo

But a month after Facebook launched its aggressive new policy, researchers who study anti-vaccine activism say that false claims are still easy to find, and that many posts with misinformation do not have any additional warning labels.

In late November, researchers at NewsGuard identified 14 large public English-language Facebook pages as “super-spreaders” of coronavirus vaccine misinformation. Twelve of those pages were still active in late December, said John Gregory, NewsGuard’s deputy editor for health news. He added that the majority of the individual vaccine misinformation posts flagged in the November report were also still live on the site, without any factchecking label.

A Facebook spokesperson said that all of the pages flagged in the NewsGuard report were already facing consequences for posting material repeatedly flagged by Facebook’s factcheckers. The distribution of their posts into Facebook’s news feed had been dramatically reduced, meaning that fewer people would see them, the spokesperson said, and the pages were no longer being recommended to people who did not already follow them.

NewsGuard researchers had noted that Worldtruth.TV, a Facebook page with 1.5 million followers, shared false claims about vaccines more than 100 times over the summer, including claims that the vaccines would use “microchips” as part of a global tracking system and would “alter” human DNA.

The spread of these kinds of conspiracy theories appears to be having real-world consequences. In Wisconsin, a pharmacist told police he had tried to destroy hundreds of doses of coronavirus vaccine because he believed the shots would mutate people’s DNA, according to court documents released on Monday.

Throughout December, even after Facebook’s new policy announcement, Worldtruth.TV continued to post memes and conspiracy theories about coronavirus. Some referred to the vaccine as the “mark of the beast”, a reference to the biblical book of Revelation and a widespread conspiracy theory promoted most prominently by Kanye West. Several of the posts were explicitly antisemitic.

A Facebook spokesperson said that the post referring to the coronavirus vaccine as the “mark of the beast” did not violate company policies, and that it was also not eligible for factchecking. The spokesperson added that Facebook was only removing claims about the coronavirus vaccine that had officially been debunked by health authorities, and that this was an evolving process.

NewsGuard also flagged continuing false claims about the coronavirus on GreenMedInfo, a Facebook page with more than 500,000 followers that has repeatedly been linked to health misinformation. In early December, the page was the first to publish a story falsely claiming that “Pfizer’s vaccine had killed two people in the vaccine trial”, Gregory said. In fact, those two deaths had not been linked to the vaccine.

While posts on both Worldtruth.TV and GreenMedInfo have received relatively little engagement – a sign, Facebook says, that its efforts to limit distribution are working – researchers say it is frustrating to see a continuing tide of falsehoods from the same “bad actors” who have been at work since the pandemic began. “These are not new actors in the misinformation space,” says Gregory. “They didn’t pop up yesterday.”

Instead of playing “whack-a-mole” with each new false claim, Gregory says Facebook should take proactive action against accounts based on their history of pushing lies.

“You don’t have to wait for them to publish another vaccine claim that will take a few days for a responsible journalist to address, and then slap a factchecking label on it,” he says. “You know what these pages are going to do beforehand.”
