IREHR Special Report: FACEBOOK and COVID DENIAL

Chapter Three

Facebook and the Far-Right Group Problem

After the 2016 election fiasco and the scandal around Cambridge Analytica’s Facebook usage, significant changes were afoot at Facebook.[41] In retrospect, the most significant change was the “pivot” away from the company’s focus on an individual’s personal newsfeed towards making Facebook groups a centerpiece of platform activity.

According to Facebook, “Groups provide a place to connect with people who share your interests.”[42] While Facebook Pages are public-facing profiles representing specific organizations that anyone can access to view information, groups are closed silos for organic discussion amongst multiple individuals.

A Feature, Not a Bug

For those concerned about far-right growth, Mark Zuckerberg’s 2017 post “Building Global Community,” which outlined the pivot away from the newsfeed toward groups, should have been a tip-off that trouble lay ahead.

Disturbingly, Zuckerberg chose to lift up the Tea Party’s use of the platform as his vision for the future. “Sometimes people must speak out and demonstrate for what they believe is right. From Tahrir Square to the Tea Party — our community organizes these demonstrations using our infrastructure for events and groups,” he wrote.[43]

Zuckerberg was entirely correct that the Tea Party used, and continues to use, Facebook to organize groups and events. Absent, however, was any recognition that the Tea Party built a mass movement grounded in racial resentment and fears of white dispossession, a movement that has done lasting harm to the country’s social fabric. (For more, see Tea Party Nationalism.)

Facebook’s Fail Cycle

Facebook has a sordid history of failing to respond promptly to the spread of the far right on the platform: incubating white nationalist groups, hosting militia recruiting efforts, serving as the infrastructure that built out the Proud Boys, and enabling the rapid spread of QAnon.

In each case, far-right groups used the platform to attract a critical mass of followers. Facebook was repeatedly warned about the problem. Yet rather than proactively addressing these dangers, Facebook waited until public criticism grew overwhelming. In the case of white nationalists, it wasn’t until after the Charlottesville “Unite the Right” rally, at which a white nationalist killed Heather Heyer, that Facebook took action.

In the wake of Charlottesville, Facebook developed a profoundly flawed internal “policy clarification” on “organized hate groups.” According to materials obtained by Motherboard, the Facebook clarification asserted, “we don’t allow praise, support and representation of white supremacy as an ideology” but stipulated that it does “allow praise, support and representation” for both white nationalism and the euphemistic “white separatism.” [Emphasis in original][44] The Facebook document added, “White nationalism and calling for an exclusively white state is not a violation of our policy unless it explicitly excludes other PCs [protected characteristics].”[45] Unfortunately, such a poor understanding of the problem on the platform invariably led to bad decisions.

Even though the racist, misogynist, violent Proud Boys were also participants in Charlottesville, Facebook took even longer to act on their use of the platform. For nearly two years, Facebook was the central recruiting platform for the Proud Boys. They used Facebook to organize local chapters, schedule events, and vet prospective members.[46]

Despite a two-year record of racism and violence, it wasn’t until the end of October 2018, after a series of high-profile clashes that garnered negative international media attention, that Facebook began removing the groups used by the Proud Boys.[47] By then, the Proud Boys had already had two years on the platform to expand into an international far-right street-fighting organization. Unfortunately, Facebook’s actions were too little, too late. From the bloody streets of Portland to the insurrectionist-trashed corridors of the Capitol Building in Washington, the ramifications of that inaction still reverberate.

Facebook repeated this pattern with followers of the QAnon conspiracy. Though it emerged in 2017 on the Internet’s fringes, QAnon quickly found a home on Facebook. QAnon adherents believe in the baseless internet conspiracy theory that Donald Trump is going to save the world from a “Deep State” cabal of Satan-worshipping Democrats, Hollywood celebrities, and billionaires who run the world while engaging in pedophilia, human trafficking, and the harvesting of a supposedly life-extending chemical from the blood of abused children. Though it has many wild conspiracy cul-de-sacs, QAnon has roots in much older antisemitic conspiracy theories.

As early as 2018, QAnon was associated with violent acts, including an armed standoff near the Hoover Dam, a kidnapping plot and two kidnappings, and at least one murder.  Several QAnon followers were also arrested for involvement in the January 6 Capitol insurrection.

Eventually, the company established the Counterterrorism and Dangerous Organization Team to operate within the Product Policy organization at Facebook.[48] Unfortunately, as the team’s name suggests, their work appears limited to the small subset of groups that openly advocate or organize violence on the platform. It does not include far-right threats to human rights, democracy, or public health. Much like the white supremacy/white nationalism fiasco, Facebook’s overly narrow definition of the problem made it virtually impossible to solve.

Once again, Facebook chose a half measure in response to the threat QAnon posed to the platform. After two and a half years of QAnon devotees burrowing into the platform, Facebook adopted a policy in August 2020 that removed only those QAnon-related accounts that specifically discussed violence. The policy resulted in the removal of more than 790 groups and 100 pages linked to QAnon. It also blocked 300 QAnon hashtags and took down 1,500 advertisements. Facebook also identified an additional 1,950 groups, 440 pages, and more than 10,000 Instagram accounts linked to QAnon that were restricted but not removed.[49] Even so, these measures did not slow the spread of QAnon.

Just one month before the 2020 elections, Facebook finally announced it was banning all QAnon accounts from its platforms. By that time, however, the relationships and networks were already established. Out of those networks came some of the initial “Stop the Steal” push to overturn the November 2020 election (to which Facebook was also slow to respond).

According to a December 2020 report by the Institute for Strategic Dialogue and the nonpartisan news-rating organization NewsGuard, the Facebook efforts again failed. So-called QAnon “superspreaders” continued to be active on Facebook, even after the company banned the conspiracy movement from the platform. “Even after the ban, personal Facebook profiles — many with large followings — continue to discuss and promote the conspiracy on the platform.” The superspreaders, the report noted, “were found to be a key link in the conspiracy chain.”[50]

For over a decade, Facebook was a recruiting ground and a weapons bazaar for far-right paramilitary groups. Militia, Three Percenter, Oath Keeper, “patriot,” and many other groups used the platform for propaganda, gun trading, backwoods paramilitary training, prepping, and plotting. On August 19, 2020, when Facebook rolled out its “Militarized Social Movements” policy, the company announced a crackdown on QAnon, militia groups, and “offline anarchist groups” if they “discuss potential violence.”[51] At the end of September, Brian Fishman, Director of Counterterrorism and Dangerous Organizations at Facebook, announced that the company had identified more than 300 organizations and removed more than 6,500 groups and pages from the platform. Unfortunately, Facebook was not transparent about the results, neglecting to list the groups removed or even provide a breakdown of the categories involved.

Inexplicably, under the same policy, the Facebook team also removed the anti-fascist news and research site, It’s Going Down, and other anti-fascist groups. The move appeared to be an attempt by the company to appear “neutral” and mollify conservative critics.

Long before the pandemic, Facebook was fully aware of the problem of vaccine misinformation on the platform. After YouTube and Pinterest announced plans to deal with anti-vaxxers, Facebook unveiled its first policy to combat misinformation about vaccines in March 2019. Facebook chose an approach similar to how the platform addressed the problem of “fake” news. Rather than remove misinformation, the company aimed to reduce the reach of vaccine misinformation by making it “harder to find.”

The plan relied on leading global health authorities, such as the World Health Organization and the US Centers for Disease Control and Prevention, to publicly identify “verifiable vaccine hoaxes.”[52] Monika Bickert, Facebook vice president of Global Policy Management, wrote, “If these vaccine hoaxes appear on Facebook, we will take action against them.”[53]

As with all Facebook announcements, analyzing the wording of their carefully crafted statement is essential. In this case, the statement hinted that actions would be minimal, nibbling away at the edges of the much larger anti-vaxxer problem infecting the platform.

A few months later, Facebook rolled out the “pop-up” strategy.

Posts, pages, or groups flagged for excessive vaccine misinformation triggered a pop-up window asking visitors: “Looking for Vaccine info? When it comes to health, everyone wants reliable, up-to-date information. Learn why the World Health Organization (WHO) recommends vaccinations to prevent many illnesses.” However, it is not clear how effective this method was in deterring anti-vaccine activity on the platform.

Then came the pandemic.

Facebook Groups and COVID Denial

When it came to the pandemic and COVID denialism, Facebook repeated the mistakes of the past. At each turn, Facebook took an overly narrow approach that avoided addressing the root of the problem. As a result, COVID denialism continues to fester on the platform, and the pandemic persists.

In March 2020, as the pandemic was ramping up in the United States, Facebook announced that the platform had taken action. However, that action was limited primarily to false claims about COVID cures, misinformation about essential services and outbreak severity, and other “misinformation that could cause immediate physical harm.”[54] For everything else, Facebook outsourced the decisions.

“For claims that don’t directly result in physical harm, like conspiracy theories about the origin of the virus, we continue to work with our network of over 55 fact-checking partners covering over 45 languages to debunk these claims,” wrote Facebook’s Vice President of Global Affairs and Communications, Nick Clegg.[55]

For the United States, fact-checking partners consisted of only AFP, the Associated Press, Check Your Fact, The Dispatch, Factcheck.org, Lead Stories, Science Feedback, Reuters Fact Check, and USA Today.[56] No organizations with expertise in misinformation, or otherwise equipped to analyze how the far right might weaponize misinformation about the virus, were included as partners. Instead, fact-checking partners were tasked with essentially acting as referees, calling balls and strikes on misinformation.

Facebook’s fact-checking process concentrated on articles published outside the platform, not on user-generated content within Facebook. In May 2020, for instance, Facebook bragged that independent fact-checkers had put warning labels on around 7,500 articles.[57] Yet misinformation was rampant on the platform.

The focus on “misinformation” is, itself, misleading. It places an emphasis exclusively on the spread of false information, as though misinformation acts by itself in a vacuum. It tends to obscure the human element of both individual and collective action. The misinformation framework also shifts focus to a true/false binary, thereby eliminating discussion of the role of ideology or conspiracy theories. Finally, it sidesteps crucial political questions about how the platform may be hindering efforts to stop the spread of a pandemic.

A misinformation framework centers on false information that is spread regardless of intent to mislead. Collective action grounded in misinformation or conspiracy theories poses an even more significant challenge. There is little evidence that misinformation by itself can create a social movement. Instead, far-right movements generate conspiracy theories and misinformation to build mythology and further movement aims.

Public health officials have noted that accurate information is vital to the public, particularly during the early stages of a pandemic. Facebook’s position during this early period was to tweak rather than address the problem head-on. Even when fact-checkers determined that something was misinformation, Facebook made a choice not to remove it. According to Facebook’s Nick Clegg, “Once a post is rated false by a fact-checker, we reduce its distribution so fewer people see it, and we show strong warning labels and notifications to people who still come across it, try to share it or already have.”[58]

Clegg’s announcement did not mention the growing number of groups popping up on the platform dedicated to COVID denialism. These groups relied on the platform to find followers and organize COVID denial events. By April 2020, IREHR identified over 200 COVID denialist groups with over 1.2 million members. Once individuals joined these groups, the limited policies Facebook put in place became almost meaningless.

The structure of these Facebook groups also short-circuited Facebook’s attempts at promoting accurate information about the pandemic. The reinforcing conspiracy-think of many members makes it difficult for any challenging information to get through, a problem compounded by the way these forums foster an insulated and “trusted” community of the like-minded.

Facebook groups are virtually impenetrable to Facebook’s own COVID misinformation efforts. Facebook’s approach centered on cutting off outside misinformation before it entered the platform. Facebook groups stand that dynamic on its head: the COVID misinformation comes from inside the platform.

Facebook groups create an intimacy of connection between individuals, more like a peer-to-peer messaging app than a newsfeed. Indeed, Facebook promoted Groups as intimate, trusted, private spaces that create community.[59] Unfortunately, COVID denial groups are weaponizing these trusted spaces.

Rather than some outside article serving as a reliable source of information, this closeness of contact creates trust in other group members. As a result, people receive and share information directly to and from others who have become their closest contacts and their “trusted” sources of information. These cognitive bubbles silo users in a never-ending deluge of misinformation. Groups create an uninterrupted feedback loop, continually reinforcing misinformation and increasing radicalization.

As Nina Jankowicz, a disinformation fellow at the Wilson Center, and Cindy Otis, a senior fellow at the Atlantic Council’s Digital Forensic Research Lab, noted, “despite the company’s recent efforts to crack down on misinformation related to COVID-19, the Groups feature continues to serve as a vector for lies.”[60]

For COVID denial Facebook group members, there are a plethora of spaces where individuals can go to have an entire community agree with them that, for instance, masks are the worst-ever assault on freedom. In these spaces, there is little criticism or dissent. Instead, declarations of such sentiment get repeatedly liked, cheered, and shared. There is also often a dynamic in play where the sentiment builds upon itself. So a statement that “masks are terrible” is one-upped by a reply that “masks are slavery,” which gets a response that “masks are like the Holocaust,” and so on. There are no brakes on the Facebook group radicalization train.

On top of this, Facebook tweaked the platform to “increase engagement” and get more users into groups. “Related Discussions” push material from other COVID denial groups into users’ news feeds, exposing them to more COVID denial groups and additional sources of misinformation. In addition, Facebook’s recommendation engines, “suggested groups,” and the site’s search results further drive people to more militant groups they might not otherwise find.

The Private Group Problem

On Facebook, there are two different types of groups. Public groups are available for anyone to join, content from the group is visible to anyone on the platform, and content appears in Facebook searches.

Private groups, on the other hand, are shielded from the public eye. Users have to be let into a private group by an administrator. Vetting questions are often asked of prospective members to screen out people who might challenge the prevailing wisdom inside the group. Content from private groups is inaccessible to non-group-members. These settings make tracking the full extent of far-right activity in COVID denial Facebook groups more complicated.

Taking It to the Streets

Zuckerberg’s earlier Tea Party comment is also an important reminder that what happens on Facebook doesn’t stay on Facebook. There are real-world consequences when far-right movements are incubated on Facebook and unleashed on the world. Groups also make it easy to organize events, rapidly moving COVID denialism off the platform and into the real world. From attacks on hospitals and doctors, to threats against teachers and school board members, to organized efforts to block policies aimed at curtailing COVID-19, Facebook groups are helping create a toxic public health environment. (More on these real-world impacts in section three.)

At the peak of the first wave of Facebook COVID denial groups, IREHR tracked 1,186 groups with 3,032,085 members. These Facebook groups were where large COVID denial rallies were organized, including armed stormings of state capitol buildings, the hanging of elected officials in effigy, and even threats to kidnap and murder public officials.

Gaming the System

When it comes to the problem of far-right infestation of the platform, in many ways, Facebook is mired in static thinking in a dynamic world. Far-right groups are constantly evolving and adapting to circumstances.

In physics, there is a concept known as the “observer effect” (often confused with the Heisenberg uncertainty principle): certain systems cannot be measured without affecting them, that is, without changing something in the system. In the case of Facebook, external events and internal efforts to counteract misinformation have changed the COVID denialist system on the platform.

This problem is not new for Facebook. In the summer of 2020, for instance, two men met on Facebook in a far-right paramilitary Boogaloo group. The two are accused of murdering a federal officer in Oakland, and one of the men is also charged with killing a Santa Cruz sheriff’s officer. Facebook was widely criticized for being slow to react.[61] In response, Facebook banned the term “Boogaloo.” However, it took all of one day for Boogaloo activists to adapt, choosing new and different keywords to game the system.

This pattern is also playing out among COVID denial groups. Several have changed their names to avoid tripping Facebook’s automated systems. Others have changed names to reflect the changing sentiment of the group. For instance, Floridians Against Excessive Quarantine changed its name to Patriot Floridians for FREEDOM over fear! How to avoid a Facebook strike is a common topic in COVID denial groups on Facebook, particularly those in the anti-vaxx and anti-mask wings.

Warning Labels Return

As part of the platform’s Community Standards Enforcement Report, Facebook announced that between April and June 2021, it removed 20 million posts that contained COVID-19 misinformation.[62] The platform also said that warning labels had been added to more than 190 million COVID-19-related posts.

Guy Rosen, a Vice President for Integrity at Facebook, wrote back in April 2020, “When people saw those warning labels, 95% of the time they did not go on to view the original content.”[63] Data to support this claim has not been made public. It is also worth noting that Rosen’s claim came early in the pandemic. Facebook recently reported that it has added warning labels to more than 190 million COVID-19-related posts since the start of the pandemic. By Facebook’s own metrics, those numbers suggest the strategy is not deterring the spread of misinformation, either.

Warning labels may work for the vaccine-hesitant, but they do nothing for the ideologically hardened activists who have been fighting against COVID-19 health and safety measures for over a year. For individuals already drawn into the expanding world of COVID denial Facebook groups, the warning labels on COVID misinformation seem to be about as successful as Parental Advisory stickers were at deterring kids from listening to hip-hop in the ’90s. Not only does a label appear to attract immediate attention to the labeled post, but it also fuels the COVID denialist victim complex. Members of these groups often angrily lash out at Facebook (on Facebook), claiming that the company “unfairly” targets them or that it is part of the conspiracy to keep information hidden.

Missing the Ecosystem for the Forest

As the Delta variant continues to infect the nation and the pandemic enters its fourth wave, the data suggests that COVID denial activity on Facebook is entering a second wave as well.

On August 18, 2021, Facebook responded to a Center for Countering Digital Hate (CCDH) report entitled “The Disinformation Dozen: Why Platforms Must Act on the Twelve Leading Online Anti-Vaxxers.” On the same day, at least ten new COVID denial groups were created on Facebook, according to IREHR data.

The CCDH study did a thorough job of looking at a sample of anti-vaxx activity in a brief time window. The report found that the twelve leading anti-vaxxers account for up to 73% of Facebook’s anti-vaxx content in the sample of groups and pages it studied.[64]

Monika Bickert, Vice President of Content Policy for Facebook, responded:

“Focusing on these 12 individuals misses the forest for the trees. We have worked closely with leading health organizations since January 2020 to identify and remove COVID-19 misinformation that could contribute to a risk of someone spreading or contracting the virus. Since the beginning of the pandemic across our entire platform, we have removed over 3,000 accounts, Pages and groups for repeatedly violating our rules against spreading COVID-19 and vaccine misinformation and removed more than 20 million pieces of content for breaking these rules.”[65]

Bickert correctly identified that by focusing on the “Disinformation Dozen,” the CCDH study misses the larger context of what is happening on the platform and thereby “misses the forest for the trees.” In many ways, however, Facebook’s defense of itself misses the ecosystem for the forest.

Firstly, the Disinformation Dozen accounts are only one part of the larger anti-vaxx problem on Facebook. Moreover, the anti-vaxx problem is only one part of the larger COVID denial picture. An entire networked ecosystem of groups, spanning a diversity of themes, has been built on the platform, promoting collective action around COVID denial and making it harder to get the pandemic under control.

Secondly, Bickert’s brag that Facebook “removed over 3,000 accounts, Pages and groups for repeatedly violating our rules against spreading COVID-19 and vaccine misinformation” is an unsubstantiated claim that turns out to be numerical sleight of hand. Lumping together individual accounts, groups, and pages, 3,000 is a tiny number. For context, last year Facebook reported over 221 million active users in the United States, along with millions of groups and pages. It is also worth noting that the statement does not indicate whether the 3,000 figure covers only the United States or is a worldwide total.

When it comes specifically to addressing the COVID denial problem on Facebook in the United States, there is no indication of how that figure breaks down by components. For example, eliminating a few thousand groups and a few individuals would be more significant than kicking off a few thousand people and a handful of groups.

Whatever the breakdown, the overall number is a drop in the denialist bucket. Last year, IREHR identified 1,186 COVID denial Facebook groups alone, with a combined membership of over three million people. IREHR’s list has been publicly available on the organization’s website as far back as April 2020, and the data has been available to the company. Yet as of August 2021, 822 of the 1,186 groups IREHR identified in August 2020, or sixty-nine percent, are still active on Facebook. Additionally, another 910 COVID denial Facebook groups were created in the last year, 269 of them in August 2021 alone.

By this metric, very little of the COVID denial group activity appears to have been negatively impacted by Facebook’s misinformation efforts. Nor have Facebook’s recent attempts to respond to “inauthentic activity” (fake accounts, bots, and the like spreading disinformation) impacted the activity or growth of these groups. If the efforts are taken at face value, then COVID denial groups are (a) filled with “authentic” people, not bots or some sort of astroturfing operation, and (b) apparently acceptable under the platform’s policies (though a lack of transparency makes it impossible to know).


NOTES

[41] Lomas, Natasha. “How Facebook has reacted since the data misuse scandal broke.” Tech Crunch. April 10, 2018. https://techcrunch.com/2018/04/10/how-facebook-has-reacted-since-the-data-misuse-scandal-broke/.

[42] Facebook. “Harness the Power of Groups to Build Community.” Facebook for Business website. Undated. https://www.facebook.com/business/learn/lessons/use-groups-build-community.

[43] Zuckerberg, Mark. “Building Global Community.” Facebook. July 17, 2017. https://www.facebook.com/notes/3707971095882612/.

[44] Cox, Joseph. “These Are Facebook’s Policies for Moderating White Supremacy and Hate.” Motherboard. May 29, 2018.  https://www.vice.com/en/article/mbk7ky/leaked-facebook-neo-nazi-policies-white-supremacy-nationalism-separatism.

[45] Hatmaker, Taylor. “Facebook’s policy on white supremacy plays right into a racist agenda.” Tech Crunch. May 29, 2018. https://techcrunch.com/2018/05/29/facebooks-white-nationalism-white-supremacy-policy-motherboard/.

[46] Hatmaker, Taylor. “Facebook is the recruiting tool of choice for far-right group the Proud Boys.” Tech Crunch. August 10, 2018. https://techcrunch.com/2018/08/10/proud-boys-facebook-mcinnes/.

[47] Hatmaker, Taylor. “Facebook bans the Proud Boys, cutting the group off from its main recruitment platform.” Tech Crunch. October 30, 2018. https://techcrunch.com/2018/10/30/facebook-proud-boys-mcinnes-kicked-off/.

[48] Patel, Faiza and Dwyer, Mary Pat. “Facebook’s New Dangerous Individuals and Organizations Policy Brings More Questions Than Answers.” Just Security website. July 20, 2021. https://www.justsecurity.org/77503/facebooks-new-dangerous-individuals-and-organizations-policy-brings-more-questions-than-answers/.

[49] Wong, Julia Carrie. “Facebook restricts more than 10,000 QAnon and US militia groups.” The Guardian. August 19, 2020. https://www.theguardian.com/us-news/2020/aug/19/facebook-qanon-us-militia-groups-restrictions.

[50] Frenkel, Sheera. “QAnon is still spreading on Facebook, despite a ban.” The New York Times. December 18, 2020. https://www.nytimes.com/2020/12/18/technology/qanon-is-still-spreading-on-facebook-despite-a-ban.html.

[51] Bell, K. “Facebook has removed more than 6,500 militia groups and pages.” Engadget. October 1, 2020. https://www.engadget.com/facebook-removed-6500-militia-groups-pages-002343444.html.

[52] Bickert, Monika. “Combatting Vaccine Misinformation.” Facebook Newsroom. March 7, 2019, updated September 4, 2019. https://about.fb.com/news/2019/03/combatting-vaccine-misinformation/.

[53] Ibid.

[54] Clegg, Nick. “Combating COVID-19 Misinformation Across Our Apps.” Facebook Newsroom. March 25, 2020. https://about.fb.com/news/2020/03/combating-covid-19-misinformation/.

[55] Ibid.

[56] Facebook. “Where we have Fact-Checking.” Facebook Journalism Project. Undated. https://www.facebook.com/journalismproject/programs/third-party-fact-checking/partner-map.

[57] Rosen, Guy. “An Update on Our Work to Keep People Informed and Limit Misinformation About COVID-19.” Facebook Newsroom. April 16, 2020, updated May 26, 2021. https://about.fb.com/news/2020/04/covid-19-misinfo-update/.

[58] Ibid.

[59] Zuckerberg, Mark. “A Privacy-Focused Vision for Social Networking.” Facebook. https://www.facebook.com/notes/2420600258234172/.

[60] Jankowicz, Nina and Otis, Cindy. “Facebook Groups Are Destroying America.” WIRED. June 17, 2020. https://www.wired.com/story/facebook-groups-are-destroying-america/.

[61] Kukura, Joe. “Far-Right Boogaloo Movement Evaded Facebook Ban In Pretty Much One Single Day.” SFist. August 12, 2020. https://sfist.com/2020/08/12/untitled-far-right-boogaloo-movement-evaded-facebook-ban-in-pretty-much-one-single-day/.

[62] Nix, Naomi and Wagner, Kurt. “Facebook Removed 20 Million Pieces of Covid-19 Misinformation.” Bloomberg News. August 18, 2021. https://www.bloomberg.com/news/articles/2021-08-18/facebook-removed-20-million-pieces-of-covid-19-misinformation.

[63] Rosen, Guy. “An Update on Our Work to Keep People Informed and Limit Misinformation About COVID-19.” Facebook Newsroom. April 16, 2020, updated May 26, 2021. https://about.fb.com/news/2020/04/covid-19-misinfo-update/.

[64] “We collected this sample by analyzing anti-vaccine posts containing URL links from 10 private and 20 public anti-vaccine Facebook Groups between 1 February and 16 March 2021. Groups in this sample have between 2,500 and 235,000 members and generate up to 10,000 posts per month.” Center for Countering Digital Hate. “The Disinformation Dozen: Why Platforms Must Act on Twelve Leading Online Anti-Vaxxers.” March 24, 2021. https://252f2edd-1c8b-49f5-9bb2-cb57bb47e4ba.filesusr.com/ugd/f4d9b9_b7cedc0553604720b7137f8663366ee5.pdf.

[65] Bickert, Monika. “How We’re Taking Action Against Vaccine Misinformation Superspreaders.” Facebook Newsroom. August 18, 2021. https://about.fb.com/news/2021/08/taking-action-against-vaccine-misinformation-superspreaders/.

 
