Concerned about the number of unvaccinated COVID-19 patients showing up at his hospital, the French doctor logged on to Facebook and uploaded a video urging people to get vaccinated.
It was soon overrun with dozens, then hundreds, then more than 1,000 hate messages from an anti-vaccine extremist group known as V_V. The group, active in France and Italy, harassed doctors and public health officials, vandalized government offices and tried to disrupt vaccination clinics.
Alarmed by the abuse of its platform, Facebook removed several accounts linked to the group last December. But that hasn’t stopped V_V, which continues to use Facebook and other platforms and, like many anti-vaccine groups around the world, has expanded its portfolio to include climate change denial and anti-democratic messages.
“We are going to look for them at home; they do not get to sleep anymore,” reads one of the group’s posts. “Fight with us!” reads another.
The group’s largely unchecked attacks on the vaccine’s well-established health benefits highlight the limits of what a social media company can do to thwart even the most destructive kinds of misinformation, particularly without a sustained, aggressive effort.
Researchers at Reset, a UK-based nonprofit, identified more than 15,000 abusive or misinformation-laden Facebook posts from V_V, activity that peaked in the spring of 2022, months after the platform announced its actions against the organization. In a report on V_V’s activities, Reset researchers concluded that the group’s continued presence on Facebook raises “questions about the efficacy and consistency of Meta’s self-reported intervention.”
Facebook’s parent company noted in response that its 2021 actions were never aimed at removing all V_V content, but rather removing accounts found to be engaging in coordinated harassment. After The Associated Press notified Facebook of the group’s continued activities on its platform, it said it removed an additional 100 accounts this week.
Meta said it is trying to strike a balance between removing content from groups like V_V that clearly violates its rules against harassment and dangerous misinformation, and not silencing innocent users. That can be particularly difficult when it comes to the contentious topic of vaccines.
“This is a highly contentious space and our efforts continue: Since our initial dismantling, we have taken numerous measures against attempts by this network to come back,” a Meta spokesperson told the AP.
V_V is also active on Twitter, where Reset researchers found hundreds of accounts and thousands of posts from the group. Many of the accounts were created shortly after Facebook cracked down on the group last winter, Reset found.
In response to the Reset report, Twitter said it took enforcement action against several accounts linked to V_V, but did not detail those actions.
V_V has proven especially resistant to efforts to stop it. Named for the movie “V for Vendetta,” in which a lone, masked man seeks revenge against an authoritarian government, the group uses fake accounts to evade detection, often coordinating its messages and activities on platforms like Telegram that lack Facebook’s more aggressive moderation policies.
That adaptability is one reason the group has been difficult to stop, according to Jack Stubbs, a researcher at Graphika, a data analytics firm that has tracked V_V’s activities.
“They understand how the Internet works,” Stubbs said.
Graphika estimated the group’s membership to be 20,000 as of the end of 2021, with a smaller core of members involved in its online harassment efforts. In addition to Italy and France, the Graphika team found evidence that V_V is trying to create chapters in Spain, the UK, Ireland, Brazil and Germany, where a similar anti-government movement known as Querdenken is active.
Groups and movements like V_V and Querdenken have increasingly alarmed law enforcement and extremism researchers, who say there is evidence far-right groups are using skepticism about COVID-19 and vaccines to expand their reach.
Increasingly, these groups are moving from online bullying to real-world action.
For example, in April, V_V used Telegram to announce a €10,000 reward for vandals who spray-painted the group’s symbol (two red V’s in a circle) on public buildings or vaccination clinics. The group then used Telegram to spread photos of the vandalism.
A month before Facebook cracked down on V_V, Italian police raided the homes of 17 anti-vaccine activists who had used Telegram to threaten government, medical and media figures for their apparent support of COVID-19 restrictions.
Social media companies have struggled to respond to a wave of misinformation about vaccines since the start of the COVID-19 pandemic. Earlier this week, Facebook and Instagram suspended Children’s Health Defense, an influential anti-vaccine organization run by Robert F. Kennedy Jr.
One reason is the complicated balancing act between moderating harmful content and protecting free expression, according to New York University’s Joshua Tucker, who co-directs the NYU Center for Social Media and Politics and is a senior adviser at Kroll, a technology, governance and economics consulting firm.
Getting the balance right is especially important because social media has become a key source of news and information around the world. Leave too much bad content and users may be misinformed. Delete too much and users will start to distrust the platform.
“It’s dangerous for society that we’re moving in a direction where no one feels like they can trust the information,” Tucker said.