
Misinformation about the vaccine could be worse than disinformation about the elections

A whole new brand of anti-vax

Anti-vaccination movements are not new to the online landscape, and tech platforms have long grappled with how to handle them. But false claims and conspiracies about Covid vaccines are already looking more difficult to police than the anti-vaccine content social media companies have dealt with in the past.

Part of the trouble is that there is limited data about the coronavirus vaccines, making some narratives harder to refute than claims about vaccines that have been around for years — such as that childhood shots cause autism, which repeated studies over years have proven to be untrue. Even debunking unfounded claims about the Covid vaccines involves explaining a vaccine that operates by a new mechanism.

Another is that the outbreak arrived at a time when enormous online communities distrustful of government had been growing.

Taken together, the scientific unknowns and political anxiety have mixed to produce a complex new breed of anti-vax.

Melanie Smith, head of analysis at Graphika, a social media analytics firm that tracks misinformation, said the fringe QAnon movement has gained influence with anti-vax communities online, boosting momentum and pushing unfounded claims about Covid vaccines into the mainstream.

“QAnon at its core is an anti-government conspiracy — and we are existing in a time where communication with governments is extremely important, particularly for public health — so you have QAnon turning its attention to vaccinations,” said Smith, who has been studying the intersection of vaccine misinformation and conspiracy theories since the pandemic began. (One of the most popular political conspiracies in the U.S. right now, she noted, is that the vaccines implant a microchip created by Bill Gates for citizen surveillance.)

“So not only do you have these conspiracy theories that QAnon pushes specifically about what the vaccine will do to people, but [it] also engenders this general distrust of the government and institutions that I think is particularly dangerous right now,” Smith said.

Other top Covid anti-vax theories include that the nation’s top infectious disease expert, Dr. Anthony Fauci, is directly profiting from Covid vaccines, and that those who choose not to get the shots will be denied food stamps, according to social media analytics firm NewsGuard, which has fact-checked and debunked the claims.

Such claims pose a greater threat than political disinformation around an election, which is usually targeted at one country and generally doesn’t bleed over into others, Smith said. The pandemic, on the other hand, is an all-encompassing, international danger.

“It’s global and we’re all facing the same crisis where trust in government is historically low, trust in public health institutions is historically low in a lot of countries — particularly Western democracies who are leading the vaccine rollout,” Smith said, adding that bodies like the United Nations and World Health Organization are being consistently undermined by conspiracies and disinformation.

Silicon Valley tries to keep up

Health experts have been warning about the need to prepare for vaccine misinformation since the early days of the pandemic, and the world’s largest social media companies say they are ready with policies and teams to confront it.

But the policies themselves are still evolving even as the first waves of Americans start to receive shots.

Facebook said in early December that it would soon begin removing coronavirus vaccine claims that have been proven false by public health experts. A spokesperson confirmed that Facebook has started implementing the policy but would not share details on the posts, or volume of content, it has already taken action on.

Twitter kicked off efforts on Monday to remove the most dangerous misinformation about Covid vaccines in particular and said it will begin labeling posts with potentially misleading claims about the shots early next year. (Twitter has said previously that it will not take action on every post containing disputed information about the virus, but coordinated conspiracies would be removed.)

Video platforms, which researchers fear are particularly vulnerable, were some of the earliest to institute Covid vaccine-specific policies. YouTube in early October expanded its medical misinformation rules to prohibit Covid vaccine claims that go against consensus from the World Health Organization or local health authorities, and said it has removed Covid anti-vax videos. TikTok introduced a policy almost a year ago prohibiting anti-vax misinformation broadly. The video-sharing app says it removes false information about vaccines and suspends accounts spreading such claims. Last week, TikTok added notices to hashtags like #covidvaccine directing viewers to authoritative information from local health agencies and the World Health Organization.

Even if the policies manage to keep up, experts and lawmakers say loopholes in the rules set so far, as well as inconsistent enforcement, are making the already gargantuan task of vaccine distribution even more precarious.

“The overall enforcement challenge is uniform enforcement,” said Graham Brookie, director of the Atlantic Council’s Digital Forensic Research Lab. He called social media companies’ response “incomplete” — further complicated by logistical challenges the pandemic has thrust on platforms like Facebook, which have had to rely more heavily on automation for content moderation. (That has not gone well.)

And one of the early tests of tech platforms’ ability to quash Covid vaccine misinformation does not bode well. In the first months of the pandemic, the “Plandemic” video — which peddled the baseless narrative that individuals involved in vaccine development were doing so for money and power — tore across social media platforms. As the sites scrambled to remove the video, it kept appearing elsewhere online, garnering millions of views across Facebook, YouTube, Twitter and other platforms and gaining traction with followers of the QAnon and anti-vaccination movements in particular. While the original video is now harder to find, users still echo its claims on social media platforms many months later.

The question now is whether the companies have upped their efforts enough to avoid a repeat situation.

“They’ve had a lot of time to perfect systems on this — on detecting, moderating and in some cases, removing, misinformation about vaccines specifically — which means that the expectations for deployment have to be extremely high,” said Brookie. “Because the stakes have never been higher.”

Rep. Adam Schiff (D-Calif.), who has been outspoken on anti-vaccine misinformation on Facebook, YouTube and Twitter since long before the current crisis, commended the companies for taking “substantial steps” to address the falsehoods but warned that platforms need to do more to prevent vaccine misinformation from circulating widely in “closed online groupings where researchers suggest most antivax content is shared and disseminated.”

Yet Schiff argued that the algorithms at the core of these platforms make it nearly impossible for them to get ahead of misinformation.

“Absent a fundamental overhaul, enforcement will always be stuck playing catch up in a system designed to promote the most engaging content and not the most truthful,” Schiff said.

Far-reaching social media posts about vaccine glitches present one of the biggest threats to the early stages of distribution, according to researchers. Anything that goes wrong — a bad batch of shots, for example, or a small number of adverse reactions — could explode on the internet, which experts say could hinder the country’s recovery and drive disinformation for years.

These are cases “that can online get blown way out of proportion and interfere with a larger effort to use a set of vaccines that are being closely examined by serious regulators and deemed safe and effective,” said Paul Barrett of New York University.

The marathon ahead

Sustaining vaccine misinformation policies throughout the lengthy rollout is also a major undertaking for the social media companies.

NewsGuard’s co-CEO, Steve Brill, said the drawn-out distribution of vaccines presents an even greater misinformation threat than the country saw during the presidential election: the race wrapped up within weeks, while it could take months or more before the country reaches herd immunity.

“You ain’t seen nothin’ yet,” he said in an interview, estimating the existing hoaxes would multiply tenfold by January. And as the world becomes more familiar with brand names like Pfizer and Moderna, offering “a really specific target to shoot at” in large corporations, new narratives will pick up steam, he added.

“The next three months are really going to be tough because you have the coming of the vaccine, overlaid against who’s making the decisions about who gets [it], overlaid against the fact that the pandemic is worse than it’s ever been,” Brill said. “Mix those three things together, and you just have a recipe for all kinds of distrust.”

Congress is taking notes

Congressional attention on content moderation issues remains high, and the deluge of Covid vaccine misinformation is likely to be a target early next Congress.

Schiff said in an email, “I plan to continue engaging directly with the companies” on the issue. The Congressional Task Force on Digital Citizenship proposed its own roadmap for Biden to fight the “infodemic.”

Rep. Michael Burgess (R-Texas) said it’s Congress that should be taking the lead. He raised alarm about the tech industry’s contested legal liability shield, known as Section 230, which lawmakers on both sides of the aisle have rallied to repeal or amend.

“One of the big loopholes is Section 230,” he said of platforms’ vaccine misinformation policies, calling on Congress, and his committee in particular, to examine the issue in the coming year. “Right now, social media companies really cannot be held liable for the spread of misinformation by third parties under certain interpretations of that law. They could make the knowing spread of misinformation part of their terms of use, but we have seen that they do not always apply their content moderation policies equally.”

Biden has expressed interest in curtailing misinformation on social media, going so far as to call for the repeal of Section 230 himself this past January. But it remains to be seen how he’ll respond to calls from congressional leaders and other experts to staff his administration with mis- and disinformation personnel, and just how far he’ll take a government crusade against the “infodemic.”

Barrett argued that the companies themselves have instituted strong policies around vaccine misinformation — they just need to put them into practice more stringently.

“People who get their content taken down will be upset,” he added, “but we’ve got a top-grade, public health crisis that’s underway, and everyone’s got to pitch in. Social media companies have to do the same.”




