Last week, a political action committee called the American Principles Project unveiled a new video on Twitter falsely claiming that Democratic presidential nominee Joseph R. Biden Jr. supported sex changes for 8-year-olds.
Since Friday, a similar video has also appeared on Facebook as many as 100,000 times — primarily in Michigan, a swing state in the Nov. 3 election.
What has been harder to pinpoint is how widely the video has been spreading through text messages.
Though companies like Facebook and Twitter have developed tools for tracking and policing disinformation on their social networks, texting activity is largely a free-for-all that receives little scrutiny from tech companies and government regulators.
“There is no way to audit this,” said Jacob Gursky, a research associate at the University of Texas at Austin. “Organizations are just collecting cellphone numbers from data brokers and mass-texting people.”
The video circulated in Michigan, Wisconsin and Pennsylvania as part of a coordinated texting campaign, according to a study by researchers at the University of Texas at Austin. Over the weekend, it reached a reporter who covers online disinformation for the news site Protocol. The reporter had a Pennsylvania cellphone number.
The campaign twisted the meaning of Mr. Biden’s remarks at a recent town hall event, in which he condemned discrimination against children who identify as transgender but did not address sex changes. It was a high-profile example of increasingly widespread efforts to distribute disinformation through text messages.
“During a recent town hall, Joe Biden endorsed giving 8- to 10-year-olds sex change treatments,” the texts read. “This is way too extreme for me. I can’t support him.”
The texts tracked by Mr. Gursky and his fellow researchers said they were sent by the American Principles Project, but they referred to the organization only as “the APP PAC.” The texts purported to come from a “Democratic volunteer.”
The American Principles Project did not respond to a request for comment.
Data on texting campaigns is hard to come by. But Robokiller, a company that blocks automated phone calls and texts, said Americans received 2.6 billion political text messages in September, a 400 percent increase since June. The company estimated that since June, Republican-affiliated organizations have sent roughly six times more messages than their Democratic counterparts.
The Texas researchers said texting campaigns are in part a reaction to increased scrutiny on social media services. As Facebook and Twitter have pushed disinformation networks off their services, the networks have resurfaced on private messaging apps like Signal, Telegram and WhatsApp, where they can continue to operate without being monitored.
Private disinformation networks are prevalent in places like India and Mexico, the researchers said. But they are becoming more common in certain parts of the United States, such as southern Florida, where apps like WhatsApp are popular.
Facebook said on Tuesday that it had removed ads from both the Trump and Biden presidential campaigns that arguably could mislead voters in states where early voting has not started.
The ads were bought by the campaigns over the weekend, as part of a last-minute push to secure Facebook ads before the end of Monday. Facebook recently said it would not accept any new political ads in the week before Election Day, but would continue to run ads that had been bought ahead of time.
The Trump and Biden campaigns did not immediately respond to requests for comment.
Megan Clasen, a Biden campaign media adviser, tweeted that Facebook had told her office that it could not run ads that urged people to vote by saying that “Election Day is tomorrow” or “Election Day is today.” She then pointed to a similar ad by the Trump campaign that said, “Election Day is today.”
Several hours after journalists and Biden campaign officials contacted Facebook, the Trump campaign ad was removed. Facebook said the ads were misleading because they could be seen by voters in states where voting was not currently open.
“As we made clear in our public communications and directly to campaigns, we prohibit ads that say ‘Vote Today’ without additional context or clarity,” a Facebook spokesman said.
Facebook had previously said it would not fact-check political ads. But the company said it would remove advertisements that could mislead voters or provide incorrect information on how to vote.
This has been, by any measure, a bad year for consensus reality.
First, there was President Trump’s impeachment — a divisive and emotionally charged proceeding that unleashed a torrent of lies, exaggerations and viral innuendo.
Then came the Covid-19 pandemic — an even bigger opportunity for cranks, conspiracy theorists and wishful thinkers to divide us along epistemic lines, into those who believed the experts and those who preferred to “do their own research.”
The Black Lives Matter protests this summer were a feeding frenzy for those looking to distort and reframe the narrative about police violence and racial justice.
And while election years are always busy times for fact-checkers, Mr. Trump’s fusillade of falsehoods about voter fraud, Spygate and Hunter Biden’s emails this year has resulted in a bigger challenge for those charged with separating truth from fiction.
Zignal Labs, a firm that tracks online misinformation, analyzed which major news topics in 2020 were most likely to generate misinformation. Its data, which draws from sources including social media apps like Facebook, Twitter, Instagram and Reddit, as well as newspapers and broadcast TV transcripts, isn’t an exact accounting of every single piece of misinformation out there. But it’s a rough gauge of which topics are most frequently used as vehicles for misinformation, by those looking to inject confusion and chaos into media narratives.
(Quick methodological note: These “misinformation mentions” are limited to topics related to either the election or the Covid-19 pandemic, and are calculated by Zignal’s automated system based on the number of mentions of a given term along with a term that is frequently associated with misinformation. So, for example, a post that mentions vaccines in the context of Covid-19 would not be counted as a misinformation mention, but a post that mentions vaccines along with a hashtag like #FauciTheFraud or a name like Bill Gates — a frequent target of anti-vaccine activists — would be counted, even if the underlying story was debunking such a false claim.)
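The co-occurrence heuristic described in the note above can be sketched in a few lines of Python. This is an illustration only, not Zignal’s actual system; the topic and marker term lists here are invented examples:

```python
# Hypothetical sketch of a co-occurrence heuristic like the one described
# above: a post counts as a "misinformation mention" only when a tracked
# topic term appears alongside a term frequently associated with
# misinformation. Both term lists are illustrative, not Zignal's.
TOPICS = {"vaccine", "vote by mail"}
MISINFO_MARKERS = {"#faucithefraud", "bill gates", "plandemic"}

def is_misinfo_mention(post: str) -> bool:
    text = post.lower()
    has_topic = any(topic in text for topic in TOPICS)
    has_marker = any(marker in text for marker in MISINFO_MARKERS)
    # Only co-occurrence is checked, so a post debunking the false claim
    # would still be counted, as the methodological note points out.
    return has_topic and has_marker
```

Under this sketch, a post reading “The vaccine is a Bill Gates plot” would be flagged, while “The vaccine trial enrolled 30,000 volunteers” would not, even though both mention the tracked topic.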
The topic most likely to generate misinformation this year, according to Zignal, was an old standby: George Soros, the liberal financier who has featured prominently in right-wing conspiracy theories for years.
Out of 2.6 million total media mentions of Mr. Soros so far this year, nearly half (1.1 million) were accompanied by terms (“Soros-funded,” “bankroll”) that suggested that he played a role in funding left-wing agitators. They peaked this summer, as false claims that Mr. Soros had funded Black Lives Matter protests went viral following the killing of George Floyd.
Second on the list was Ukraine, which peaked as a misinformation topic in January and February, during Mr. Trump’s impeachment proceedings, often appearing alongside keywords like “deep state” and “WWG1WGA,” a shorthand used by followers of the QAnon conspiracy movement. About 34 percent of Ukraine’s 9.2 million total media mentions were flagged as misinformation-related.
Also high on the list was vote-by-mail, which has been the subject of a torrent of misinformation from Mr. Trump and right-wing media outlets. Roughly one out of every five vote-by-mail stories in 2020 has been misinformation, according to Zignal’s analysis, with terms like “fraud” and “scam” being common red flags.
With all three subjects, some of the most common spreaders of misinformation were right-wing news sites like Breitbart and The Gateway Pundit. YouTube also served as a major source of misinformation about these topics, according to Zignal.
Of course, the misinformation we’ve seen so far this year might pale in comparison to what happens after next week’s election, if a contested result or allegations of fraud set off a new wave of false or misleading claims. Social media platforms have signaled that they will remove premature claims of victory and attempts to delegitimize the election. But they also pledged to take down misinformation about Covid-19, and have had only mixed success in doing so.
Here are the topics that generated the highest percentage of misinformation narratives:
1. George Soros (45.7 percent misinformation mentions)
2. Ukraine (34.2 percent)
3. Bio Weapon (24.2 percent)
4. Vote by Mail (21.8 percent)
5. Antifa (19.4 percent)
6. Biden and Defund the Police (14.2 percent)
7. Hydroxychloroquine (9.2 percent)
8. Vaccine (8.2 percent)
9. Anthony Fauci (3.2 percent)
10. Masks (0.8 percent)
With a week to go before Election Day on Nov. 3, YouTube, like other social media firms, is girding for a test of its ability to keep misinformation and other problematic videos off its site.
In a blog post on Tuesday laying out its plans, the company said it would apply its standard approach of removing content that violates its policies, elevating videos from authoritative sources, and limiting the spread of so-called borderline content that tests the boundaries of those policies.
YouTube said it would be especially vigilant about content that encourages interference in the electoral process, such as videos inciting others to commit violent acts at polling stations or ones making false claims that mail-in ballots have been manipulated.
“Our teams have been working around the clock to make sure we have the systems and policies to prevent the abuse of our systems and provide access to authoritative information this election season,” wrote Leslie Miller, YouTube’s vice president for government affairs and public policy.
The election is a critical test of YouTube’s efforts to prevent the spread of dangerous conspiracy theories and hate speech on its platform. As the biggest repository of videos on the internet, YouTube has come under criticism in recent years for not doing enough to rein in the toxic content on its site while pushing viewers toward increasingly radical points of view.
In the days leading up to Nov. 3, YouTube’s home page will feature links to information about how and where to vote. As the polls close, YouTube will feature a playlist of live election results coverage from what it deems authoritative news sources. YouTube did not provide a full list of those sources but cited CNN and Fox News as examples.
Starting on the day of the election, YouTube said, it will place a so-called information panel above election-related search results and below videos discussing the election. The panel will warn viewers that results may not be final and offer a link to Google’s real-time election results feature, based on information from The Associated Press.
Local election officials, politicians and disinformation researchers continue to express concern about how misinformation about voting could disrupt Election Day next week. False and misleading information, research shows, has already been spreading widely.
The 2019 race for governor of Kentucky illustrates what can go wrong, as we explored in the latest episode of “Stressed Election.” In that race, the incumbent governor, Matt Bevin, a Republican, disputed the results when the vote tally showed him narrowly losing to his Democratic challenger, Andy Beshear.
Mr. Bevin and some of his allies argued, without showing any evidence, that there were voting irregularities and fraud, echoing some false and misleading statements made on social media. The governor initially refused to concede even though returns showed him trailing by about 5,000 votes. Mr. Bevin conceded about a week later.
The race offers some lessons about the power of disinformation in American elections:
1. Misinformation efforts don’t need to be sophisticated to be successful. In Kentucky, an account with just 19 followers sent out a tweet on election night that claimed to have “shredded a box of Republican ballots.” The tweet, sent as a joke by a college student, would eventually reach thousands.
2. Stopping the spread of misleading election information is not easy. Election officials noticed the false “shredded” tweet, which was retweeted by a few popular conservative accounts, and reported it to Twitter. The company removed the post within an hour, but screenshots of the post were retweeted by dozens of accounts, with retweets reaching well into the thousands. Tracking all of those screenshots proved difficult for both election officials and Twitter.
3. One piece of misinformation can beget much more. The sudden spread of the false tweet about shredding ballots seemed to be a green light for other claims. Some tweets started to question the accuracy of voter rolls in Kentucky; others wondered about “hackers” attacking the “cloud” where election results were stored, even though Kentucky elections use no such “cloud.” And baseless claims of voter fraud were rampant.
4. There are networks ready to amplify and spread misinformation. Some groups on Twitter spread countless conspiracies, be it the QAnon cabal conspiracy or an anti-mask conspiracy. These networks can quickly seize on a piece of conspiratorial misinformation and amplify and accelerate its spread, which is part of why a single tweet from an obscure account reached so many in Kentucky.
5. An extremely close election is particularly ripe for misinformation. Following election night in Kentucky, the brush fire of misinformation that was spreading online quickly took hold offline. Mr. Bevin’s supporters staged news conferences with baseless claims of fraud, and set up a robocall network telling people to “please report suspected voter fraud” to the state elections board. Online, the discussion had now moved far beyond a case of shredded ballots to accusations of a stolen or rigged election.
Twitter’s emphasis on up-to-the-second posts has made the site a must-visit destination for people to find the latest in news and current events. It has also made Twitter a vessel for the spread of false information.
To stem that tide, Twitter on Monday announced a new effort to preemptively debunk, or “prebunk” in Twitter parlance, some of the most commonly circulated false and misleading information about the election.
The company will, for the first time, pin information to the top of users’ timelines about how to vote, as well as a notice that voting results may not come immediately on Election Day — two common topics for misinformation across social media.
“We believe it’s critical that we make it easy for people to find that information,” said Nick Pacilio, a Twitter spokesman. “These prompts will alert people that they may encounter misinformation, and provide them with credible, factual information on the subject.”
The move is the latest in a series of actions taken by Twitter, Facebook and YouTube to place safeguards on their networks in the days leading up to Election Day. Lawmakers and the public harshly criticized the companies for allowing misinformation to spread ahead of the 2016 presidential election.
Facebook, which at three billion users is much larger than Twitter, has announced several changes in the past few months to stem misinformation about the election. It has started to pin facts about voting to the top of users’ timelines, added labels to posts that spread false voting information, placed a ban on new political advertising in the seven days before Election Day, and removed paid political ads entirely after the polls close.
Twitter has taken several steps, too. Last week, the company turned off some of the features that help tweets go viral faster. That includes adding an extra step to retweeting posts, and prompting users to avoid retweeting a post with a link to a news article if they had not already read the attached article.
The new pinned information will appear in the home timeline of every person with a Twitter account located within the United States, and will be available in 42 languages, beginning Monday.
The prompts will also appear in Twitter’s search bar when people search for related terms or hashtags. Each pinned alert will also link out to a collection of credible information on the subject — be it information on how to vote, or election returns — curated within a Twitter “moment” compiled from election experts, journalists and other authoritative sources of information.
In Thursday’s presidential debate, President Trump made several misleading claims about the business dealings of the family of his opponent, Joseph R. Biden Jr.
Mr. Trump suggested, without evidence, that Mr. Biden had consulted for his son Hunter Biden to help with the younger Biden’s business. Mr. Trump also said that Mr. Biden had used his influence during his time as vice president to help his son land lucrative business deals. Both claims were misleading.
But the comments nonetheless drew attention to Hunter Biden and his work, according to a New York Times analysis of Google searches and Facebook posts during and after the debate.
Searches for “Hunter Biden” on Google more than tripled during the debate compared with before the event, according to Google Trends data. Facebook posts about Hunter Biden also spiked, according to data from CrowdTangle, a social media analytics tool owned by Facebook.
Nearly 70,000 new Facebook posts popped up after the debate mentioning “false, unproven or misleading claims” about Hunter Biden’s business interactions, said Avaaz, a progressive human rights organization that studies misinformation. The majority of the posts came from Facebook pages that had been repeatedly flagged for sharing false or misleading claims, Avaaz said.
A Facebook spokeswoman said the company’s third-party fact checkers had assessed and debunked several claims related to Hunter Biden.
Mr. Trump’s comments at last month’s presidential debate also led to spikes in internet traffic. After he said that the Proud Boys, a far-right group that has endorsed violence, should “stand back and stand by,” searches for the group soared, as did posts about them on Twitter and Facebook.
Here at Daily Distortions, we try to debunk false and misleading information that has gone viral. We also want to give you a sense of how popular that misinformation is in the overall context of what is being discussed on social media. Each Friday, we will feature a list of the 10 most-engaged stories of the week in the United States, as ranked by NewsWhip, a firm that compiles social media performance data. (NewsWhip tracks the number of reactions, shares and comments each story receives on Facebook, along with shares on Pinterest and by a group of influential users on Twitter.) This week’s data runs from 9:01 a.m. on Friday, Oct. 16, until 9 a.m. on Friday, Oct. 23.
This week, as the presidential election approached, the most viral news on social media was, surprisingly, not directly related to the election.
Of the 10 most-engaged stories on our list this week, only three — two Fox News stories and an MSNBC story — were directly linked to the candidates. Two other stories that got lots of engagement were Pope Francis’ support for same-sex civil unions and the revelation that the parents of 545 children who had been separated from their families under the Trump administration’s family separation policy could not be found.
Here’s the full list:
1. NBC News: Lawyers say they can’t find the parents of 545 migrant children separated by Trump administration (2,702,695 interactions)
2. NBC News: Pope Francis calls for civil union laws for same-sex couples (1,008,956 interactions)
3. New York Times: Pope Francis, in Shift for Church, Voices Support for Same-Sex Civil Unions (870,066 interactions)
4. NPR: Parents Of 545 Children Separated At U.S.-Mexico Border Still Can’t Be Found (818,591 interactions)
5. CNN: Purdue Pharma to plead guilty to federal criminal charges related to opioid crisis (798,605 interactions)
6. Fox News: Source on alleged Hunter Biden email chain verifies messages about Chinese investment firm (709,918 interactions)
7. Fox News: 50 Cent says ‘vote for Trump’ in light of Biden’s tax plan: ‘IM OUT’ (695,310 interactions)
8. NBC News: Texas social workers can now turn away LGBTQ, disabled clients (650,672 interactions)
9. MSNBC: Admiral from bin Laden raid endorses Biden in dramatic fashion (627,050 interactions)
10. ComicBook.com: Michael B. Jordan to Produce Static Shock Movie for DC Comics (520,266 interactions)
A month before the 2016 presidential election, WikiLeaks released hacked emails from John Podesta, Hillary Clinton’s campaign chairman.
Last week, The New York Post published an article featuring emails from a laptop purportedly owned by Hunter Biden, the son of the Democratic presidential nominee, Joseph R. Biden Jr. The emails, about business dealings in Ukraine, have not been independently verified.
So how did cable news treat these two caches, which were both aimed at Democratic candidates during the heights of their presidential campaigns?
The answer: Fox News is giving more airtime to the unverified Hunter Biden emails than it did to the hacked emails from Mr. Podesta in 2016, according to an analysis from the Atlantic Council’s Digital Forensic Research Lab, which studies disinformation.
While Fox News’s mentions of the word “WikiLeaks” peaked at 198 seconds in a single day in mid-October 2016, the channel’s references to “Hunter” reached 273 seconds in one day last week, according to the analysis. Fox News did not respond to a request for comment.
In contrast, most viewers of CNN and MSNBC would not have heard much about the unconfirmed Hunter Biden emails, according to the analysis. CNN’s mentions of “Hunter” peaked at 20 seconds and MSNBC’s at 24 seconds one day last week.
CNN and MSNBC covered the WikiLeaks disclosures more, according to the study. Mentions of “WikiLeaks” peaked at 121 seconds on CNN in one day in October 2016 and 90 seconds on MSNBC in one day in the same period.
“In 2016, the WikiLeaks releases were a gigantic story, covered across the political spectrum,” said Emerson Brooking, a resident fellow at the Digital Forensic Research Lab, who worked on the report. “In 2020, the Hunter Biden leaks are a WikiLeaks-sized event crammed into one angry, intensely partisan corner” of cable news television.
As for online news outlets, 85 percent of the 1,000 most popular articles about the Hunter Biden emails were by right-leaning sites, according to the analysis. Those articles, which were shared 28 million times, came from The New York Post, Fox Business, Fox News and The Washington Times, among other outlets. The researchers did not have a comparative analysis for the WikiLeaks revelations.
President Trump has made his war on Big Tech a central piece of his re-election campaign. For months, he has accused Facebook and Twitter of attempting to rig the election by silencing criticism about his rival, former Vice President Joseph R. Biden Jr., and called for new regulations to rein in Silicon Valley giants.
But Mr. Trump is far from muzzled online. In fact, in recent weeks, he has widened his social media engagement lead over Mr. Biden.
In the past 30 days, Mr. Trump’s official Facebook page has gotten 130 million reactions, shares and comments, compared with 18 million for Mr. Biden’s page, according to data from CrowdTangle, a Facebook-owned data platform. That is significantly larger than the engagement gap for the preceding 30-day period, when Mr. Trump got 86 million interactions to Mr. Biden’s 10 million.
Mr. Trump trounced Mr. Biden on Instagram, too, getting 60 million likes and comments on his posts in the past 30 days, nearly twice as many as Mr. Biden’s 34 million. In the preceding 30-day period, Mr. Trump got 39 million likes and comments, while Mr. Biden got 13 million.
Mr. Trump also far outpaced Mr. Biden on YouTube, getting 207 million views on his videos in the last 30 days to Mr. Biden’s 29 million, according to SocialBlade, a data firm that tracks video performance. (SocialBlade’s data, which includes views on YouTube ads as well as unpaid videos, differs slightly from CrowdTangle’s Facebook and Instagram engagement data, which counts mostly engagement on unpaid posts.)
Social media performance is not a proxy for electoral success, of course, and Mr. Trump’s campaign would probably prefer to be leading in swing-state polls than on Facebook and YouTube. Engagement data also does not capture how many people view or click on posts, only how strong a reaction they elicit. And Facebook has argued that data about “reach” — the number of people who actually see a given post in their feeds — shows a more accurate picture of what is popular on the platform. (It does not, however, make this data publicly available.)
But it is useful to look at the president’s claims of partisan bias by tech companies in light of his sky-high engagement on those same companies’ platforms, because it hints at the nature of his complaints. His arguments are not the pleas of an underdog being silenced, but the threats of a star who wants to be allowed to keep his megaphone.
Some of the president’s posts in recent weeks have included misinformation about mail-in voting, dubious claims about Covid-19 and false and unproven allegations of corruption against Mr. Biden. Several of his posts have been taken down or had fact-checking labels applied to them. But these measures do not appear to have dented his account’s overall engagement.
The president’s strongest week on Facebook and Instagram came during his early October hospitalization for Covid-19, when well-wishers flooded his pages with supportive likes and comments. On YouTube, his best day came this week, when he took out a number of ads about accusations against Mr. Biden’s son Hunter, published by The New York Post. (The New York Times has not independently confirmed The Post’s reporting, and Mr. Biden’s campaign has dismissed the allegations as “Russian disinformation.”) Those ads performed well for Mr. Trump, and his channel got nearly 22 million views on Tuesday alone.
One bright spot for Mr. Biden is Twitter, where the former vice president has been performing well of late. According to Axios, which cited data from the media intelligence company Conviva, Mr. Biden has overtaken Mr. Trump in recent days when it comes to the average number of retweets and replies on his posts. (Per-post averages may be one social media contest that the president’s nonstop tweeting habit does not help him win.)
Another platform where Mr. Biden has beaten Mr. Trump? TV. His town hall on ABC last week got a bigger audience than Mr. Trump’s head-to-head NBC town hall, according to Nielsen.
And given Mr. Biden’s significantly smaller social media audience, he is punching above his weight. His Facebook page’s “interaction rate” — a measure of engagement that takes into account how many followers an account has — is currently more than twice as high as Mr. Trump’s.
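The “interaction rate” comparison above is simple arithmetic: total interactions divided by audience size. A minimal sketch follows; the interaction totals echo the CrowdTangle figures cited earlier in this item, but the follower counts are invented for illustration and are not the campaigns’ real numbers:

```python
# Illustrative "interaction rate" calculation: engagement normalized by
# audience size, similar in spirit to the CrowdTangle metric mentioned
# above. The follower counts below are made up for the example.
def interaction_rate(interactions: int, followers: int) -> float:
    return interactions / followers

# A page with far fewer raw interactions can still post the higher rate
# if its audience is small enough.
smaller_page = interaction_rate(18_000_000, 3_000_000)    # 6.0 per follower
larger_page = interaction_rate(130_000_000, 32_000_000)   # about 4.06 per follower
```

This is why a normalized rate can favor the account with the smaller audience even when the raw engagement gap looks lopsided.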
QAnon conspiracy theory videos on YouTube. Homespun “remedies” for the coronavirus sent via text messages on WhatsApp. Socialist and communist memes on Twitter. Anti-Black Lives Matter posts on Facebook.
The universe of misinformation is not just widespread and vast. It is also bilingual.
For several months, researchers and Democrats have worried increasingly about misinformation in Spanish being spread through social media, talk radio and print publications that target Latino voters.
The problem has been particularly acute in South Florida, where a worrying loop of misinformation has traveled from social media to mainstream outlets and back again.
Some of the most insidious messages have tried to pit Latinos against supporters of Black Lives Matter, by using racist language and tropes. But the distortions hardly stop there.
Other news outlets have reported on the phenomenon in recent weeks, and taken together, the reports paint a picture of just how deeply and widely the misinformation has spread.
Last month, Politico published an article examining efforts to paint the billionaire Democratic fund-raiser George Soros as the director of “deep state” operations and exploring anti-Black and anti-Semitic efforts that have spread across Spanish-language channels in the Miami area. A local Univision station soon followed with its own article.
A Florida public radio station found that conservative elected officials in Colombia were also helping to push the false idea that Joseph R. Biden Jr. is a clone of left-wing dictators in Latin America, such as Hugo Chávez.
This week, an article in the Boston Globe looked at how the spread of misinformation has driven a wedge between many younger Latino voters and their parents.
It is still too early to tell just what impact, if any, the misinformation is having on who shows up to the polls and who they vote for. But many experts worry that the efforts will only increase in the final days of the campaign, in an attempt to suppress the votes of some Latinos. Understanding how the misinformation spreads in any language could prove key in interpreting the election’s results.
Most people know TikTok for its short-form viral videos, like break-dancing stars or relaxing cooking channels. But TikTok also has a less-publicized darker side — one where Holocaust deniers and QAnon conspiracy theorists run rampant.
This week, the company announced a series of policy changes restricting the types of content it would allow, including a crackdown on QAnon supporters and a prohibition of “coded” language that could serve to normalize hate speech across TikTok.
“These guidelines reflect our values, and they make clear that hateful ideologies are incompatible with the inclusive and supportive community that our platform provides,” TikTok said in a corporate blog post on Wednesday. The approach will target not only outright hate speech and Nazi paraphernalia but also less obvious references to white supremacist groups.
The changes expand on TikTok’s existing policies, which had long banned certain forms of hate speech and direct references to Nazism and white supremacy.
The company now, for instance, also bans “coded language and symbols that can normalize hateful speech and behavior.” Some examples include numbers, code words or visual cues that are widely seen as signals to white supremacist groups.
Earlier this week, TikTok announced a wider ban of posts and users related to QAnon, the pro-Trump conspiracy theory, which included expanding a ban on hashtags related to the digital movement.
TikTok’s changes follow in the footsteps of its larger and more popular contemporaries. Over the past month, Facebook and Twitter have each introduced a series of changes to policies on what types of speech are allowed on their services.
Together, the changes represent a retreat from these companies’ long-held embrace of unfettered free speech. In the past, Twitter employees referred to their company as “the free speech wing of the free speech party,” erring on the side of leaving all forms of objectionable content up on its site. That position has waned over the past two years, and especially in the past few months, with the company adding labels and in some cases taking down tweets entirely when they pose a threat to public safety.
It is a distinct reversal for Mark Zuckerberg, chief executive of Facebook, in particular. One year ago, Mr. Zuckerberg championed mostly unfettered free speech on Facebook in a full-throated defense of his content policies in an address at Georgetown.
His views have changed abruptly. Over the last month, Facebook has banned buying advertising that supports anti-vaccination theories, further cracked down on QAnon’s presence and outlawed all forms of Holocaust denial on the platform. All three had been positions Mr. Zuckerberg defended, saying that while he might not personally agree with such views, they would still be allowed on the site.
TikTok used its announcement on Wednesday to take a thinly veiled swipe at Mr. Zuckerberg’s about-face.
“We’re proud that we have already taken steps to keep our community safe, for example, by not permitting content that denies the Holocaust and other violent tragedies,” TikTok wrote.
Mr. Zuckerberg has personally spoken out against Chinese-backed companies and TikTok in particular, a start-up that also happens to be a threat to his business. President Trump has made similar arguments about TikTok, saying it posed a national security threat, and moved to ban the app in the United States. That fight may also be defused by a potential sale of TikTok’s business to Oracle, though the deal is not yet complete.
For years, it was the subject of countless Fox News segments, talk radio rants, and viral right-wing tweets and Facebook posts. It spawned congressional hearings, Justice Department investigations, and investigations of those investigations. President Trump called it “the biggest political crime in the history of our country,” and suggested that its perpetrators deserved 50-year prison sentences.
Now, weeks before the election, “Spygate” — a labyrinthine conspiracy theory involving unproven allegations about a clandestine Democratic plot to spy on Mr. Trump’s 2016 campaign — appears to be losing steam.
The theory still commands plenty of attention inside the right-wing media sphere. But Mr. Trump’s quest to turn Spygate into a major mainstream issue in this year’s campaign may be coming up short. Data from NewsWhip, a firm that tracks social media performance, shows that stories about Spygate and two related keywords — “Obamagate” and “unmask/unmasked/unmasking” — received 1.5 million interactions on Facebook and from influential Twitter accounts last month, down from about 20 million interactions in May.
Part of Spygate’s fizzle may be related to the fact that three years on, none of Mr. Trump’s political enemies have been charged with crimes. Last year, a highly anticipated Justice Department inspector general’s report found no evidence of a politicized plot to spy on the Trump campaign — angering believers who thought the report would vindicate their belief in a criminal “deep state” plot against the president.
And this fall, the Spygate faithful had insult added to injury when a Justice Department investigation into one of their core concerns — whether Obama-era officials had acted improperly by “unmasking” the identities of certain people named in intelligence documents — came up empty-handed.
Few right-wing narratives have been as durable as Spygate, which has morphed over time into a kind of catchall theory encompassing various allegations of Democratic malfeasance. Fox News hosts including Sean Hannity, Laura Ingraham and Tucker Carlson went all in on it, as did Republicans in Congress, including Representative Devin Nunes of California and former Representative Trey Gowdy of South Carolina. But nobody embraced the theory like Mr. Trump, who has returned to it frequently to deflect attention from his own troubles, whether it was the Mueller investigation or his administration’s response to the Covid-19 pandemic.
As the election approaches, it’s worth looking back on Spygate’s evolution, both because it illustrates the way that partisan misinformation bubbles up through the right-wing media ecosystem, and, ultimately, because it shows how Mr. Trump’s obsession with a confusing, hard-to-follow narrative may have backfired as a campaign strategy.
Here is a (very) abridged version of the main waypoints in Spygate.
March 2017: Right-wing blogs and media outlets began discussing theories they called “DeepStateGate” or “Obamagate,” a reference to false claims that President Obama had tapped Mr. Trump’s phone.
May 2018: Mr. Trump seized on the news that an F.B.I. informant was sent to meet with members of his campaign staff, dubbing it “Spygate,” and said that it “could be one of the biggest political scandals in history.” Pro-Trump media outlets ran with the unsubstantiated claims. Top-ranking Republicans initially tried to distance themselves from the theory, although many would later embrace it.
SPYGATE could be one of the biggest political scandals in history!
— Donald J. Trump (@realDonaldTrump) May 23, 2018
April 2019: Spygate gained momentum when William P. Barr, the attorney general, testified to Congress that he believed “spying did occur” on Mr. Trump’s 2016 campaign, appearing to contradict previous Justice Department statements.
December 2019: Michael Horowitz, the Justice Department’s inspector general, released a long-awaited report detailing his findings about the origins and conduct of the F.B.I.’s Russia investigation. Mr. Trump’s media allies spent weeks hyping the report. (Sean Hannity predicted it would “shock the conscience.”) Followers of the QAnon conspiracy theory also latched onto the Horowitz report, predicting that it would set in motion indictments and mass arrests of the president’s enemies.
But the Horowitz report did not deliver a knockout punch. It revealed errors and lapses in some F.B.I. actions, but found no evidence of political bias in the F.B.I.’s Russia investigation, and rejected Mr. Trump’s suggestion that there was an organized Democratic conspiracy against him.
May 2020: As the country reeled from the Covid-19 pandemic, two developments brought Spygate (which had since been rebranded as “Obamagate”) back onto the national stage. First, the Justice Department dropped its criminal case against the former national security adviser Michael T. Flynn, a central figure in Spygate, who had pleaded guilty to lying to the F.B.I. about his conversations with a Russian diplomat.
Then, days later, a list of Obama administration officials who might have tried to “unmask” Mr. Flynn was declassified and released by Richard Grenell, the acting director of national intelligence. (“Unmasking,” in intelligence parlance, refers to a process by which officials can seek to reveal the identity of individuals who are referred to anonymously in intelligence documents. Unmasking is common, and such requests are made thousands of times a year.) Those named on the list included former Vice President Joseph R. Biden Jr., giving new fuel to Mr. Trump’s attempt to paint himself as the victim of a partisan conspiracy.
This was, in many ways, the closest that Spygate came to escaping the right-wing media ecosystem. Fox News devoted hours to the theory, which received more airtime than the coronavirus on some days. Mainstream news organizations tried to make sense of the theory, and Mr. Trump himself seemed obsessed with it, even though he often struggled to describe what the conspiracy actually was. In a flurry of more than 100 tweets sent on May 10, Mother’s Day, Mr. Trump raged about Obamagate, and repeated many of the debunked allegations about Obama-era misconduct, Mr. Flynn, and the Russia investigation.
By this point, many Trump supporters had pinned their hopes on two government reports, which they hoped would soon blow the entire scandal wide open.
The first was a sweeping investigation led by John Durham, the U.S. attorney from Connecticut who was tapped by Mr. Barr to look into the origins of the F.B.I.’s Russia probe.
The second was a smaller piece of the Durham investigation led by John Bash, a U.S. attorney Mr. Barr appointed to look into whether Obama-era officials had improperly “unmasked” Mr. Flynn and others.
October 2020: With less than a month to go before the election, Spygate/Obamagate continued to unravel. Mr. Barr told Republican lawmakers that Mr. Durham’s report would most likely not arrive before the election. And the unmasking investigation led by Mr. Bash, which many Spygate aficionados believed would lead to indictments and arrests of top Democrats, instead ended with no findings of irregularities or substantive wrongdoing.
Still, for Mr. Trump, hope springs eternal. He has continued his crusade, calling Spygate a “treasonous act” that should disqualify Mr. Biden from the presidency.
Obama, Biden, Crooked Hillary and many others got caught in a Treasonous Act of Spying and Government Overthrow, a Criminal Act. How is Biden now allowed to run for President?
— Donald J. Trump (@realDonaldTrump) October 8, 2020
WASHINGTON — When some viewers in Arkansas tuned in to their local television news station last week, they found a surprising report: President Trump had defeated Joseph R. Biden Jr. in the state — three weeks before Election Day.
KNWA, the NBC affiliate serving northwest Arkansas and the Arkansas River Valley, said it was all a mistake. The station had been working on its election-night graphics and mistakenly broadcast fabricated results on a banner at the bottom of the screen during its 5 p.m. local newscast.
In an email, Lisa Kelsey, the vice president and general manager of KNWA and other stations in the area, said the slip-up was inadvertent and only a local issue.
A producer activated the wrong control, which displayed “a crawl of information about the election” for about a minute, she wrote, adding that no election results were yet available.
“We take this mistake very seriously and will ensure it doesn’t happen again,” Ms. Kelsey said in an email.
But the episode highlighted concerns about how news organizations report and characterize incomplete returns on election night and whether, by mistake or design, erroneous or misleading data could shape perceptions about who won before the outcome can be officially declared.
The issue has been a particular concern for Democrats, who fear that Mr. Trump’s statements about election fraud and his reluctance to commit to accepting the outcome could lead him to seize on early returns showing him with a lead to assert that the election is over.
Hi, Sue. Our team was working on our election graphics this afternoon and someone accidentally put the election scroll on TV instead of switching it to the news headlines scroll that we normally use during our show. I’m really sorry for the mistake.
— Chad Mira (@ChadMiraKNWA) October 13, 2020
A fast-growing network of nearly 1,300 websites is filling a void left by vanishing local newspapers across the country. But many of their stories are ordered up by conservative political groups and corporate P.R. firms, a Times investigation found.
We are publishing the names of those sites so readers can see whether the sites target their area.
We compiled the list with the help of Global Disinformation Index, an internet research group, which analyzed Google advertising and analytics data imprinted in the sites’ digital codes to find links between the sites. We then confirmed that sites belonged to the network by analyzing their layouts, bylines, privacy policies and “About” pages, as well as by interviewing employees and examining internal records of the companies behind the sites.
Columbia University’s Priyanjana Bengani tallied a similar number of websites in August.
The network is run under a web of companies, though it is largely overseen by Brian Timpone, a former TV reporter who has sought to capitalize on the decline of local news organizations for nearly two decades. Mr. Timpone did not respond to multiple requests for comment.
As a guide, the different segments of the network include nearly 1,000 local news sites under the Metric Media brand; more than 50 business news sites; 34 news sites in Illinois under the Local Government Information Services brand; and 11 legal-news sites owned by a U.S. Chamber of Commerce group.
Some of the sites are dormant, and we culled ones from our list that are now defunct. In the past, dormant sites have sprung to life when news hit the region they target, like what happened with the Kenosha Reporter site after protests broke out in Kenosha, Wis., over the police killing of an unarmed Black man there.
For months, public health experts — backed by guidelines from the World Health Organization and the Centers for Disease Control and Prevention — have stood firm on one resounding refrain: Against the coronavirus, masks work.
But on Saturday, Dr. Scott Atlas, one of President Trump’s most prominent science advisers, took to Twitter to say otherwise.
“Masks work? NO: LA, Miami, Hawaii, Alabama, France, Phlippnes, UK, Spain, Israel,” Dr. Atlas tweeted, rattling off a list of locations where masks had, in his view, failed to protect large swaths of the population.
The tweet was rapidly debunked by experts, who pointed to a wealth of evidence showing that face coverings reduce the risk that the coronavirus will hop from person to person. Masks, they’ve said, cut down on the amount of virus that is sprayed out of an infected person’s airway. They might also thwart inbound virus by loosely shielding the wearer’s nose and mouth.
Not long after, Dr. Atlas reshared his first tweet with a message that seemed to walk back his original statement: “Use masks for their intended purpose — when close to others especially hi risk,” he said. “Otherwise, social distance. No widespread mandates.”
On Sunday, Twitter removed Dr. Atlas’s first tweet, saying it violated the company’s policy against false or misleading information about the coronavirus that could lead to harm.
But the damage had already been done: The post had been retweeted at least 1,800 times, and generated over 7,300 likes and replies. The removal then set off a flurry of anti-mask posts, and accusations of tech censorship, across social media. On Facebook, several right-wing pages shared copies of the tweet, while a series of anti-mask and pro-Trump groups and pages claimed that Twitter was suppressing free speech.
Dr. Atlas, a radiologist with no background in infectious disease or public health, has come under heavy fire in recent months for his stances on the coronavirus, which has killed more than 219,000 Americans. Experts have widely criticized his views on lockdowns and mask mandates, which he has derided as unnecessary and even harmful in the fight to halt the pandemic.
Dr. Atlas has also promoted the controversial idea that herd immunity — the point at which a virus can no longer spread easily because enough people have contracted it — can be reached when only a small sliver of the community at large has been infected.
In his now-deleted Saturday tweet about masks, Dr. Atlas cast doubt on their usefulness, saying there was little evidence that they reduce disease transmission. As a send-off, he shared a link to an indictment of face coverings published on Friday by the American Institute for Economic Research, a libertarian think tank that recently sponsored a declaration arguing that the coronavirus should be allowed to spread among young, healthy people to expedite herd immunity.
Masks, like all other protective measures, cannot halt the coronavirus on their own. But experts consider the accessories a crucial part of the public health tool kit needed to combat the pandemic, alongside tactics such as physical distancing and widely available testing.
Here at Daily Distortions, we try to debunk false and misleading information that has gone viral. We also want to give you a sense of how popular that misinformation is, in the overall context of what is being discussed on social media. Each Friday, we will feature a list of the 10 most-engaged stories of the week in the United States, as ranked by NewsWhip, a firm that compiles social media performance data. (NewsWhip tracks the number of reactions, shares and comments each story receives on Facebook, along with shares on Pinterest and by a group of influential users on Twitter. This week’s data runs from 9:01 a.m. on Friday, Oct. 9, until 9 a.m. on Friday, Oct. 16.)
The most viral article on social media this week was one that social media companies tried to stop from going viral.
Facebook said it would reduce the visibility of an unsubstantiated New York Post article about Hunter Biden, the son of Joseph R. Biden Jr., until a third party could fact-check it. Twitter initially banned all links to the article, saying it made the move because the article contained images showing private personal information and because it viewed the article as a violation of its rules against distributing hacked material. But the article still traveled widely on social media, receiving more than two million interactions.
Here is the full list of the week’s most-engaged stories:
1. New York Post: Smoking-gun email reveals how Hunter Biden introduced Ukrainian businessman to VP dad (2,307,293 interactions)
2. ComicBook.com: Two and a Half Men Star Conchata Ferrell Dies at 77 (1,863,725 interactions)
An obituary for Ms. Ferrell, who played Berta, the housekeeper, on “Two and a Half Men,” was shared widely by the show’s many fans.
3. Fox News: Rep. Doug Collins introduces resolution to push for Pelosi removal as House speaker (1,109,988 interactions)
Mr. Collins’s resolution, which claimed that Representative Nancy Pelosi “does not have the mental fitness” to continue as House speaker, was a largely meaningless symbolic gesture of opposition. But it was red meat for conservatives on Facebook, for whom Ms. Pelosi is an engagement-bait villain.
4. CNBC: Facebook, Twitter make editorial decisions to limit distribution of story claiming to show ‘smoking gun’ emails related to Biden and his son (1,032,917 interactions)
5. ET Online: ‘Dexter’ Revival Starring Michael C. Hall Set at Showtime (960,226 interactions)
Another break from politics, this one about a planned revival of the hit TV show “Dexter,” got nearly a million interactions.
6. The Daily Wire: ‘Legendary’: Barrett Asked To Hold Up Notes She’s Using To Answer Questions. She Holds Up A Blank Notepad. (881,469 interactions)
Judge Amy Coney Barrett’s Supreme Court confirmation hearing was the subject of two Top 10 articles this week. This one, from the right-wing news site The Daily Wire, focused on her empty notepad.
7. Fox News: Judge Amy Coney Barrett to face Senate confirmation hearing (872,589 interactions)
8. Whitehouse.gov: Proclamation on Columbus Day, 2020 (861,279 interactions)
A White House proclamation about Columbus Day, which took aim at “radical activists” who “have sought to undermine Christopher Columbus’s legacy,” was widely shared by right-wing pages on Facebook and by groups like the National Italian American Foundation.
9. Fox News: Pelosi to announce bill on 25th Amendment after questioning Trump’s health (795,962 interactions)
10. The New York Times: California Republican Party Admits It Placed Misleading Ballot Boxes Around State (722,101 interactions)
A Times article about unofficial ballot boxes that Republican operatives placed in California was shared by several large left-wing Facebook pages, including Occupy Democrats and Ridin’ With Biden.
On Friday, President Trump tweeted a story from an unusual source: The Babylon Bee, a right-wing satire site that is often described as a conservative version of The Onion.
“Twitter Shuts Down Entire Network to Slow Spread of Negative Biden News,” read the story’s headline. The story was a joke, but it was unclear whether Mr. Trump knew that when he shared the link, with the comment “Wow, this has never been done in history.”
Twitter Shuts Down Entire Network To Slow Spread Of Negative Biden News https://t.co/JPmjOrKPcr via @TheBabylonBee Wow, this has never been done in history. This includes his really bad interview last night. Why is Twitter doing this. Bringing more attention to Sleepy Joe & Big T
— Donald J. Trump (@realDonaldTrump) October 16, 2020
Emma Goldberg, a reporter for The New York Times, recently profiled The Babylon Bee, and wrote about how the site’s satire is frequently mistaken for reality.
I chatted with Ms. Goldberg about her article, The Babylon Bee’s habit of skirting the line between misinformation and satire, and how it capitalizes on its audience’s confusion.
So, Emma, you wrote about The Babylon Bee, a satirical news site I’ve been fascinated by for a long time. It’s basically the right-wing version of The Onion, right?
Exactly. And what fascinated me in reporting this is that I’ve followed The Onion for a long time — but The Babylon Bee currently gets more traffic than them, at least according to their internal numbers.
That’s so interesting! (As an aside, I’m looking at some engagement data from Facebook now, and it’s telling me that The Babylon Bee has gotten about 45 million interactions with its Facebook page in the last year, compared with 35 million for The Onion.) Why do you think The Bee is doing so well?
Well, they certainly don’t pull any punches. Their mantra seems to be that everything is fair game: the left, the right, Trump. And in general, on the right, swiping at Trump is considered a red line, but The Bee doesn’t seem to care.
They’ve also tapped into a large audience of people who aren’t hard-line Trumpers, but are much more pissed off by the outrage that Trump generates on the left.
Right, sort of the anti-anti-Trump crowd. And the people who run the site, are they pro-Trump? What do they see themselves as doing, within the larger conservative movement?
They are ambivalent about their views on Trump, but they also proudly identify as Christian conservatives. But I noticed that their early coverage of Trump, back in 2016, was much more vitriolic than today’s. They called him a psychopath, or a megalomaniac. Now they’re more bemused by him and the ghoulish ways he’s described on the left.
But I think their willingness to swipe at him, even gently, gets at an important element for successful humor. What media scholar Brian Rosenwald told me is that the humor always has to come before the politics.
So this is a blog about distortions and misinformation, and one thing I’ve noticed recently is that a lot of The Babylon Bee’s most successful articles in terms of online engagement are the ones that are … less obviously satirical.
Totally. And that’s landed them in some hot water.
Like, one from the other day was called “NBA Players Wear Special Lace Collars to Honor Ruth Bader Ginsburg.”
People were sharing that thinking it was real.
They certainly play to that for virality — their best content is right on the reality-satire line.
I’m wondering the extent to which being a satire site — which makes them exempt from Facebook’s fact-checking program — has allowed them to traffic in misinformation under the guise of comedy. Do you think that’s a deliberate strategy?
Well, that’s a great question, because it’s been a big source of controversy for them. They’ve had a few articles that were fact-checked by Snopes and rated “false.” Which The Bee’s writers and editors claim prompted Facebook to threaten them with being demonetized (Facebook denies this). The Bee’s founder, Adam Ford, has claimed that Snopes fact-checked them in ways that were “egregious,” with standards that wouldn’t be applied to, for example, The Onion.
The Bee feels that they’re being targeted unfairly. But Snopes has poked at the fact that their pieces can sometimes be easily mistaken for real news — which might fall on them, not their readers.
Politics aside, it sort of speaks to the impossible nature of being a satirical site in the age of the mega-platform. Because on one hand, you’ve got to write things that are so obviously made up that they can’t reasonably be mistaken for real news, but also close enough to the truth to be funny.
One hundred percent. Truth is funnier than fiction these days.
One thing I’ve wondered is what the whole “owning the libs” media industrial complex (which I’d categorize The Bee as belonging to, even if they wouldn’t) will do if Trump loses in November. Do you get the sense that The Bee cares who wins the election, from the standpoint of comedic potential?
What’s funny is that because they aren’t Trump loyalists, they can see an advantage for their comedy either way. In some senses, comedy comes a lot easier when you’re not the party in power. But on the other hand, Trump is such an absurd figure that he can lend himself to some really wild caricatures. The editor in chief of The Bee told me Trump is great for comedy, so he’d be happy to see him win — a little later, he added that maybe they’re sick of Trump humor and ready for a change. They also see a lot of humor opportunity in the Biden camp, especially playing off the “Sleepy Joe” motif.
So what I’m taking from this conversation is: The Babylon Bee is not a covert disinformation operation disguised as a right-wing satire site, and is in fact trying to do comedy, but may inadvertently be spreading bad information when people take their stories too seriously?
For the most part. But they also seem to find it pretty funny when their content is mistaken for real news — and they’re not exactly going overboard to stop that.
In all the uproar over how tech companies have handled an unsubstantiated article about Hunter Biden from the New York Post, one major company has stood apart: YouTube.
It has said nothing. And what it has done, if anything, remains a mystery.
On Wednesday, the New York Post uploaded a one-minute, 17-second video highlighting the key points of the article to its YouTube channel, which has more than 430,000 subscribers. For most of that day, users who searched for “Hunter Biden” on YouTube saw the video at the top of the site’s “Top News” shelf. As of midday Thursday, the video had 100,000 views — a respectable figure but certainly not the stuff of viral videos.
In recent years, YouTube has made changes to its “recommendation algorithm” for what it calls borderline content — the types of videos that toe the line between what is acceptable on the platform and what it considers to violate its policies. As a result of those changes, YouTube limits such content from being recommended and keeps the videos from appearing prominently in search results or on its home page.
About 36 hours after the video was posted, YouTube said it would remain up without restriction. “Given the information currently available, content about this news story is allowed on YouTube. We will continue to evaluate content against our policies as new details emerge,” said Farshad Shadloo, a YouTube spokesman.
The response from YouTube stood in sharp contrast to the immediate and public reaction from Facebook and Twitter. Facebook said it would limit the distribution of the article on its platform so that third-party fact checkers could verify the claims. Twitter said it was blocking the article because it included people’s personal information, violating its privacy rules, and because the article violated its policy on hacked materials.
The Senate Judiciary Committee plans to subpoena Jack Dorsey, Twitter’s chief executive, to testify on Oct. 23 regarding the company’s decision to block the article. Mr. Dorsey, along with Mark Zuckerberg of Facebook and Sundar Pichai of Google, is also scheduled to testify on Oct. 28 about Section 230, the law that shields technology companies from being held liable for some of the content published by their users.
While the number of views on the New York Post video remained subdued, videos related to the article have done extremely well. A Fox Business interview with Stephen K. Bannon, a former White House adviser who played a role in the article, got more than 275,000 views. An interview on Fox News with Kayleigh McEnany, the White House press secretary, about getting locked out of her Twitter account after sharing the Post story garnered 795,000 views.
This week, President Trump exaggerated a position taken by the World Health Organization, saying that the agency had vindicated his derision of lockdowns during the coronavirus pandemic.
“The World Health Organization just admitted that I was right,” the president tweeted. “Lockdowns are killing countries all over the world. The cure cannot be worse than the problem itself.”
The World Health Organization just admitted that I was right. Lockdowns are killing countries all over the world. The cure cannot be worse than the problem itself. Open up your states, Democrat governors. Open up New York. A long battle, but they finally did the right thing!
— Donald J. Trump (@realDonaldTrump) October 12, 2020
Mr. Trump’s message was rapidly shared by thousands online, including the commentator Lou Dobbs and Representative Andy Biggs, Republican of Arizona, who echoed the president’s rallying cry to “open up” and described the closings as “pseudoscientific” and “tyrannical.”
Since the early days of the pandemic, the president has dismissed lockdowns as unnecessary and harmful, even while the virus continued to blaze across the nation.
Mr. Trump did not say which W.H.O. statement he was referring to. But one of the few published recent comments from a W.H.O. official about lockdowns came from David Nabarro, one of several envoys to the organization on Covid-19.
“We in the World Health Organization do not advocate lockdowns as the primary means of control of this virus,” Dr. Nabarro said earlier this month to the British magazine The Spectator. “The only time we believe a lockdown is justified is to buy you time to reorganize, regroup, rebalance your resources, protect your health workers who are exhausted. But by and large, we’d rather not do it.”
“We really do appeal to all world leaders, stop using lockdown as your primary method of control,” Dr. Nabarro said.
Dr. Nabarro described several potential tolls of widespread lockdowns, which have set off economic declines and higher unemployment rates, and have widened disparities in many parts of the world, including the United States.
Dr. Nabarro has also noted that lockdowns may be necessary under some circumstances. In addition, he has advocated for a multifaceted approach to curbing the spread of the coronavirus — a strategy he recently outlined in a written reflection that highlighted the importance of physical distancing, mask-wearing, accessible testing and contact tracing, among other measures, to pinpoint and suppress outbreaks.
In a statement, Hedinn Halldorsson, a spokesman for the W.H.O., reaffirmed that the pandemic needed to be addressed with such a “package” of protective tactics.
“W.H.O. has never advocated for national lockdowns as a primary means for controlling the virus,” he said. “Dr. Nabarro was repeating our advice to governments to ‘do it all.’”
Some countries, like New Zealand, used lockdowns to great success to tame their outbreaks. Others, like South Korea, were able to circumvent them by pushing hard on testing. All success stories, however, have one thing in common: swift action to acknowledge and beat back the virus.
Lockdowns are extreme, and inevitably come with costs, said Syra Madad, a public health expert and epidemiologist based in New York. But they can afford communities much-needed time to ready other methods of containment.
“Had the U.S. been better prepared and responded faster,” Dr. Madad said, perhaps “lockdowns could have been avoided.”