How China spreads misinformation around the world
Early last year when U.S. health experts looked into the possibility that COVID-19 leaked from a lab in China — either accidentally or on purpose — Chinese propaganda-makers responded.
They amplified rumors that the virus came from the United States, specifically from an Army research lab called Fort Detrick in Maryland. And Chinese state media has succeeded in spreading that claim with the help of search engines like Google and YouTube.
Fort Detrick, a biodefense research lab about 60 miles from Washington, D.C., has long been connected to conspiracy theories, says Bret Schafer, a senior fellow at the Alliance for Securing Democracy. The Soviet KGB tied the lab to the AIDS pandemic in the 1980s.
“[Fort Detrick] has often played this sort of central role in conspiracy theories,” Schafer says. “But this one, of course, has tried to connect the origins of COVID-19 to the lab by essentially saying that the outbreak jumped from Fort Detrick to Wuhan brought over by members of the U.S. military.”
A Google search of Fort Detrick pulls up articles like one from a Chinese state media outlet called the Global Times titled “Why Fort Detrick lab should be investigated for global COVID-19 origins tracing.”
“Often what we see with searches, particularly conspiracy-related topics, is there is a bit of a void in the sort of search results that you will get,” he says, “because mainstream media outlets are not going to amplify sort of random crazy conspiracy theories. They may cover it once or twice, but then they move on.”
It’s a good thing that legitimate media outlets don’t amplify such rumors, he says, but it gives conspiracy theorists the advantage. Conspiracy theorists can post about the same topic over and over again so their perspective shows up when people search certain keywords.
Vloggers and citizen journalists will publish many videos on a single conspiracy theory on YouTube, while most reputable news outlets won't produce more than one or two pieces on the topic, he says.
“YouTube has always been a problematic space for conspiracy theories to flourish,” Schafer says. “Especially around Fort Detrick, where China has many, many, many global outlets with large follower numbers. And so they’ve really dominated YouTube search results.”
Part of Google’s success stems from giving users relevant results, he says. A search for “Major League Baseball playoffs,” for example, will return scores from last night’s games rather than results from 1985.
The algorithm ranks content based on how new it is, he says. NPR or The Washington Post may cover a conspiracy once, but Chinese state media can game the algorithm by posting newer content on the same topic and shoot to the top of the feed.
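The recency dynamic Schafer describes can be pictured with a toy scoring function. This is a hypothetical sketch, not Google's actual algorithm, which weighs hundreds of signals: a single authoritative article ages out of the top results while a steady stream of fresh posts on the same keywords keeps reclaiming the top slot.

```python
import math

def freshness_score(relevance: float, age_days: float, half_life: float = 7.0) -> float:
    """Toy ranking score: relevance decayed exponentially by article age.

    Hypothetical illustration only -- real search engines combine many
    signals, not just recency. The 7-day half-life is an assumption.
    """
    return relevance * math.exp(-age_days * math.log(2) / half_life)

# One high-quality article published 30 days ago...
mainstream = freshness_score(relevance=0.9, age_days=30)

# ...versus a lower-quality piece on the same keywords posted yesterday.
state_media = freshness_score(relevance=0.5, age_days=1)

print(mainstream < state_media)  # True: the newer, weaker piece ranks higher
```

Under a recency-weighted score like this, an outlet that keeps reposting on the same topic never ages out of the top slot, which is the pattern Schafer describes.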
“Without question, if you’re talking about the Chinese or the Russians, others, there’s an understanding of search engine optimization,” he says. “They know how to get their content in front of targeted audiences.”
Chinese state media isn’t manipulating the system, Schafer says, but rather working the algorithm to its advantage.
“They certainly have that understanding that if you repeat a lie enough from enough different sources, you’re just more likely to have that show up in front of the audience who you want to see it,” he says.
The biggest success of conspiracy spreaders is making it harder for people to figure out what’s true — and whether truth exists at all, Schafer says.
Most Americans aren’t buying into the Fort Detrick conspiracy theory, but in the last two weeks a new narrative emerged claiming the virus was brought to Wuhan by frozen lobster imported from Maine, he says.
“These are ridiculous on the surface,” he says. “But if you push enough of these sort of competing narratives out there, people just become confused and sort of overwhelmed by the amount of competing stories and competing narratives that the truth becomes less attainable.”
Americans might not believe this theory, but Fort Detrick has been a top search result at times in China, Schafer says.
To make sure credible information surfaces above rumors, search engines like Google adjust their algorithms around topics that often sit at the center of conspiracy theories, such as 9/11 or the moon landing, he says. On topics like the Holocaust, for example, the algorithm ranks credible information higher than it normally would.
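One way to picture that kind of adjustment, again as a hypothetical sketch rather than any search engine's real implementation, is an extra authority weight applied only when a query matches a list of conspiracy-prone topics:

```python
# Illustrative topic list -- not an actual search engine configuration.
SENSITIVE_TOPICS = {"holocaust", "9/11", "moon landing"}

def adjusted_score(base_score: float, source_authority: float, query: str) -> float:
    """Boost authoritative sources when the query touches a sensitive topic.

    Hypothetical sketch: for sensitive queries, a source's authority
    (0.0-1.0) heavily weights the score; otherwise only relevance counts.
    """
    if any(topic in query.lower() for topic in SENSITIVE_TOPICS):
        return base_score * (0.3 + 0.7 * source_authority)
    return base_score

# On a sensitive query, a low-authority site drops below a credible one
# even when its raw relevance score was higher.
fringe = adjusted_score(0.8, source_authority=0.1, query="moon landing hoax")
credible = adjusted_score(0.7, source_authority=0.95, query="moon landing hoax")
print(fringe < credible)  # True
```

The effect is what Schafer describes: on flagged topics, credibility outweighs raw relevance, so repetition alone can no longer win the top result.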
“If you’re talking about just the sort of normal, operational, run a business of a search engine,” he says, “there is not much they can do when there’s a lack of credible competing information to fill that void.”
Julia Corcoran produced and edited this interview for broadcast with Todd Mundt. Allison Hagan adapted it for the web.
This article was originally published on WBUR.org.
Copyright 2021 NPR. To see more, visit https://www.npr.org.