Updated 10:14 p.m. Monday ET
TikTok is toughening its stance against the QAnon conspiracy theory, expanding its ban to all content or accounts that promote videos advancing baseless ideas from the far-right online movement.
The action hardens the video-sharing app's previous enforcement against QAnon, which targeted specific hashtags that QAnon supporters have used to spread unfounded theories. Now, users who share QAnon-related content on TikTok will have their accounts deleted from the app.
"Content and accounts that promote QAnon violate our disinformation policy and we remove them from our platform," a TikTok spokesperson said in a statement to NPR. "We've also taken significant steps to make this content harder to find across search and hashtags by redirecting associated terms to our Community Guidelines."
TikTok's sweeping action against QAnon comes just as Facebook, Twitter, YouTube and other technology giants have announced bans on content from the Trump-supporting conspiracy theory. QAnon started in October 2017 and has amassed an enormous following online thanks largely to social media companies.
"There should be recognition of a thing that is good and significant, even if it's long overdue," said Angelo Carusone, president of the liberal nonprofit watchdog group Media Matters for America. "TikTok is recognizing that by the nature of the QAnon movement, you can't just get rid of their communities, the content itself is the problem."
Earlier this month, Media Matters identified more than a dozen hashtags TikTokkers used to spread QAnon conspiracy theories about President Trump's positive coronavirus test, false beliefs about Democratic presidential candidate Joe Biden and videos questioning the reality of the pandemic.
"We're talking about hundreds of millions of video views just for a limited segment of QAnon communities that we identified," Carusone said.
TikTok, which has 100 million monthly active users in the U.S., announced its expanded ban against QAnon quietly, in a statement to Media Matters, where it garnered little attention. A TikTok spokesperson confirmed the policy to NPR on Saturday. A company official said it has been TikTok's internal policy since August.
Hany Farid, a UC Berkeley computer science professor who is a member of TikTok's committee of outside content moderation experts, said there is tension within social networks over how to respond to misinformation without also amplifying the underlying theories.
"When you ban it, you give it credibility. You give it attention," Farid told NPR.
"But the movement got big enough and dangerous enough that people were looking at the landscape and saying, 'Yeah, this is completely out of control,' " he said. "Were they slow to do it? Probably. But platforms get criticized when they act too quickly. So there is a dilemma there."
TikTok uses a mix of artificial intelligence and thousands of human content moderators to try to curb troubling content. The Chinese-owned app is best-known for viral dance challenges and comedic performances.
According to TikTok's Community Guidelines, misinformation that "causes harm to individuals, our community or the larger public" is prohibited on the site, including medical misinformation, which QAnon has engaged in by pushing false notions about the deadly coronavirus.
Carusone of Media Matters said misinformation accounts on TikTok have been clever about avoiding detection by hijacking otherwise-benign hashtags, or creating new hashtags that are written slightly in code, among other strategies to evade efforts to curb the content.
"The test of this policy will be how much it affects the creation and germination of new QAnon content on TikTok," Carusone said. "If you know your video is going to be eliminated before it has a chance to spread, you're less likely to spend time polluting the TikTok pool."
The future of TikTok in the U.S. remains uncertain. A federal judge last month temporarily halted a Trump administration attempt to shut down the app. But a separate order from the White House for TikTok to divest from its Beijing owner or cease operations remains in place, with a deadline of Nov. 12 for TikTok to find an American buyer or close down its U.S. operations.
Trump officials cite national security concerns with TikTok's China-based corporate owner, ByteDance, but TikTok has long dismissed the effort as a crusade to score political points. The company says U.S. user data is controlled by an American-led team and that the Chinese government has never requested access to the data.
NOEL KING, HOST:
TikTok at its best is the joke dance videos. We all know that. But it is not all fun stuff. The platform is confronting a surge of misinformation and extremist content. So TikTok is learning from Facebook and Twitter, which have been dealing with this for a while. Here's NPR's Bobby Allyn.
BOBBY ALLYN, BYLINE: Early on in the pandemic, a guy in Brooklyn walked by a hospital and took out his phone. He opened TikTok and started filming. It was part of a disinformation campaign to frame the coronavirus as a hoax.
(SOUNDBITE OF TIKTOK VIDEO)
UNIDENTIFIED PERSON: Not much happening. And this is a hospital that serves thousands and thousands of people here in downtown Brooklyn, N.Y. There is no mass chaos out here, contrary to what the mainstream media is telling you.
ALLYN: He went by the username saynotosocialism on TikTok. Now, of course, believing that recording the outside of a hospital proves anything at all is absurd. But the Trump-supporting conspiracy theory QAnon endorsed the idea. And on TikTok, it caught on.
ANGELO CARUSONE: More people saw it. More people were going to film their hospital, which in turn was getting them video views and incentivizing more people to do it. So it was this incredible feedback loop.
ALLYN: That's Angelo Carusone, president of the left-leaning watchdog group Media Matters, which found more than a dozen QAnon hashtags on TikTok that together garnered hundreds of millions of views. TikTok has since banned all QAnon content. But Carusone says TikTok's algorithm rewards engagement. And hoaxes - they could be really engaging.
CARUSONE: The algorithm is sort of like the recommendation engines of YouTube but on steroids.
ALLYN: The power and reach of the app isn't lost on other fringe groups. Earlier this month, Brandon Caserta was arrested in connection with a plot to kidnap the governor of Michigan. Before that happened, though, he posted this TikTok, previewing his, quote, "recon plan."
(SOUNDBITE OF TIKTOK VIDEO)
BRANDON CASERTA: Guess what? I'm sick of being robbed and enslaved by the state, period. I'm sick of it. And these are the guys who are actually doing it.
ALLYN: TikTok took down Caserta's video and the earlier one of the hospital. TikTok says it will remove all accounts sharing hate and misinformation. But in reality, some extremists have learned how to outfox the app's defenses - both human and AI. The Anti-Defamation League's Dave Sifry found that white supremacists are using code language on TikTok to find new recruits.
DAVE SIFRY: The number one for the letter I or L. And so you might see n-4-z-l standing for Nazi.
ALLYN: Another common tactic among extremists on TikTok is attempting to hijack a trending topic.
SIFRY: They might use the hashtag #BlackLivesMatter as a way to actually promulgate some of their hateful ideologies to people who are searching for other kinds of videos.
ALLYN: But those who work with TikTok say it deserves credit for not repeating the mistakes of older social networks like Facebook and Twitter. TikTok has learned that a scattershot method of just taking down bad stuff as it appears doesn't work. The controls have to be proactive. Hany Farid is a UC Berkeley computer science professor who sits on a TikTok advisory panel.
HANY FARID: Ah, here's a piece of content. Ah, here's a group. Let's ban this. Let's not ban that. And there's no coherent policy. And I think TikTok has the advantage of coming to this game relatively late and seeing what has not worked with the other social media companies.
ALLYN: Unlike other social networks, Farid says TikTok sees itself as entertainment. And that's mostly what you see on the platform - dances, jokes, pranks, ridiculous dog videos. Hateful and violent videos are made, too, though TikTok is good at catching them. Still, Farid admits it's there if you look hard enough.
FARID: And I'm not excusing it. I'm not saying, well, you know, that's the price you pay. But, you know, just playing defense is hard. And no matter how good the algorithms get, no matter how many moderators you hire, there's going to be mistakes.
ALLYN: TikTok has twice as many U.S. users as Twitter. Those are a lot of people making a lot of videos. Mistakes are just an unavoidable result of trying to police them all. Bobby Allyn, NPR News, San Francisco.
(SOUNDBITE OF PENSEES'S "LAGUNA") Transcript provided by NPR, Copyright NPR.