Twitter's Head Of Site Integrity On Fighting Election Disinformation
ARI SHAPIRO, HOST:
Intelligence officials have warned that Russia is interfering in the 2020 presidential campaign as it did four years ago. Back then, it used social media to spread disinformation and hoaxes. Yoel Roth is in charge of fighting these efforts on Twitter. He told me the company hasn't traced specific tweets about the 2020 campaign back to Russia, but he says Russia's tactics have also evolved.
YOEL ROTH: One of the main things that we saw in 2016 was the use of inauthentic personas. So these would be accounts that were pretending to be Americans to try and influence certain parts of the conversation. We've seen some indication that that remains part of the Russian playbook. And so they're trying to set up accounts that appear to be Americans or other people participating in political conversations to try and seem as though they are actually a member of the community that they're trying to impact.
SHAPIRO: And beyond candidates, these might be about gun rights or Black Lives Matter or LGBT rights or anything else that was controversial. At least that's what was documented in the Mueller report.
ROTH: Right. Those were some of the tactics that we saw most clearly in 2016, but I'd note that some of the tactics have evolved a little bit since then. For instance, in 2018, we saw activity that we believe to have been connected with the Russian Internet Research Agency that was specifically targeting journalists in an attempt to convince them that there had been large-scale activity on the platform that didn't actually happen.
SHAPIRO: Oh, so you're saying that in 2018, Russia didn't just interfere in the election. They tried to make it look like there was more interference in the election than there was just to undermine confidence.
ROTH: That's right. So on election night 2018, a website went up that was called IRAUSA. So it explicitly indicated that it was the Internet Research Agency in the United States.
SHAPIRO: And that's the so-called troll farm operated out of St. Petersburg, Russia.
ROTH: Yes. And this website's claim was a very simple one. They said, we have been setting up thousands of accounts, and we were interfering in the 2018 midterms the way that we interfered in 2016.
SHAPIRO: So that's what was new in 2018. I know it's early days still, but what's your best sense of what's new in 2020?
ROTH: I think in 2020, we're facing a particularly divisive political moment here in the United States, and attempts to capitalize on those divisions amongst Americans seem to be where malicious actors are headed. This is a similar pattern to what we saw in 2016 and 2018, but one of the things that we've seen from not only Russia but a wide range of malicious actors is an attempt to capitalize on some of the major domestic voices that are participating in these conversations and then double down on some of those activities.
SHAPIRO: And you're saying they don't have to create their own original messaging. They can just amplify some of the most extreme messaging that's coming out from real-life, authentic Americans.
ROTH: That's right.
SHAPIRO: I want to ask you about a slightly different topic. The distortion of reality in the election is not just coming from foreign actors. American political campaigns are also trying to use social media to their advantage. What are you seeing from 2020 candidates that raises flags for you?
ROTH: We've seen everything ranging from accounts that are pretending to be compromised - so they'll sort of pretend that they were hacked and then share content that they might otherwise not have, but it was actually a real person - to sort of large-scale attempts to mobilize volunteers to share content on the service.
SHAPIRO: I understand you've suspended dozens of pro-Bloomberg accounts for platform manipulation. These are real people that the campaign paid to promote the candidate. How do you know that you're not overreaching and suspending accounts of Americans who are expressing their real beliefs?
ROTH: The first thing I'd emphasize is that we don't know whether these accounts were, in fact, operated by the campaign or not. Our focus whenever we're enforcing our policies is to look at the behavior that the accounts are engaged in, not who we think is behind them or what their motivations were because, in most instances, we don't know. And so we enforced our policies against a number of accounts that were engaged in spam. These were accounts that were all tweeting the same thing at the same time. And so behaviorally, we were able to clearly identify that those accounts were in violation of our rules.
SHAPIRO: Are you saying that if I text all of my supporters and say, please tweet, Ari will make my life better, and a thousand people tweet, Ari will make my life better, that would be spamming, even though those are real people who might actually believe I'll make their life better?
ROTH: No. The behavior that you described - a lot of people getting together and organizing and mobilizing around a topic and sharing information about it - isn't by itself spam. But if each one of your supporters were to set up 10 different accounts just for the purpose of tweeting about how much they like you, that's something that would cross the line on some of our policies around coordinated manipulation.
And so the action that you saw around some of these campaign accounts was really that a single individual was operating multiple accounts with the intent of disseminating these messages. Individual people who are participating in a conversation in an organic way are, of course, free to use the service to advocate on behalf of whomever or whatever they choose. You just can't do it in a way that crosses the lines of coordinated manipulation and spam.
SHAPIRO: Yoel Roth is Twitter's head of site integrity.
Thank you for the conversation.
ROTH: Thank you.
SHAPIRO: And tomorrow, we'll hear about Twitter's new policy to flag misleading content. Transcript provided by NPR, Copyright NPR.