
Facebook And Twitter Remove Russia-Backed Accounts Targeting Left-Leaning Voters

Sep 1, 2020
Originally published on September 2, 2020 7:11 am

Facebook and Twitter said Tuesday that they had removed accounts linked to Russian state actors who tried to spread false stories about racial justice, the Democratic presidential campaign of Joe Biden and Kamala Harris and President Trump's policies.

Researchers who have examined the operation said it attempted to steer left-leaning voters away from the Biden-Harris campaign in a way that echoes the Russian disinformation tactics that sought to depress progressive and minority support for Hillary Clinton in 2016.

"Russian actors are trying harder and harder to hide who they are and being more and more deceptive to conceal their operations," said Facebook's Head of Cybersecurity Policy Nathaniel Gleicher in an interview with NPR. "But there was very little attention paid to this operation."

Facebook said the Russian agents set up a site posing as an independent news outlet and managed to recruit "unwitting freelance journalists" to write stories that were shared by dozens of social media accounts whose profile pictures appear to have been created through artificial intelligence.

On Facebook, the stories from the pseudo news site were posted to groups devoted to progressive causes.

"It was very much a strongly left-leaning constituency they were aiming at. It looks like that was audience-building," Ben Nimmo, head of investigations at research firm Graphika, which analyzed the operation, said in an interview with NPR. "But there were indeed pieces that said Biden and Harris are much too far to the right."

The Russian operatives, according to Facebook, primarily used a website called PeaceData. It billed itself as a news site that aimed to shed light on corruption, abuse of power and human rights abuses.

Both Facebook and Twitter detected and removed accounts associated with the site before any of them had gathered a large following.

Facebook said it removed 13 accounts and two pages that together had gained 14,000 followers. Twitter said it had suspended five accounts and will continue to block any content connected to the PeaceData website.

"Regardless of the low-level impact in this case, governments around the world must stop these practices. They're antidemocratic. Attempts to manipulate our service to undermine democracy — by both foreign and domestic actors — will be met with strict enforcement of our policies," Twitter said in a statement.

Facebook said its investigation was launched after receiving a tip from the FBI about accounts controlled by the Kremlin-backed Russian Internet Research Agency, which American intelligence agencies have said interfered in the 2016 election to help then-candidate Trump.

While Facebook has drawn criticism for not doing enough to limit the spread of disinformation from accounts associated with conspiracy theories like QAnon, Gleicher said Tuesday's takedown of the Russian-linked accounts shows that partnerships with law enforcement and research groups can pay off.

"We still need to be vigilant, but the whole of society defense, where we have government, civil society and the tech platforms together, is working," Gleicher said.

With the presidential election two months away, social media companies, including Facebook and Twitter, have been under heightened pressure from Congress and outside groups to step up efforts to curb the spread of disinformation, which often moves rapidly across the platforms before moderators respond.

Both Facebook and Twitter have announced new policies, including labeling posts that are misleading or contain manipulated media, attempting to add context to trending stories and, when content brazenly violates their rules, removing the posts altogether.

Social media analysis firm Graphika said the accounts appear to reveal new tactics that Russian operatives are employing on social media ahead of the 2020 election, including the use of artificial intelligence to create social media profiles.

"This is the first time we have observed known [Internet Research Agency]-linked accounts use AI-generated avatars. However, the website employed real and apparently unwitting individuals, typically novice freelance writers, to write its articles," Graphika researchers found in a new report on the Russian-backed accounts.

"The freelancers are really the victims here. They didn't know what they were signing up for," said Nimmo of Graphika, in an interview with NPR. "The youngest were fairly soon out of college," he said. "Some were copywriters from South Asia, but they really came from all over the world, including the U.S."

Between February and August, the website published more than 500 articles in English and about 200 in Arabic that were shared on Facebook, Twitter and LinkedIn, according to Graphika.

The researchers quote one article by a guest writer that accused Biden and Harris of "submission to right-wing populism [...] as much about preserving careers as it is winning votes." Another story, according to Graphika, accused Harris and other Democrats of "deliberately avoid[ing] being held accountable by setting no moral standard for the public to hold them to."

The operation attempted to enlist a left-wing audience and then serve up articles attacking the character and policy positions of Biden and Harris, something Graphika said is consistent "with the original [Internet Research Agency's] attempt to depress support for then-Democratic candidate Hillary Clinton by infiltrating and influencing progressive audiences."

The report supports the assessment of America's top counterintelligence official, who said last month that Russia is spreading propaganda on social media and on Russian television to try to undercut Biden ahead of the November election.

Copyright 2020 NPR. To see more, visit https://www.npr.org.

RACHEL MARTIN, HOST:

With just weeks left until the election, more evidence is coming out about how Russia is again interfering. Facebook has confirmed that it has removed accounts linked to Russian state actors who were trying to spread false stories. Those stories were aimed at influencing the outcome of the November vote. NPR's tech reporter Bobby Allyn is covering this and joins us now. Good morning, Bobby.

BOBBY ALLYN, BYLINE: Hey, Rachel.

MARTIN: So tell us more. What exactly did Facebook uncover?

ALLYN: So this all started with a tip from the FBI. Federal authorities reached out to Facebook and said, hey, we found this site, peacedata.net, and it says it's an international news site, but if you look very closely, it sure does look like a Russian propaganda tool. So Facebook looked into it and, indeed, discovered that it was linked to Russian operatives, and it was sharing hundreds of bogus news articles about everything from racial injustice to the Democratic campaign of Joe Biden and Kamala Harris.

I talked to Ben Nimmo of research firm Graphika. They collaborated with Facebook on looking into this website. And Nimmo told me that the Russian operatives who were running it were posting articles on Facebook to groups liked by progressives.

BEN NIMMO: It was very much a strongly left-leaning constituency that they were aiming at. But in among there, there were indeed pieces which were saying, well, Biden and Harris, they're much too far to the right.

MARTIN: So they were trying to make progressives less likely to be supportive of the Biden-Harris ticket. How does that compare, Bobby, to what happened four years ago?

ALLYN: Researchers say, you know, this operation both echoes the 2016 playbook and introduces some new elements. So four years ago, Russian troll farms pushed false stories to suppress the progressive and minority vote to try to hurt Hillary Clinton. We're seeing that tactic again. It's a similar goal. What's new here is they duped Americans into helping them seem more credible. They posted writing gigs on hiring boards in the U.S. telling, you know, young and inexperienced journalists that, hey, if you want to make some extra money, you could come write for peacedata.net.

Here's Nathaniel Gleicher. He heads cybersecurity policy at Facebook.

NATHANIEL GLEICHER: And they used that to reach out to unwitting freelancers to essentially trick them into writing for this fake organization and writing on topics that the Russian actors wanted them to write on.

ALLYN: The thing is, Rachel, it didn't quite work. Facebook and Twitter both caught this very early on, and these pages never really gathered the reach that the Russian operatives had hoped. So the tech companies are saying, look - this is a success story.

MARTIN: Can they really bask in the success of this, though? I mean, there's all kinds of other disinformation on Facebook.

ALLYN: Yeah, that's right. I mean, it was identified and whacked before it reached millions of people. That is a good thing, right? But we know that Facebook has trouble controlling its platform, frankly. Whether it's violent militia groups that go there to organize or QAnon conspiracy theories, there's loads of troubling and, frankly, sometimes dangerous content on the platform. And sometimes that stuff, it slips through the cracks.

MARTIN: So they have been paying attention to this, though. They've talked a lot about their efforts to better moderate the content on their platform. Is any of that working?

ALLYN: You know, it's a constant game of cat and mouse. In 2016, the Russian meddling was so impactful because the Russian troll farms were able to build audiences over time, over many, many, many months. Facebook's Gleicher says Russians have some new tricks up their sleeve, but so too does Facebook.

GLEICHER: Russian actors trying harder and harder to hide who they are and being more and more deceptive to conceal their operations.

ALLYN: Yeah, so Facebook only started this investigation after the FBI told them about it. And before then, these pages were sharing on Facebook for three months, which in Facebook time is a very, very long time.

MARTIN: NPR's Bobby Allyn. Thank you so much for your reporting on this. We appreciate it.

ALLYN: Thanks, Rachel.

Transcript provided by NPR, Copyright NPR.