
Ex-Facebook employee says company has known about disinformation problem for years

Yaël Eisenstat, pictured in September, says Facebook leaders have known about problems with disinformation and how Facebook incentivizes the most extreme voices. (Riccardo Savi / Getty Images for Concordia Summit)

Updated October 26, 2021 at 11:15 AM ET

Facebook is on the defensive after a whistleblower leaked thousands of documents showing how the company failed to control the spread of false information and lies about the 2020 election.

In an internal report, Facebook staff concluded that the company "helped incite the Capitol Insurrection." Leaked documents show how the company was unable to stop the growth of pro-Trump "Stop the Steal" groups and conspiracy theories.

Facebook denies that it was responsible for the Jan. 6 riot and said it "took steps to limit content that sought to delegitimize the election."

Yaël Eisenstat spent six months at Facebook in 2018 as the global head of elections integrity operations for political advertising. She's now a Future of Democracy Fellow at the Berggruen Institute.

"This is something that was years in the making," she tells NPR's Morning Edition. She says experts, including people inside the company, have been warning for years about how Facebook's software incentivizes disinformation and the most extreme voices. She says Facebook executives were aware of the extent of the problem.

Facebook spokesperson Drew Pusateri told NPR that the company "took steps to limit content that sought to delegitimize the election, including labeling candidates' posts with the latest vote count after Mr. Trump prematurely declared victory, pausing new political advertising and removing the original #StopTheSteal Group in November."

The company says after the Jan. 6 riot, it "removed content with the phrase 'stop the steal' under our Coordinating Harm policy and suspended Trump from our platforms."

Facebook also says that politicians are required to follow community standards and advertising policies and that it has enforced rules against politicians including Trump, who is suspended until at least January 2023, and Brazil's Jair Bolsonaro, whose video with misinformation about the coronavirus was removed last year.

Interview Highlights

Interview highlights with Yaël Eisenstat have been edited for length and clarity.

On how Stop the Steal and misinformation grew so quickly after the election

For years, people — whether it was people like myself, people who work in the company, researchers, journalists, outside academics — have been warning the company that the way their product works, the way they amplify certain kinds of content, the way they recommend people into certain groups, was creating a situation that amplified some of the most extreme voices.

But then you couple that with what I believe was one of the most dangerous political decisions the company made — this was actually around the time that I was working there — the decision not to fact-check political actors, to allow some of the biggest voices with the biggest platforms to violate [Facebook's] own policies. Those combined to make a perfect storm.

So when they talk about all the measures that they put in place to protect the elections, that doesn't override all the things that they were not doing, such as really making sure that groups weren't engaging in this kind of activity, making sure that certain political elites were not using the platform to spread lies about the election. This was going on for years. It's not something they could have stopped suddenly in three days after an election.

On how Facebook is designed for growth above all else

Content moderation — what to take down and what to leave up — is extremely difficult. And yes, that is something that is on the platforms to handle because that is definitely not government's area. But how your platform is designed, how you're monetizing it, what you're allowing and what you're not: basic guardrails, that's what's really missing here.

If, for years, you have allowed an environment where you're not holding politicians to the same standards that you hold the rest of us to on your platform, because really you want to preserve your power, and you're not reining in the algorithms and recommendation engines that are pushing people to a lot of this content ... that's really about not wanting to harm your own growth. So you're not reining in how your recommendation engines are pushing people into some of these groups.

Those two things combined at a time when we had a really volatile situation in the U.S. around our election. It's a perfect storm. And let's be clear, Facebook had years to fix this. A lot of the documents focus on 2019 and 2020, but this is work that many people have been trying to get them to do — including my team when I was there — for years.

On what happened when she raised concerns

In 2018, the very first thing I tried to do was ask why we were not putting any sort of fact-checking standards into political advertising. It was very clear to me that advertising — maybe not the most important thing on the platform — is paid speech. We're putting labels on the ads to make them look even more credible, and we're giving political actors targeting tools to target people with this messaging. And if we weren't even fact-checking that, and we were allowing politicians to use advertising to sow different messages to different groups, that was dangerous. ...

I was pushed out for these kinds of ideas, and for trying to build plans to address voter suppression into political advertising. There just wasn't an appetite from leadership for that.

Lilly Quiroz and Jill Craig produced and edited the audio interview. James Doubek produced for the web.

Editor's note: Facebook is among NPR's recent financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Rachel Martin is a host of Morning Edition, as well as NPR's morning news podcast Up First.