Updated at 12:59 p.m. ET
The leaders of the Senate intelligence committee vented Wednesday morning about what they called the shortcomings of major tech firms, which they argue have led Americans to become vulnerable to foreign disinformation campaigns.
"What is under attack is the idea that business as usual is good enough. The information your platforms disseminate changes minds, hardens opinions, helps people make sense of the world. ... That's how serious this is," said Chairman Richard Burr, R-N.C.
Facebook Chief Operating Officer Sheryl Sandberg and Twitter CEO Jack Dorsey testified about their companies' efforts to counter foreign influence operations.
Both Burr and his Democratic counterpart, Sen. Mark Warner, lamented that Google did not send an appropriately senior executive to appear. The company had offered to send its chief legal officer, Kent Walker, but the committee's bipartisan consensus was that he was not senior enough.
"I'm deeply disappointed that Google, one of the most influential digital platforms in the world, chose not to send its own top corporate leadership to engage this committee," Warner said, ticking off how Google's search engine and its YouTube and Gmail services were relevant to discussions about foreign influence operations.
Warner had held out the possibility as late as Tuesday afternoon that Google could still change its mind, but the company did not.
Google posted a copy of what it called Walker's prepared testimony on Tuesday.
In it, he hit the notes that other tech bosses sounded in their hearing: They appreciate the importance of fighting active measures, they're tightening their internal security and other practices, and they're prepared now in ways they weren't for the big influence campaign of 2016.
Dorsey and Sandberg both acknowledged how badly their companies were caught off guard by foreign influence operations during the last presidential election campaign.
"We were too slow to spot this and too slow to act. That's on us. This interference was completely unacceptable," Sandberg conceded.
Some use of Facebook by automated or fake accounts continues. About 3 to 4 percent of accounts on Facebook's platform today are "inauthentic" users who do not represent real people, Sandberg said.
With more than 2 billion users, 3 to 4 percent works out to at least 60 million to 80 million such accounts.
Dorsey, meanwhile, said he wants Twitter to be a "public square" where almost anyone can say almost anything, but that he also recognized how much it has been exploited.
"We aren't proud of how that free and open exchange has been weaponized and used to distract and divide people, and our nation," he said. "We found ourselves unprepared and ill-equipped for the immensity of the problems that we've acknowledged."
Republican Sen. Marco Rubio of Florida also questioned the duo about their operations outside the United States.
"These principles of our democracy, do you support them only in the United States? Or are these principles that you feel obligated to support around the world?" Rubio asked.
"We support these principles around the world. ... We would only operate in a country when we can do so in keeping with our values," Sandberg replied.
Citing cases in Turkey, Rubio asked Dorsey: "How does blocking the accounts of journalists and an NBA player [keep] with a core tenet of freedom of expression?"
"We would like to fight for every single person being able to speak freely, and to see everything, but we have to realize that it will take some bridges to get there," Dorsey responded.
"This is fine"
Wednesday's hearing was the second time this year that the intelligence committee has convened a public session on how social media platforms can be used to further foreign influence operations.
The committee also heard from social media experts back in August.
"Some feel that we as a society are sitting in a burning room, calmly drinking a cup of coffee, telling ourselves, 'This is fine.' That's not fine," Burr said in August.
"We should no longer be talking about if the Russians 'attempted' to interfere with American society. They've been doing it since the days of the Soviet Union, and they're still doing it today."
The past six months have been eventful as social media companies and tech giants have repeatedly announced the disruption of various influence operations.
In late July, Facebook announced that it had removed 32 accounts involved in a political influence campaign with links to the Russian government.
Microsoft said it discovered and stopped an attempted cyberattack tied to Russia.
Facebook then made another announcement: It had shut down hundreds of accounts linked to an Iranian-backed global disinformation campaign, and Twitter followed suit by deleting accounts tied to the same operation.
In the midst of all these alleged information operations, the Democratic National Committee reported that it had stopped what appeared to be an attempted cyber intrusion into its voter data systems.
But what looked like an attack was actually a security test run by friendly volunteers for the Michigan Democratic Party, who hadn't communicated their plans to the national party.
The Democrats' cybersecurity specialists have compared themselves to combat veterans suffering from post-traumatic stress following the deep embarrassment to the party after it was the victim of Russian attacks during the presidential election.
"I think we all still have PTSD from 2016," Raffi Krikorian, chief technology officer at the Democratic National Committee, told NPR.
The miscommunication within the DNC showed the current state of alarm: Big tech, social media companies and politicians are on red alert about the issue of foreign intrusions and cyber operations.
RACHEL MARTIN, HOST:
Top executives from Facebook and Twitter are here in town today with some important meetings. They're going before House and Senate committees on Capitol Hill, and they will be grilled again about what social media companies could have done to stop Russian interference in the 2016 U.S. election and what they're doing now about Russian propaganda ahead of the midterms. NPR's Tim Mak will be covering the hearing, and he's with us in the studio. Hey, Tim.
TIM MAK, BYLINE: Hey there.
MARTIN: So as I mentioned, these companies have been in this situation before answering these kinds of questions. In the past, they've been reluctant to say they could have done more to prevent their platforms from being abused. Are they - do you expect them to say something different today?
MAK: Yeah. I think there's an increasing sense amongst big tech firms that this is a serious problem, that an open society and social media platforms can be actually used against American democracy. Facebook, for example, is expected to emphasize how they're doubling the number of people working on safety and security issues to more than 20,000. And they're going to talk about how they're collaborating with law enforcement increasingly in the months ahead. But there's one firm that's not going to be there at all, and that's Google, which has declined to send founder Larry Page or its CEO to appear before the Senate Intelligence Committee today. And in his place, there's going to be a little chair there, a little sign - a bit of Washington theater - as there is bipartisan consensus from Republicans and Democrats on this committee that their offer, Google's offer, to send its chief legal officer is insufficient.
MARTIN: I mean, this is really what the central tension has been about, how these companies define themselves, right? Are they just a platform for people to express their views, or do they have a responsibility to curate them? How did they come down on this?
MAK: Yeah. I mean, Facebook in particular is trying to make a good faith effort towards transparency and addressing some concerns from lawmakers and the public. And a lot of this is driven by the changing nature of the threat and how serious it now appears and has become. There's no sign that this threat is diminishing. In fact, just in the past month, we had Microsoft, Facebook and Twitter all announce that they had discovered and disrupted foreign operations. Iran was revealed to be a new player in worldwide disinformation campaigns. And on top of this, we found that there was a new Russian disinformation campaign and apparent hacking attempts. Big tech firms are really on red alert.
MARTIN: So these threats keep popping up and these companies say that they're equipped to manage them.
MAK: Well, that's the thing about it. Some of these issues can be addressed by the companies themselves. Some of them can be addressed by regulations that Congress might propose, such as online ad disclosures or things like new sanctions against foreign actors. But a lot of this has to do with people themselves. It's a multidimensional threat. It involves everything from hacking emails to bot networks online to disinformation campaigns to election infrastructure security. And a lot of this can be dealt with by folks who are just listening to the radio right now, and that's doing things like securing their emails with multifactor authentication and being super judicious about the kinds of information they view online, whether or not they double-check their sources, whether or not they're looking at news sources that have credibility and a long track record. It's a really complicated problem. It involves state governments, big tech firms, Congress and people being responsible themselves.
MARTIN: And us. You're saying it's on us. All right. NPR's Tim Mak, thanks so much. We appreciate it.
MAK: Thanks a lot.
Transcript provided by NPR, Copyright NPR.