Facebook executive Andrew Bosworth offered a fresh defense of the company’s role in the spread of misinformation during an interview with Axios that aired Sunday. If you have a problem with users believing misleading COVID-19 content, Bosworth said, “you have an issue with those people,” not Facebook.

The remarks, which drew immediate backlash, are the latest example of the tech giant taking a defiant tone in the face of criticisms over its safety practices, rather than offering apologies as it did in the wake of prior scandals.

“Mr. Bosworth seems to be saying the quiet part out loud: that Facebook apparently doesn’t see itself as responsible for spreading the misinformation and disinformation that their company profits off of,” Sen. Mark Warner, D-Va., told The Washington Post.

But the argument also bore a striking resemblance to timeworn logic offered in defense of a wholly different industry: the “guns don’t kill people; people kill people” catchphrase, often deployed by the National Rifle Association.

While the products themselves are poles apart, leaders from both industries have used similar rationales when discussing the responsibility they bear when their tools cause harm. In the case of online misinformation, Bosworth argued, that burden ultimately falls on users.

“Individual humans are the ones who choose to believe or not believe a thing. They are the ones who choose to share or not share a thing,” said Bosworth, who next year will become chief technology officer for Facebook parent company Meta.


It’s not the first time the company has made that argument.

“Responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them,” Facebook spokeswoman Dani Lever said in a statement amid accusations the company allowed former President Donald Trump’s supporters to incite violence on its platforms before the attack.

The company repeated the stance in June in response to a call by its oversight board for an internal review of how the platform may have contributed to the violence on Jan. 6.

But the pointedness of Bosworth’s latest comments struck a nerve.

The problem with his argument, critics said, is that it disregards the power Facebook and other social media companies have to influence how content is disseminated and, in some cases, to prevent their platforms from being weaponized.

Blake Reid, a tech and telecom policy professor at the University of Colorado Law School, said it’s not fair to describe social media platforms as “a mirror to society” that just reflects users’ harmful behaviors onto the web, because they play a major role in shaping our discourse.

“Facebook is structuring a platform for how those communications happen,” including with the decisions it makes about what content to recommend and amplify, he said in an interview.


Facebook declined additional comment on Bosworth’s remarks and comparisons between his arguments and the NRA catchphrase.

While the link between online misinformation and real-world harm is far more difficult to trace than the trajectory of a bullet, Facebook and other social media platforms have been repeatedly accused of contributing to violence or even death.

President Joe Biden accused Facebook in July of “killing people” by allowing bad actors to spread misinformation about coronavirus vaccines, though he later dialed back his criticism. The White House has also fumed that the company hasn’t been more forthcoming with data about the spread of COVID-19 misinformation on its sites, as my colleagues reported in August.

After Biden criticized Facebook’s efforts to stamp out COVID-19 misinformation, the company pointed to its initiatives aimed at directing users toward authoritative sources of medical information, as well as its expanded policies against misleading coronavirus content.

Bosworth touted those and other efforts as perhaps “the biggest COVID vaccine campaign in the world” in his interview, arguing that the company is taking responsibility.

Asked whether the company can strike a balance between limiting the misleading information people see and making it harder for them to spread it, Bosworth replied, “We’re doing that. We are on the middle path. You just don’t like the answers.”


He added, “But at some point, the onus is and should be in any meaningful democracy on the individual.”

Bosworth, one of Facebook’s more provocative leaders, is just one of its executives and does not speak for the company as a whole.

Reid, the law school professor, said that while Facebook is no “monolith,” it’s concerning to see someone of Bosworth’s stature stake out such a position.

“It’s really worrying to have someone at the senior echelons of the company who’s responsible for setting the tone both publicly and no doubt internally in guiding how the technology is shaped and what decisions are made around who’s taking this worldview,” he said.