“As the poem goes, you know,” said Sen. Ted Cruz, R-Texas, “first they came for Alex Jones …”
That is, of course, not how the poem goes. In an irony to end all ironies, Cruz was using the oft-invoked verse about the Holocaust to defend an anti-Semitic conspiracy-monger. But the senator’s words do throw into relief the reasoning of those distressed that Facebook and many other platforms removed Infowars from their sites beginning late Sunday: These critics worry about a slippery slope of censorship. And the way tech companies have gone about booting peddlers of disinformation from their sites does little to assuage those misgivings.
Infowars hasn’t changed much since it first started warping the web nearly two decades ago, and it has changed even less in the past month. What has changed since Facebook chief executive Mark Zuckerberg’s mealy-mouthed rationalizations for keeping Jones on his site just weeks ago is the amount of pressure the media and politicians have put on tech companies to clean up after themselves.
That pressure was enough to prompt Apple to remove almost all Infowars podcasts from its platforms, and Apple’s decision was enough to prompt Facebook to do the same. Facebook’s decision was enough to prompt YouTube, and somewhere along the way Spotify and Stitcher signed on to the mass exorcism. And then there was one: Twitter, so far, looks unlikely to reverse course. Infowars’ native apps also remain available for download in the iOS and Android app stores.
Apparently, all it takes for most of today’s guardians of the digital galaxy to drive an offending outlet toward obscurity is for one of them to make the first move. That illustrates the immense influence a single action by a single company can have on the online landscape. But it also demonstrates the reluctance of each company to take those actions without cover.
Cover, after all, has been the general theme of the Infowars imbroglio. Amid the applause these sites are receiving for finally drawing a line on false content, their executives are actually refusing to admit they’re drawing a line on false content at all. Instead, they’re relying on pre-existing rules around hate speech, harassment and violence to de-Jonesify their platforms.
“Apple does not tolerate hate speech,” said Apple. “When users violate … our policies against hate speech and harassment … we terminate their accounts,” said YouTube. Infowars was “glorifying violence” and “using dehumanizing language” against minorities, said Facebook.
Those rules are important, and they’re under-enforced. But policing hate speech is only part of the problem for these tech platforms. The broader problem is policing truth and its ugly converse: disinformation.
In the case of Infowars, only Twitter, the lone holdout among the tech titans, has uttered the word “truth” out loud, if only to disclaim any duty to umpire it. Twitter executives claim that is not their role. Its competitors evidently disagree — the conversation about Infowars has always focused on how Jones makes it his mission to fill the internet with falsehoods — but they aren’t ready to say so. Facebook, the most explicit about refusing to be explicit, insisted in its statement announcing the purge that “while much of the discussion around Infowars has been related to false news … none of the violations that spurred today’s removals were related to this.”
Whether this reticence is even politically intelligent is unclear. Setting a clearer standard on disinformation would invite a torrent of criticism, but now that these companies have removed Infowars, a reckoning is coming eventually anyway. They might as well dislodge themselves from their collective crouch and start playing offense.
It would mean recognizing the enormous amount of control they have over the online ecosystem, as well as detailing how they plan to exercise that influence. It would mean devising a robust framework for removing disinformation (beyond what comes from Russian trolls) and articulating what that framework looks like. It would probably also mean staffing up to make sure that plan operates well in practice, and perhaps even creating appeals processes for those dissatisfied with the companies’ decisions.
So far, there has been a lot of power at play here, and a lot less responsibility. Accepting more of the latter is the only way Facebook and its cohort can defend what they’re doing with any coherence. It’s also the only way they can lay down salt on the slippery slope that has critics so concerned.