Facebook, Twitter and other social-media platforms were right to at least temporarily ban President Donald Trump.
The danger of Trump’s incendiary rhetoric and outright lies about election fraud was abundantly clear last week, after he goaded a mob that stormed the U.S. Capitol and interrupted the peaceful transition of power.
This does not absolve platforms of their culpability, however, in amplifying, distributing and normalizing abhorrent and corrosive content for years.
Nor does this let off the hook Trump’s allies and enablers, including many Republican leaders. They are complicit for standing idly by as the president spent four years laying the kindling of falsehoods and dousing it with the gasoline of resentment he ignited last week.
Yet Americans and policymakers should be wary of oversimplifying the problem as one of social media failing to rein in certain individuals.
Trump was able to stoke revolt by exploiting deep divides in a country that’s losing common ground, including the shared knowledge of current affairs that local newspapers used to provide.
Fixing that will require a multifaceted response, including stabilizing what’s left of the free press.
Digital reforms also are needed.
Addressing rampant misinformation on major digital platforms, and their profound and persistent failure to adequately moderate and curate such material, must be part of the state and federal antitrust cases now underway, and reform efforts percolating in Congress.
This also will renew debate over Section 230, the 1996 telecommunications law that shields websites from liability for content their users post. One intent of the law was to encourage websites to moderate objectionable content themselves, a task the biggest platforms seem to perform mostly after damage is done.
Democrats have sought Section 230 changes to limit speech to which they object, while Republicans have pushed for Section 230 changes in hopes of reducing what they perceive as censorship by left-leaning tech platforms.
Section 230 needs to be updated. But it’s only part of the broader regulatory reform that’s needed to address the outsized power and unfair business practices of digital platforms.
Enforcing existing laws is also necessary, said Rebecca Tushnet, Frank Stanton Professor of the First Amendment at Harvard Law School. She contends that jailing those who carry out violent acts will do more good than encouraging platforms to suspend more people.
“Rather than play whack-a-mole, we need to focus on the fact we have a powerful right-wing extremist movement, and we need to think very hard about the appropriate law enforcement response to it,” she said.
There is no easy solution.
It was obvious after the fact that Trump’s postings went too far, but there’s also danger in censorship. The government — or even worse, giant technology companies — cannot dictate what’s truthful or acceptable political speech in the public commons. It’s ultimately up to the people to decide.
Trump and his allies had the right to protest, scrutinize and challenge the initial results.
Yet Trump’s constant braying about the election being rigged — starting before it happened, and despite his 2016 election being demonstrably tainted by Russian influence — was both false and caustic.
It was an easy call for Facebook and Twitter to shut down Trump’s accounts once he incited violence. Hindsight is 20/20.
The harder question is deciding beforehand when a president, another elected official or anyone else has crossed the line.
We can’t trust the federal government to do this well, especially not after last week’s performance. It ignored or failed to act on clearly dangerous postings, on both mainstream and obscure websites, that rallied and organized attacks on the Capitol. These were reported to the FBI and other authorities, who failed to adequately respond and prepare on the ground, much less online.
So far we can’t trust social media companies to do it right, either. While it’s a relief to many that they found the Trump mute button, they failed to do so when he made other threats, such as when he threatened a nuclear attack on North Korea.
Last-inning attempts by Twitter and Facebook to provide context or warnings on certain posts were a start.
But their moderation was inconsistent and, coming so close to an election, raised questions about political intent.
But these small steps toward responsible stewardship of their great power are positive and should be built upon. Platforms seeking to assume the role of traditional media need standards and a true commitment to do public good.
That’s a small start on the much broader effort that’s needed to address a crisis of misinformation that’s widening divides and undermining not just democracy but civic life in America.