On Jan. 6, as an angry mob stormed the U.S. Capitol, President Donald Trump posted on Facebook that his supporters should “remember this day forever.”
“These are the things and events that happen when a sacred landslide election victory is so unceremoniously & viciously stripped away from great patriots who have been badly & unfairly treated for so long,” he said in a post.
In response, Facebook did something it had resisted for years: banned Trump’s account indefinitely for inciting violence. Twitter, YouTube and others followed suit.
The ban was the culmination of a long-running and tortured relationship between the politician and the social media company, one that will hit a new inflection point on Wednesday. That’s when a Facebook-funded panel of experts will announce whether Facebook must reinstate Trump’s account. The impending decision by the Oversight Board, a less than one-year-old body that describes itself as an “experiment” in the regulation of online speech, could be the most consequential ruling yet on free speech on social media, according to experts. It could also alter the way that social media companies treat public figures going forward.
Trump’s social media megaphone during his presidency helped transform Facebook, as the company sought ways to survive a hostile political climate and accommodate the boundary-pushing president. Facebook made numerous concessions, including a “newsworthiness” carve-out that exempted political figures and other leaders from its hate speech rules and an explicit policy not to apply fact-checking to political leaders.
The newsworthiness exemption was initiated in response to Trump’s use of inflammatory language during his first presidential campaign, The Washington Post reported, and in the years that followed, it guided Facebook’s approach to political leaders and reshaped the world’s information battlefield.
For years critics asked Facebook to ban Trump, citing his frequent promotion of misinformation and extreme rhetoric. But Facebook chief executive Mark Zuckerberg had long felt that the public needed to hear what politicians had to say, as long as troubling comments fell short of violence, according to his public statements. He also thought Facebook shouldn’t be in the role of making such consequential decisions and started to create plans for the board in 2018.
But the events at the Capitol in January were the last straw.
The Oversight Board is evaluating the determination — which Facebook says was made under extenuating circumstances — at the company’s request. Facebook says the rulings of the independent, 20-member body are binding. The company does have a hand in picking board members, who include a Nobel laureate and a former Danish prime minister, and in paying them through a separate trust.
“This is just the start of an experiment, but it can’t be where it ends,” said Evelyn Douek, a lecturer on free speech issues at Harvard Law School. “In some sense, we are all playing Facebook’s game by taking the Board seriously as a legitimate institution. On the other hand, no one has a better alternative right now.”
More than a decade ago, Facebook began courting politicians and candidates to show them how to build an audience and reach voters on the service, according to public records and interviews with former employees and previous reporting.
Though the company had worked with Trump previously in its celebrity partnerships division, things shifted when he emerged in 2015 as a longshot presidential candidate. In December of that year, he posted a viral video in which he said he wanted to ban all Muslims from the United States.
In a high-level meeting to discuss the post, senior executives, including Zuckerberg, said they thought it went against the company’s hate speech guidelines and several wanted to remove it, according to several people familiar with the meeting who spoke on the condition of anonymity to discuss internal deliberations.
But after some debate, executives decided to create a blanket allowance that newsworthiness would be taken into account when making calls about whether certain posts violated Facebook’s hate speech rules.
When the “newsworthiness exemption” policy was formally announced the following October, the company did not discuss Trump’s role in shaping it.
Facebook has disputed that the policy was originally designed for Trump, saying instead that it was developed in 2016 in response to the site mistakenly removing a historic Vietnam War photo that contained child nudity.
In the ensuing years, those debates about newsworthiness informed the company’s approach to content from political leaders around the world, former employees say.
Formally, Facebook says the carve-out was used just six times. In 2018, for example, the company restored a video by the top aide to Hungarian Prime Minister Viktor Orban in which he blamed immigrants for pushing out “white Christians.”
But some internal critics said Facebook went too far, including by protecting conservative publishers who were Trump allies and waiving its typical three-strikes penalty system for misinformation for a pro-Trump super PAC and for the president’s eldest son, The Post reported last year.
Elizabeth Linder, a former Facebook executive whose job was to facilitate ties to political leaders and candidates in Europe and the Middle East until she resigned in 2016, said the practice of courting high-profile users and prioritizing their audience growth over that of others was problematic, creating a dynamic in which it was sometimes difficult to hold those users accountable.
“Social media companies should be on the side of people versus the side of individual political leaders,” she said, adding that the company’s approach to this was flawed.
In the run-up to 2020, Twitter started rolling back some of its own public-interest carve-outs, adding labels to political speech that broke its rules. Facebook went the opposite direction, announcing that it would not fact-check politicians in their ads.
As the 2020 election got underway and the United States was hit with a devastating pandemic, Trump began to flood the zone with more misinformation than before, according to The Washington Post’s Fact Checker.
He falsely claimed the Centers for Disease Control and Prevention was exaggerating the death toll from the coronavirus. He posted that the high U.S. caseload could be blamed on frequent testing and that “the Mortality Rate for the China Virus in the U.S. is just about the LOWEST IN THE WORLD!”
There is no evidence that the CDC exaggerated COVID-19 deaths, according to Anthony S. Fauci, the director of the National Institute of Allergy and Infectious Diseases. Fact-checkers have found Trump’s claims regarding the CDC and testing to be false. The United States had the sixth-highest death rate from COVID-19 in the world at the time of his comment.
Facebook left up 1,440 posts by Trump containing misinformation or extremist rhetoric last year, according to an analysis by the left-leaning advocacy group Media Matters. The group has argued to the Oversight Board that Trump’s account should not be reinstated. The social network appended generic labels to approximately 500 of those posts, directing people to authoritative information. Just one was rated false.
Facebook said in response to the Media Matters report that not all forms of misinformation related to the election or COVID-19 were banned by the company and that Facebook removed Trump’s posts in the handful of instances where executives found they violated the social network’s policies.
In May 2020, amid protests in Minneapolis following the death of George Floyd, Trump invoked a phrase associated with a segregationist to suggest that the military might shoot demonstrators who broke the law.
Though civil rights advocates and thousands of employees said the post threatened violence, Zuckerberg’s team first made a phone call to the White House asking Trump to soften or take down the post, The Post reported last year. Trump later backtracked on some of the comments, and the chief executive left the post up, arguing to staff that it could have been in the public interest because it was meant as a warning.
Trump posted misinformation 363 times on Facebook from Jan. 1, 2020, to Jan. 6, 2021, the period examined by Media Matters, including false claims of victory and voter fraud, false claims about the voting infrastructure company Dominion Voting Systems and false claims that the election was stolen from him in which he used the phrase “Stop the Steal” — the rallying cry of the Capitol rioters.
None of these posts were removed. Facebook appeared to remove just seven posts by Trump in all of 2020, four of which were for copyright issues, according to a Washington Post review.
On Jan. 6, Trump posted a video in which he said his supporters were “patriots” and were “special.” He tweeted that they would not be “disrespected or treated unfairly in any way, shape, or form.” He also tweeted that he would not be attending President-elect Joe Biden’s inauguration, breaking a tradition held by almost all presidents until then.
First Facebook and then Twitter suspended Trump’s account that week on the grounds that those comments were encouraging or inciting further violence and lawbreaking to delegitimize the election — or worse, to conduct an attack on the inauguration itself.
“The shocking events of the last 24 hours clearly demonstrate that President Donald Trump intends to use his remaining time in office to undermine the peaceful and lawful transition of power to his elected successor, Joe Biden,” Zuckerberg posted on his Facebook wall on Jan. 6. He noted that in the past, the company had allowed Trump to use the platform because “we believe that the public has a right to the broadest possible access to political speech, even controversial speech.”
“But the current context is now fundamentally different, involving use of our platform to incite violent insurrection against a democratically elected government,” he wrote.
The next day, Facebook said the ban would continue indefinitely. Twitter followed Jan. 8 with a permanent ban.
Two weeks later, Facebook referred its determination to the board.
Silicon Valley experts are mixed on the potential outcome. Some think Trump’s account will not be reinstated because he went too far. Others expect Trump will be allowed back on, noting that Facebook and others said they were making an immediate decision based on extenuating circumstances that have now passed.
Most expect the decision to come with strongly worded recommendations that, if adopted, could reshape the way speech by public figures is moderated online — and how the line between speech and violence is drawn.
Restoring Trump’s account would send the message that “there is little a public official can do that would warrant their removal,” said Danielle Citron, a free speech expert at the University of Virginia School of Law. Trump went too far, and “it would be performative nonsense if they reinstated him.”
That could have serious consequences for world leaders’ use of the platform as a whole.
Katie Harbath, a former Facebook public policy executive who created the company’s original partnerships with politicians, and a fellow at the Bipartisan Policy Center, a Washington, D.C., think tank, said if Trump is permanently banned, the company will be flooded almost immediately with questions about other world leaders.
“The decision is the beginning, and it’s a milestone in this conversation,” she said. “But it won’t give us all the answers.”