After months of back and forth with its own Oversight Board, Facebook on Friday announced that former President Donald Trump would remain banned from the site for at least two years.

The timeline was part of a longer reply to recommendations from the board about how Facebook could better handle political figures and newsworthy posts. Of the 19 recommendations set forth by the Oversight Board one month ago, Facebook says it will fully or partially implement 15 of them.

The group of 20 experts from around the world was created and funded by Facebook to help it handle difficult issues, and to show concerned lawmakers that it could self-regulate. The board’s decision on Trump last month was its most consequential to date.

Here’s what you need to know about Facebook’s response.

1. Trump is still banned from Facebook, but now for at least two years

The company banned Trump from Facebook and Instagram “indefinitely” in January, leaving up his pages on both sites but blocking him from accessing or posting to them. In its decision last month, the Oversight Board agreed with the ban but took issue with how Facebook handled it and asked for clarity over how long it would last. So the company added new rules for how it handles public figures, and applied them retroactively to Trump’s account.

Facebook’s solution is to make the ban last two years, meaning Trump could be allowed back on Facebook in 2023, the year before the next presidential election. There are caveats: at the end of the two-year period, Facebook will consult with “experts” about letting him back online, weighing factors such as the risk to public safety.

If Trump’s suspension is lifted, he will essentially be on probation, subject to strict punishments if he violates the social network’s rules again. In that case, his pages could be removed permanently.

2. Facebook says it will be more transparent about the consequences for breaking its rules

The most concrete change Facebook made Friday was outlining some specific consequences for those who break its content policies.

The Oversight Board criticized the company for making arbitrary decisions rather than establishing a set of transparent rules for all users, and Facebook said it’s responding to that.

It will shed light on its strike policy, which previously left users guessing about what it took to get kicked off. Now, repeat violations will lead to progressively longer restrictions: one strike merits a warning, two strikes block the user from posting to the site for one day, and the strictest penalty, for five or more strikes, is a 30-day ban.

This system is significantly less strict than YouTube’s strike system, which has been public and operational for years. YouTube bans accounts permanently for getting three strikes within 90 days. But even under the new system, breaking Facebook’s rules won’t automatically result in a strike.

“Whether we apply a strike depends on the severity of the content, the context in which it was shared and when it was posted,” the company said in a blog post about the new rules.

3. Politicians and public figures will face more scrutiny from content moderators

Much of the conversation around content moderation and social media has centered on politicians using the platforms to intimidate opponents and rally their followers, sometimes into violent action. Trump’s encouragement of the Jan. 6 rioters was what got him kicked off the platform in the first place.

But other public figures around the world, such as Philippine President Rodrigo Duterte and former Ecuadorean President Rafael Correa, have also used Facebook in ways that human rights organizations say have led to violence.

Facebook has defended leaving up some posts from politicians under its “newsworthiness exemption” — saying that the public has a right to know what their leaders are posting. But now, the company says it will institute harsher rules for public figures who incite violence during times of civil unrest.

“Public figures often have broader influence across our platforms; therefore, they may pose a greater risk of harm when they violate our policies,” the company said. It defines public figures as people holding state or national office, political candidates, accounts with over a million followers and those who “receive substantial news coverage.”

4. The newsworthiness exemption still exists, and it’s still vague

Facebook and other social media platforms have generally allowed posts to break their rules against violence or nudity if they are newsworthy, such as a documentary or news report about a war or a police shooting. Facebook has also applied this to posts from some politicians — though it hasn’t said which ones and when.

The company had actually told the Oversight Board that it never applied the exemption to Trump’s posts, but it walked that back on Friday, saying it “discovered” it had done so in 2019 for a video Trump posted of one of his rallies. The Washington Post previously reported that the exemption was developed in response to Trump.

Facebook said it will take a variety of factors into account when making exceptions for newsworthiness, such as whether it helps other people avoid danger, or what the political situation is in the country at the time of the post. Newsworthy content can still be taken down if it could lead to violence or other kinds of harm, Facebook said.

5. What does this mean for the other big social media platforms?

Facebook’s decision was being closely watched because of its status as the world’s largest social platform, and one that has played a key role in the rise of Trump and other leaders around the world.

Spokespeople for Google’s YouTube have said the company makes its own decisions about content moderation and wouldn’t be pushed in either direction by Facebook’s decision or by recommendations from its Oversight Board.

Still, because of Facebook’s size, its position could plausibly set a norm for other companies. Twitter, which Trump used daily during his presidency, permanently banned him rather than opting for the indefinite suspensions that Facebook and YouTube imposed.

For YouTube, there could now be renewed pressure to clarify its own policy and to put a timeline on Trump’s suspension, as Facebook has done.