A few years ago, frustrated by Facebook’s terrible decision-making, I called repeatedly for the social media behemoth to get itself an executive editor.

This idea, granted, was impractical. How could a single person, no matter how skilled and dedicated, stay on top of the content constantly generated by billions of users in hundreds of languages worldwide?

Still, it seemed clear that Facebook needed some intelligent and powerful judgment at the top, because so many truly awful things were happening there.

A complete lie that Pope Francis had endorsed Donald Trump for president went viral, read by countless millions of users. Meanwhile, some misguided censor removed a famous, Pulitzer Prize-winning photo depicting the brutality of war after deeming it child pornography.

And you might recall the ugly business of the Cambridge Analytica data breach that allowed millions of Facebook users’ personal data to be harvested without their consent and used for targeting political advertising.

Somehow, I didn’t think that founder Mark Zuckerberg was up to the task.
Remember, for instance, this blithe judgment of his about the 2016 presidential election: “Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way — I think is a pretty crazy idea.”

But conventional wisdom now holds that without Facebook’s help spreading misinformation, Trump probably would not be in office today.

Facebook never did appoint a top editor. It just kept growing and raking in the bucks: These days, the social network founded in Zuckerberg’s Harvard dorm room in 2004 is worth more than $500 billion and has nearly 3 billion users.

But now, the company has come up with something intended to bring wisdom and judgment to the rescue: Facebook recently announced the formation of a 20-member “oversight board.” The panel will rule on difficult content issues, such as whether specific Facebook or Instagram posts constitute hate speech. Some of its rulings will be binding; others will be considered “guidance.”

It’s being called Facebook’s Supreme Court. And its members certainly are an impressive group. There’s a Nobel Peace Prize laureate (Tawakkul Karman of Yemen). The former prime minister of Denmark (Helle Thorning-Schmidt). And the former editor-in-chief of the Guardian newspaper in the United Kingdom (Alan Rusbridger).

The rest of them aren’t too shabby either.

If gold-plated résumés were the answer, we’d be all set. But as one of the top technology critics in the country, Recode co-founder Kara Swisher, put it recently, they’ve been charged with the impossible — “trying to push back the ocean with one hand.”
To this, I’d add some other concerns. Anyone who has ever served on a committee, especially a large one or one populated with big egos, knows that it’s not an ideal way to get things done.

With rare exceptions, the committee format is unwieldy and inefficient, long on lofty discussions, short on definitive action. And certainly not a proven way to cut through a vast amount of information, take on the thorniest of problems, and make hugely important decisions on issues that constantly arise in real time.

Would the board, with all its powers and devotion to transparency, have been able to do anything about the spread of the dangerously misleading and conspiracy-mongering “Plandemic” film that flourished wildly on Facebook in recent days?

I doubt it. The group’s decisions on content are binding, but it is given 90 days to make them, and as one Facebook official, Brent Harris, put it: “The purpose of the board isn’t necessarily to deal with rapid viral issues, but complex challenging content issues that have wide-ranging impact.”

Worse still, the whole effort may serve a troubling purpose: to cover up the need for meaningful reform with a high-priced fig leaf.

Facebook’s efforts to avoid profit-hindering regulation — not to mention, heaven forbid, a corporate breakup — are well-known. Now it can point to this impressive board and say, “See, we fixed it!”

“Self-regulation is an excellent way to appear to promote particular values and keep scrutiny and regulation to a minimum,” University of Virginia media studies professor Siva Vaidhyanathan observed in a blistering Wired piece.

He noted, too, that because the board’s mandate is so limited, it “can’t say anything about the toxic content that Facebook allows and promotes on the site. It will have no authority over advertising or the massive surveillance that makes Facebook ads so valuable,” nor will it “curb disinformation campaigns or dangerous conspiracies.”

What’s more, the members’ paid participation may actually end up muting their voices at a time when they could be serving as some of Facebook’s most influential critics.

“They are now effectively within the Facebook corporate tent,” Emily Bell, director of Columbia University’s Tow Center for Digital Journalism, told Columbia Journalism Review. That “buys up potential dissent or criticism.”

Facebook hasn’t said how much board members are paid, but it has created a $130 million fund for the effort, possibly after finding precisely that amount in the couch cushions at the company’s Menlo Park headquarters. But despite the glittery credentials, the presumably big bucks and the sweeping powers, I can’t imagine that the oversight board will end up making much of a difference in the toxic stew that Facebook cooks up daily and serves to the world.

Maybe an executive editor isn’t such a bad idea, after all.