Facebook on Thursday announced a new enforcement policy for groups that coordinate online to spread misinformation, hate and “social harm” but do not violate traditional company standards against “inauthentic” content.

Facebook immediately used its new policy against “coordinated social harm” on Thursday to shut down large portions of a German online network pushing the Querdenken conspiracy theory, which has fueled resistance to government health restrictions related to the COVID-19 pandemic.

Thursday’s action moves Facebook beyond its long-standing reliance on “inauthenticity” as the key marker of forbidden behavior on the platform.

The company typically uses the term — which has been widely adopted within the industry — to describe deceptive behavior, in which social media users attempt to manipulate others while disguising their identities and actual views.

Most Facebook takedowns of disinformation operations in recent years — both by foreign actors and domestic ones — relied on Facebook’s designation of a group as engaging in “coordinated, inauthentic behavior,” a term so commonly used that the company often referred to it by the acronym “CIB.”

But the term has long been problematic because companies struggled in many cases to determine who users were and what they believed. Existing prohibitions against hate speech, harassment and incitements to violence already allowed Facebook to act against individual accounts that violated such policies. Facebook also had a policy calling for sanctions against “dangerous organizations,” a designation typically applied to extremist groups that foment violence.


Company officials said Thursday they needed a new policy to take action against movements that intentionally caused social harm — including violence — but didn’t rise to the designation of “dangerous organization.”

“We recognize that, in some cases, these content violations are perpetrated by a tightly organized group, working together to amplify their members’ harmful behavior and repeatedly violate our content policies,” said Nathaniel Gleicher, Facebook’s head of security policy, in a blog post announcing the change. “In these cases, the potential for harm caused by the totality of the network’s activity far exceeds the impact of each individual post or account.”

Gleicher said the new policy allows Facebook to more easily act against the “core network” of a group that commits widespread violations.

Facebook officials said the German Querdenken group, whose name translates as “lateral thinking,” has used duplicate accounts and other coordination techniques to spread COVID misinformation, hate speech and incitements to violence on a broad enough scale that it merited systemic enforcement action, though Facebook stopped short of banning the group outright. The company did not say how many accounts and pages it removed but said the number was “relatively small” — fewer than 150 across both Facebook and its subsidiary Instagram.