A light regulatory touch in the 1990s enabled the internet’s tremendous growth.

But given the dominance of a few giant platforms, their failures to adequately self-regulate and the harm they’re causing, it’s time to revisit laws giving them special advantages. That includes Section 230 of the Communications Decency Act of 1996, which shields online services from civil liability for content others post on their sites.

Congress is now considering several changes to Section 230. It should be updated in a way that provides a more level playing field for the media. For dominant platforms such as Google and Facebook, it is in effect a federal subsidy that they no longer need. Section 230 updates could also encourage more careful handling of news and public information by hyperscale media platforms.

Another option is collecting fees from dominant information gatekeepers, so they compensate the public for benefits they receive from Section 230. Proceeds could be used to sustain the free press these companies are suffocating, depriving much of the United States of independent, local news reporting.

Saving the free press should be part of any discussion of revising Section 230, and that has never been clearer than in the last few weeks of this unfolding pandemic. Local journalism, especially in communities still lucky enough to have robust newsrooms, is proving its necessity.

Such revisions should also be part of broader regulatory reforms to address unfair competition and excessive concentration in the media industry.


Section 230 gives special immunity to online services. They can’t be held liable for content posted on their sites, with some exceptions, such as intellectual-property and federal criminal violations. This immunity prevents Facebook from getting sued for defamatory posts, for instance.

A newspaper must spend time and resources to avoid being sued for a story it reports and what it publishes. Online platforms may immediately republish that story, choose how it’s distributed to their audiences and profit from it, without compensating the newspaper or bearing liability for the content.

Section 230 is tricky. Updates must be made without sacrificing the good things it has done to foster innovation and dialogue. Changes to Section 230 are also unlikely to address some major concerns about tech giants, such as their failures to block propaganda and to follow political-advertising rules.

Others are concerned about bias in platforms’ systems for selecting and promoting stories. But these are private properties, free to moderate as they choose. Government regulation of bias is problematic and likely unconstitutional.

An Orwellian proposal last year by U.S. Sen. Josh Hawley, R-Missouri, would withhold liability protections until federal regulators determine the platforms are “politically neutral.” That came after concerns on the left and the right about Facebook’s news curation.

A similar cudgel is used in a new proposal from Sens. Lindsey Graham, R-S.C., and Richard Blumenthal, D-Conn., purporting to address child exploitation. It also creates an oversight regime, threatening to revoke Section 230 protections from companies that don’t follow “best practices.” Privacy advocates and Sen. Ron Wyden, an Oregon Democrat who helped draft Section 230, say it’s a Trojan horse that will help government circumvent encryption and control online speech.


Another option to directly address Section 230 problems might be to fix its outdated definitions. It differentiates between “interactive computer service” and “information content provider,” providing immunity only to the former.

Anyone posting on Facebook is the “content provider” and may be held liable for it. Facebook is not liable because Section 230 immunizes “services.”

Now that a few giant companies provide most of the content Americans consume, that distinction is obsolete. After waves of mergers since 1996, the largest telecommunications companies are also media companies.

Nor are services like Facebook passive pipelines. They decide how content is presented and amplified, and most of their revenue comes from monetizing content, not connectivity.

Section 230 was written partly so internet services had legal cover to moderate content — preventing harmful content from reaching children, for instance.

Decades later, the largest services still get that immunity but fall short in protecting users from harmful material. Russia's success in using platforms to sow discord and interfere with the 2016 elections is the prime example.

Giant platforms also weaken democracy by bleeding the free press without investing back into journalism that informs voters and holds government accountable.

While the light-touch approach of Section 230 helped the internet blossom, it is now providing an unfair advantage to a few dominant companies. Fix the law so it works better for everyone.