It’s time to treat the social-media platforms — Facebook, Google, YouTube and others — as exactly what they are: news organizations responsible for the content on their sites. They’re publishers just like newspapers and broadcasters, just like radio and TV networks.

As infectious hate speech grows, they’ve spent the last decade fiercely defending so-called privacy policies and social media’s role in connecting the world.

The refrain is this: “We’re not media; we don’t control the content our users put up” (though recently there have been bans on some hate and conspiracy peddlers like Infowars). “We’re just a ‘platform’” for other people’s content.

But they’re not just “platforms.” They are businesses that sell ads; the more ads they show, the more income they earn, revenue in the billions annually. And, of course, the companies that buy ads on Facebook and YouTube want lots and lots of people to see them.

As a result, the platform companies are always looking for ways to keep your eyes on their pages. For that, they have a handy and wonderful tool: algorithms, computer programs that react to what users are clicking on and suggest more pages to look at. They want to keep you hooked. It’s essential to their “business model.”

And here’s where it gets interesting:

By suggesting other sites, providing links to pages this or that user might be attracted to, serving them up as photos or text teasers right there for you to click on, the so-called “platforms” have crossed the line. They present the user with new content. Yes, you still have to click, but the platforms deliberately and creatively handed it to you. With that action, they become publishers, no different from newspapers. For that reason, we can and should subject social-media platforms to all the regulation and responsibilities we require and expect of newspaper and magazine publishers and of radio and television networks.


I’m betting that the real possibility that Congress and the courts — even a few state attorneys general — will for this reason start seeing them as publishers keeps Facebook’s Mark Zuckerberg and YouTube CEO Susan Wojcicki awake at night.

There is no doubt that the algorithms create (publish) content for readers in the same way readers find new things when they turn the pages of their newspapers or tune in to the next Fox or CNN panel. And the platforms’ algorithms have proved powerful for a very disturbing reason. They work best — keeping users engaged and seeing more ads — when the links they offer appeal to strong emotions. Anger, hate, fear. Love is not on the list.

On Aug. 13, The New York Times ran a story by reporters Max Fisher and Amanda Taub showing the YouTube algorithm at work in Brazil: once a viewer looked for — or just stumbled on — any video extolling right-wing views, it continually recommended additional far-right videos. The use of WhatsApp, owned by Facebook, and of YouTube recommendations so saturated political discourse in Brazil that the platforms appear to have played a role in the country’s swing rightward.

In Brazil, “The system now drives 70 percent of total time on the platform, the company [YouTube] says,” Fisher and Taub reported.

Facebook was found to have a similar effect on anti-Muslim demonstrations in a couple of German towns last year: heavier Facebook users found themselves in a world of anti-immigrant posts.


Faced with anything like government oversight or regulation, we know the tech companies will beat the First Amendment drum. But that’s not at issue. As publishers, they can choose to continue doing exactly what they do now, letting their algorithms lead many users into worlds of falsehood and conspiracy, risking the lawsuits and backlash that any real publisher faces. Or they can pull back, ceasing to be publishers by allowing only those links that were included by the individuals or companies that create posts on their platforms.

It’s not a free-speech issue. It’s just that the social-media companies have to stop lying to us about what they are. Neutral platforms? No. Their use of algorithms to lead to new pages is publishing, just like a newspaper.