Almost 29 percent of the stories displayed by Facebook’s News Feed present views that conflict with an individual’s ideology.


For years, political scientists and other social theorists have fretted about the Internet’s potential to flatten and polarize democratic discourse. Because so much information now comes through digital engines shaped by our own preferences — Facebook, Google and others suggest content based on what consumers previously enjoyed — scholars have theorized that people are building an online echo chamber of their own views.

But in a peer-reviewed study published Thursday in the journal Science, data scientists at Facebook report that the echo chamber is not as insular as many might fear, at least not on the social network. While independent researchers said the study was important for its scope and size, they noted several significant limitations.

After analyzing how more than 10 million of the most partisan users of the social network navigated the site over six months last year, researchers found that people’s networks of friends and the stories they see are skewed toward their ideological preferences. But that effect is more limited than the worst case some theorists had predicted, in which people would see almost no information from the other side.

On average, about 23 percent of users’ friends are of an opposing political affiliation, according to the study. An average of almost 29 percent of the news stories displayed by Facebook’s News Feed also appear to present views that conflict with the user’s ideology.

In addition, researchers found individuals’ choices about which stories to click on had a larger effect than Facebook’s filtering mechanism in determining whether people encountered news that conflicted with their professed ideology.

“This is the first time we’ve been able to quantify these effects,” Eytan Bakshy, a data scientist at Facebook who led the study, said in an interview. “You would think that if there was an echo chamber, you would not be exposed to any conflicting information, but that’s not the case here.”

Facebook’s findings run counter to a longstanding worry about the potential for digital filtering systems to shape our world. For Facebook, the focus is on the algorithm the company uses to determine which posts people see, and which they do not, in its News Feed.

Cass Sunstein, the Harvard law professor and President Obama’s former “regulatory czar,” worried that such recommendation engines would lead to a tailored version of news and entertainment that might be called “The Daily Me.” Eli Pariser, chief executive of Upworthy and a former director at MoveOn.org, labeled it the “Filter Bubble.” Some Facebook users have said they unfollow friends and acquaintances who post content with which they disagree.

“This shows that the effects that I wrote about exist and are significant, but they’re smaller than I would have guessed,” Pariser said in an interview about Facebook’s study.

Natalie Jomini Stroud, a professor of communication studies at the University of Texas at Austin who was not involved in the study, said the results were “an important corrective” to the conventional wisdom.

“There’s been so much hype about the algorithm and how it might be constraining what people are viewing,” she said.

The study adds to others that debate whether the Internet creates an echo chamber. A Pew Research Center report last year found that the media outlets people name as their primary sources of political news are strongly correlated with their political views. Another study, published late last year as a National Bureau of Economic Research working paper, analyzed Twitter usage during the 2012 election and found that social media often exposed users only to opinions that matched their own.

Stroud and several other researchers noted that the Facebook study has limitations. All of the users studied shared one trait: they had self-identified as liberal or conservative in their profiles. Most Facebook users do not post their political views, and Stroud cautioned that those who do might be more or less accepting of conflicting political views than typical users.

The findings are convenient for Facebook. With more than 1.3 billion users, the social network is effectively the world’s most widely read daily newspaper. About 30 percent of U.S. adults get their news from Facebook, according to the Pew Research Center. But its editorial decisions are made in a black box, with the company’s opaque News Feed algorithm deciding which of your friends’ posts you see, which you don’t, and in what order. Facebook could use the study’s results to argue that its secret algorithm is not ruining national discourse.

Facebook said its researchers were allowed wide latitude to pursue their research interests and to present whatever they found.

In addition to Bakshy, Facebook data scientists Solomon Messing and Lada Adamic worked on the study.