For years, political scientists and other social theorists have fretted about the Internet’s potential to narrow and polarize democratic discourse. Because so much information now comes through digital engines shaped by our own preferences — Facebook, Google and others suggest content based on what consumers previously enjoyed — scholars have theorized that people are building an online echo chamber of their own views.
But in a peer-reviewed study published on Thursday in the journal Science, data scientists at Facebook report that the echo chamber is not as insular as many might fear — at least not on the social network. While independent researchers said the study was important for its scope and size, they noted several significant limitations.
After analyzing how more than 10 million of the most partisan users of the social network navigated the site over a six-month period last year, researchers found that people’s networks of friends and the stories they see are in fact skewed toward their ideological preferences. But that effect is more limited than the worst case that some theorists had predicted, in which people would see almost no information from the other side.
On average, about 23 percent of users’ friends are of an opposing political affiliation, according to the study. An average of almost 29 percent of the news stories displayed by Facebook’s News Feed also appear to present views that conflict with the user’s own ideology.
In addition, researchers found that individuals’ choices about which stories to click on had a larger effect than Facebook’s filtering mechanism did in determining whether people encountered news that conflicted with their professed ideology.
“This is the first time we’ve been able to quantify these effects,” Eytan Bakshy, a data scientist at Facebook who led the study, said in an interview. “You would think that if there was an echo chamber, you would not be exposed to any conflicting information, but that’s not the case here.”
Facebook’s findings run counter to a longstanding worry about the potential for digital filtering systems to shape our world. For Facebook, the focus is on the algorithm that the company uses to determine which posts people see, and which they do not, in its News Feed.
Cass R. Sunstein, the Harvard law professor and President Obama’s former “regulatory czar,” worried that such recommendation engines would lead to a tailored version of news and entertainment that might be called “The Daily Me.” Eli Pariser, chief executive of Upworthy and a former director at MoveOn.org, labeled it the “Filter Bubble.” Some Facebook users have said they unfollow friends and acquaintances who post content with which they disagree.
And with political discussions becoming increasingly pitched in the run-up to a presidential election next year in which the Internet will be used as a primary campaign tool, the problem appeared to be getting worse.
“This shows that the effects that I wrote about exist and are significant, but they’re smaller than I would have guessed,” Mr. Pariser said in an interview about Facebook’s study.
Natalie Jomini Stroud, a professor of communications studies at the University of Texas at Austin, who was not involved in the study, said the results were “an important corrective” to the conventional wisdom. “There’s been so much hype about the algorithm and how it might be constraining what people are viewing,” she said.
The study adds to others that debate whether the Internet creates an echo chamber. A Pew Research Center report last year found that the media outlets people name as their prime sources of information about politics and news are strongly correlated with their political views. Another study late last year, published as a National Bureau of Economic Research working paper, analyzed Twitter usage during the 2012 election and found that social media often exposed users only to opinions that matched their own.
Dr. Stroud and several other researchers noted that the Facebook study has limitations. All of the users studied were of a particular type: those who had self-identified as liberal or conservative in their profiles. Most of Facebook’s users do not post their political views, and Dr. Stroud cautioned that those users might be either more or less accepting of conflicting political views.
The findings are convenient for Facebook. With more than 1.3 billion users, the social network is effectively the world’s most widely read daily newspaper. About 30 percent of American adults get their news from the social network, according to the Pew Research Center. But its editorial decisions are made in a black box, with the company’s opaque News Feed algorithm deciding which of your friends’ posts you see, which you don’t and in what order. Facebook could use the study’s results to argue that its secret algorithm is not ruining national discourse.
Facebook said its researchers were allowed wide latitude to pursue their research interests and to present whatever they found.
Facebook also noted that this study was substantively different from one that caused an outcry last year, in which the company’s scientists altered the number of positive and negative posts that some people saw to examine the effects on their mood. The new study did not involve an experiment that changed users’ experience of Facebook; researchers analyzed how people use the site as it stands today.