Meghan Peters, Facebook's strategic partner manager for news, introduced the study in a post in the News, Media & Publishing on Facebook group, writing:
As people’s exposure to news increases on social networks, so do concerns about the creation of echo chambers. That’s why we looked at how social networks, algorithmic ranking and individuals’ choices affect the diversity of content that people in the U.S. encounter on Facebook.
Our findings show that people are exposed to a substantial amount of content from friends with opposing viewpoints. When it comes to News Feed, 28.9 percent of the hard news encountered cuts across ideological lines. It’s the composition of our social networks and individual choice that most affect the mix of content we see.
More details follow from the report, which is embedded below:
We found that people have friends who claim an opposing political ideology, and that the content in people's News Feeds reflects those diverse views. While News Feed surfaces content that is slightly more aligned with an individual's own ideology (based on that person's actions on Facebook), whom they friend and what content they click on are more consequential than the News Feed ranking in determining how much content they encounter that cuts across ideological lines.
- On average, 23 percent of people’s friends claim an opposing political ideology.
- Of the hard news content that people's friends share, 29.5 percent cuts across ideological lines.
- When it comes to what people see in the News Feed, 28.9 percent of the hard news encountered cuts across ideological lines, on average.
- 24.9 percent of the hard news content that people actually clicked on was cross-cutting.
This is the first time we've been able to quantify these effects. You would think that if there were an echo chamber, you would not be exposed to any conflicting information, but that's not the case here.
Readers: What did you think of Facebook’s findings?
Image courtesy of Shutterstock.