The study, published in the journal Science and based on an analysis of 10 million Facebook users and seven million shared web links, found that many of the shared stories exposed people to viewpoints different from their own.

The findings appeared to allay concerns in some quarters that social networks drive political polarization by grouping people along ideological lines and shielding them from opposing views.

The research, led by data scientists from Facebook and the University of Michigan, contained numerous caveats about identifying users' ideological bent and the political leanings inherent in news stories, but nonetheless suggested that fears of a Facebook information bubble were overblown.

Facebook has been under special scrutiny because it uses algorithms that aim to deliver relevant stories to each user based on their interests.

In a statement, Facebook said the study highlights that its users are getting a variety of viewpoints.

"We found that most people have friends who claim an opposing political ideology, and that the content in people's News Feeds reflect those diverse views," the social network said.

"News Feed surfaces content that is slightly more aligned with an individual's own ideology, however the friends you choose and the content you click on are more important factors than News Feed ranking in terms of how much content you encounter that cuts across ideological line."