Many concerns have been raised about the potential for political, religious, and ideological tunnel vision on social media. Facebook uses an algorithm that surfaces the posts and content you're most likely to engage with. Logically, you might expect this to create an echo chamber of your own opinions, reinforcing them over time.
However, a study recently published in Science challenges that assumption. The study captured data from over 10 million Facebook users, all of whom had listed their political views in their profile details. The researchers monitored the content posted by their friends, what ended up in the users' News Feeds, and which links were clicked to determine whether tunnel vision is an actual problem.
Surprisingly, Facebook users had an average of 23% of friends with opposing political views. The researchers found that an average of 29% of stories appearing in News Feeds contained opposing political content, and 25% of the links clicked led to opposing political views. Interestingly, the study found that the primary driver of exposure to cross-cutting content is user choice rather than Facebook's ranking algorithm: the algorithm reduced politically opposing content in the News Feed by only about 1%. Users, meanwhile, clicked on only 6% (liberals) to 17% (conservatives) of the opposing articles delivered to their News Feeds. Thus, opposing articles are still being displayed; users simply click primarily on articles that agree with their political affiliations.
While the study seems promising at first blush, Christian Sandvig of the Social Media Collective identified some issues and limitations with it. One of the primary shortcomings he pointed out is that the study only followed Facebook users who had self-identified their political viewpoints, which apparently encompasses just 4% of Facebook's overall user base. As a result, Sandvig labels the study as misleading:
It turns out that only 9% of Facebook users do that. Of those that report an affiliation, only 46% reported an affiliation in a way that was “interpretable.” That means this is a study about the 4% of Facebook users unusual enough to want to tell people their political affiliation on the profile page. That is a rare behavior.
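The 4% figure quoted above follows directly from multiplying the two reported rates: 9% of users list an affiliation, and 46% of those list it in an interpretable way. A quick sanity check of that arithmetic (the variable names here are illustrative, not from the study):

```python
# Sanity check: the "4%" sample figure is the product of the two
# reported rates quoted above.
self_report_rate = 0.09    # share of users listing a political affiliation
interpretable_rate = 0.46  # share of those whose affiliation was "interpretable"

study_sample_share = self_report_rate * interpretable_rate
print(f"{study_sample_share:.1%}")  # → 4.1%
```

So roughly 4 in every 100 Facebook users were even eligible for inclusion in the study.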
Given how rare that behavior is, it's hard not to take this study with a bucket of salt (i.e., quite a few grains). Individuals who publicly proclaim their political affiliations are more likely to engage in political debates, and that engagement with cross-cutting content posted by friends would only increase the amount of cross-cutting content surfaced by Facebook's algorithm. Likes, clicks, and comments are the primary factors determining what Facebook places in your News Feed. Thus, a sample of politically active users engaging in a variety of political conversations does not paint a clear picture of the actual impact on typical users. For this study to be trustworthy, researchers would need a large test group and an effective way to identify political affiliation without limiting the sample to users with self-proclaimed affiliations.
What are your thoughts on the study? Do you think it is misleading? Let us know in the comments below.