Monday, August 1, 2016

Thoughts on "Exposure to ideologically diverse news and opinion on Facebook"


Summary

Ranking algorithms on social media have long been controversial among researchers across disciplines; the debates over the filter bubble and the echo chamber effect are famous examples. State-of-the-art data mining and machine learning techniques may, in fact, reinforce these phenomena. As Facebook's ranking algorithm becomes smarter and smarter, your feed fills up with the content you already prefer, lacking diversity and multiple voices, or, even worse, gets manipulated for commercial or private purposes.

In this paper, which Facebook published in Science, the company responds to this issue for the first time with a massive real-world data set. The findings are: 1) Stronger ideological alignment comes with higher share counts. In other words, articles with a strong perspective get re-shared more, and, not surprisingly, by users of the same alignment (i.e., liberal users tend to re-share liberal articles, and vice versa). 2) Friendship is homophilous: users with similar ideological affiliations tend to friend each other on Facebook. The data show a clear pattern in which both liberals and conservatives have fewer friend ties across ideological lines, in other words, less diversity. 3) The cross-cutting percentage drops as exposure narrows. More specifically, if a user could randomly browse everything shared on Facebook, roughly 40-45% of it would be cross-cutting; the rate drops sharply once content is limited to what friends share, what the algorithm ranks into the feed, and what users themselves click. Most interestingly, the paper concludes that the lower diversity in reading and sharing behavior is mainly due to individuals' own choices.
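To make this exposure funnel concrete, below is a minimal Python sketch of how a cross-cutting rate could be computed at each narrowing stage. The stage weights and sample sizes are fabricated for illustration; they are not the paper's estimates, only chosen so the numbers shrink the way the paper describes.

    import random

    random.seed(0)

    def crosscutting_rate(ideologies, user_ideology):
        """Fraction of seen articles whose ideology differs from the user's."""
        if not ideologies:
            return 0.0
        return sum(i != user_ideology for i in ideologies) / len(ideologies)

    user = "liberal"

    # Stage 1: all content shared on the network, roughly balanced.
    all_shared = random.choices(["liberal", "conservative"], k=10_000)

    # Stage 2: a homophilous friend circle shares mostly same-side content.
    from_friends = random.choices(["liberal", "conservative"],
                                  weights=[0.75, 0.25], k=2_000)

    # Stage 3: the feed ranking further favors content the user engages with.
    in_feed = random.choices(["liberal", "conservative"],
                             weights=[0.80, 0.20], k=500)

    # Stage 4: the user's own clicks are the most selective filter of all.
    clicked = random.choices(["liberal", "conservative"],
                             weights=[0.85, 0.15], k=100)

    for stage, seen in [("random browsing", all_shared),
                        ("shared by friends", from_friends),
                        ("ranked into feed", in_feed),
                        ("clicked by user", clicked)]:
        print(f"{stage:>18}: {crosscutting_rate(seen, user):.0%} cross-cutting")

Running this prints a shrinking percentage at each stage, mirroring the paper's observation that random browsing would yield roughly 40-45% cross-cutting exposure, while each later filter (friends, ranking, the user's own clicks) removes more of it.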

This is valuable research, since it is the first to reveal these patterns in detail from Facebook's real-world data set. However, I disagree with the conclusion they draw, and thinking it through suggests other research topics we could pursue. Here are my reasons: 1) Putting the responsibility on users is not fair, because most users have no clue how the algorithm behind the system affects their future information consumption. For example, if the ranking algorithm penalizes the score of content you saw but did not "like" or "share", say, a news article, then the articles you ignore will slowly disappear from your feed. The mechanism is not transparent at all: users never learn that some content was pre-filtered because of something they once ignored. I question whether "ignoring" or "not liking" really represents a user's dislike (a toy sketch of this feedback loop follows this paragraph). 2) There is no way for users to understand or join the loop of algorithmic processing. Users basically follow the system's suggestions, in a very "user-friendly" and "simple" way. There should be either an explanation or an "undo" function so that users can maintain diverse content consumption of their own free will. Also, when you decide to unfollow or dislike something, should the interface ask you to confirm, just as your computer does before permanently deleting a file? 3) Users deserve controllability. Why is there only one personalized ranking algorithm for all the different kinds of users? I think users have the right to choose the preferences they like, rather than having them decided by unknown experts or machine learning algorithms. Only under conditions like these would it be fair to claim that the lower diversity is due to users' own click behavior.
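To illustrate the feedback loop in point 1, here is a toy Python model of an engagement-based ranking penalty. This is my own hypothetical mechanism, not Facebook's actual algorithm: the decay factor, boost, and visibility cutoff are all invented for the sketch.

    # Hypothetical engagement-driven feedback loop: content a user sees but
    # never likes or shares loses ranking score until it stops appearing.
    # Not Facebook's actual algorithm; all constants are assumptions.

    DECAY = 0.7    # assumed penalty per impression without engagement
    BOOST = 1.5    # assumed reward for a like or share
    CUTOFF = 0.2   # assumed score below which content is no longer shown

    def update_score(score, engaged):
        """One impression: boost on engagement, decay when ignored."""
        return score * (BOOST if engaged else DECAY)

    # A news source the user silently ignores, day after day.
    score = 1.0
    for day in range(1, 8):
        score = update_score(score, engaged=False)
        if score < CUTOFF:
            print(f"day {day}: score={score:.2f} -> silently dropped from the feed")
            break
        print(f"day {day}: score={score:.2f} -> still shown")

The point of the sketch is that "ignoring" gets treated as "disliking": after a handful of silent impressions the source disappears, and nothing in the interface ever tells the user that this pre-filtering happened.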

The three points above are, in my view, potential research topics. If we think of Facebook's ranking as a recommender system, the same discrimination and low-diversity issues may be happening right under our noses. Furthermore, potential conflicts of interest of researchers from industry should be disclosed. I admit that researchers at big companies have more resources for answering some questions about social phenomena than a laboratory environment does; Google's flu-trend prediction and this Facebook study of ideological diversity across real-world users are examples. However, commercial companies are accountable to their stockholders, not to the public. I believe this is where academic researchers have the advantage of playing a neutral role with respect to their research subjects, and it is also the value of building a small-scale system and running controlled experiments.
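As a small pointer for the recommender-system angle above, one common way to quantify how diverse a recommendation list is uses Shannon entropy over item categories. The feeds below are fabricated for illustration; the metric itself is a standard information-theoretic measure.

    import math
    from collections import Counter

    def list_entropy(categories):
        """Shannon entropy (in bits) of the category mix in a recommendation
        list; 0 means every item is the same, higher means more diverse."""
        n = len(categories)
        return -sum((c / n) * math.log2(c / n)
                    for c in Counter(categories).values())

    # Two hypothetical feeds for the same user.
    personalized = ["liberal"] * 9 + ["conservative"]     # heavily filtered
    balanced = ["liberal"] * 5 + ["conservative"] * 5     # user-chosen mix

    print(f"personalized feed: {list_entropy(personalized):.2f} bits")
    print(f"balanced feed:     {list_entropy(balanced):.2f} bits")

A metric like this, shown to users or audited by outsiders, is exactly the kind of controllability knob argued for in point 3 above.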


  1. Bakshy, Eytan, Solomon Messing, and Lada A. Adamic. "Exposure to ideologically diverse news and opinion on Facebook." Science 348.6239 (2015): 1130-1132.
