Does content curation by Facebook introduce ideological bias? by David Lazer, Science, published online May 7, 2015, DOI: 10.1126/science.aab1422: “Humanity is in the early stages of the rise of social algorithms: programs that size us up, evaluate what we want, and provide a customized experience. This quiet but epic paradigm shift is fraught with social and policy implications. The evolution of Google exemplifies this shift. It began as a simple deterministic ranking system based on the linkage structure among websites—the model of algorithmic Fordism, where any color was fine as long as it was black. The current Google is a very different product, personalizing results on the basis of information about past searches and other contextual information, like location. In this week’s Science Express, Bakshy et al. explore whether such personalized curation on Facebook prevents users from accessing posts presenting conflicting political views.

The rise of the social algorithm is rather less transparent than the post–Model T choice in automobiles. Today’s social algorithms are so complex that no single person can fully understand them. It is illustrative in this regard to consider that Bakshy et al. are Facebook researchers studying the impact of Facebook algorithms. You might imagine that they could just go into the next building and look directly at the code. However, looking at the algorithms will not yield much insight, because the interplay of social algorithms and behaviors yields patterns that are fundamentally emergent. These patterns cannot be gleaned from reading code.

Social algorithms are often quite helpful; when searching for pizza in Peoria, it helps not to get results about Famous Ray’s in Manhattan. However, personalization might not be so benign in other contexts, raising questions about equity, justice, and democracy. Bakshy et al. focus on the last, asking whether the curation of news feeds by Facebook undermines the role that Facebook plays as a forum for public deliberation.

For the Facebook-uninitiated, much of the activity of Facebook is in the form of news that users post to their feed, which their friends have some access to and can like and comment on. When you open Facebook, you see a list of recent posts by friends; however, you typically will not see all posts, which are algorithmically sorted. The rationale for such curation is that in its absence, users would be deluged by uninteresting content from their friends. Facebook tries to pick out the gems from the detritus, anticipating what you will like and click on. But what are we missing? And are these computational choices troubling?…”
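Lazer’s description of early Google as “a simple deterministic ranking system based on the linkage structure among websites” is, in essence, PageRank. A minimal Python sketch of that idea, with an illustrative damping factor, iteration count, and toy graph (none of these are Google’s actual parameters), makes the “Fordist” quality concrete: the scores depend only on the link graph, so every user sees the same ranking.

```python
# Minimal power-iteration sketch of link-based ranking in the spirit of
# early PageRank. Damping factor and iteration count are illustrative.

def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    """Score pages from the link structure alone; no per-user signals."""
    pages = set(links) | {t for ts in links.values() for t in ts}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page in pages:
            targets = links.get(page, [])
            if targets:
                # Each page passes a damped share of its rank to its outlinks.
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
            else:
                # Dangling page: spread its rank uniformly over all pages.
                for t in pages:
                    new_rank[t] += damping * rank[page] / n
        rank = new_rank
    return rank

# Toy graph: "a" links to "b" and "c"; both link back to "a".
print(pagerank({"a": ["b", "c"], "b": ["a"], "c": ["a"]}))
```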
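The personalized curation Bakshy et al. study works differently: each candidate post is scored per user by a predicted-engagement model, and only the top scorers are shown. The sketch below is hypothetical; the features, weights, and linear scorer are stand-ins invented for illustration, and, as Lazer stresses, the real system’s behavior is emergent and cannot be read off the code.

```python
# Hypothetical sketch of personalized feed curation: score posts with a toy
# predicted-engagement model and keep the top k. Features and weights are
# invented for illustration; this is not Facebook's actual model.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    age_hours: float

def engagement_score(post: Post, affinity: dict[str, float],
                     interest: dict[str, float]) -> float:
    """Toy linear model: affinity to the author plus interest in the
    topic, discounted by the post's age. Weights are illustrative."""
    return (2.0 * affinity.get(post.author, 0.0)
            + 1.0 * interest.get(post.topic, 0.0)
            - 0.1 * post.age_hours)

def curate(candidates: list[Post], affinity: dict[str, float],
           interest: dict[str, float], k: int = 2) -> list[Post]:
    """Show the k posts the model predicts this user will engage with;
    the rest are quietly dropped, which is the curation at issue."""
    ranked = sorted(candidates,
                    key=lambda p: engagement_score(p, affinity, interest),
                    reverse=True)
    return ranked[:k]
```

The contrast with the first sketch is the point: the link-based ranker returns the same ordering for everyone, while this one drops different posts for different users, which is exactly the filtering whose ideological effects Bakshy et al. measure.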
- See also the New York Times: “Facebook Study Disputes Theory of Political Polarization Among Users” by Farhad Manjoo. “Almost 29 percent of the news stories displayed by Facebook’s News Feed present views that conflict with the user’s own ideology, the study found.”