
Studies Keep Finding That Social Media Algorithms Don't Increase Polarization. Why Is the Press So Skeptical?


New research on Facebook before the 2020 election finds scant evidence to suggest algorithms are shifting our political views.
New research looking at Facebook in the run-up to the 2020 election finds scant evidence that social media algorithms are to blame for political polarization, extremism, or belief in misinformation. The findings are part of a project in which Meta opened its internal data to academic researchers. The results of this collaboration will be published in 16 papers, the first four of which just appeared in the journals Science and Nature.
One of the studies found that switching users from an algorithmic feed to a reverse chronological feed—something suggested by many social media foes as the responsible thing to do—actually led to users seeing more political content and more potential misinformation. The change did lead to seeing less content "classified as uncivil or containing slur words" and more content "from moderate friends." But none of these shifts made a significant difference in terms of users' political knowledge, attitudes, or polarization levels.
"Algorithms are extremely influential in terms of…shaping their on-platform experience," researcher Joshua Tucker, co-director of the Center for Social Media and Politics at New York University, told The . Despite this, "we find very little impact in changes to people's attitudes about politics and even people's self-reported participation around politics."
Another of the experiments involved limiting re-shared content in some users' feeds. Reshares—a measure of social media virality—"is a key feature of social platforms that could plausibly drive" political polarization and political knowledge, the researchers suggest. Users who saw no reshared content for three months did wind up having less news knowledge, as well as lower engagement with the platform and less exposure to "untrustworthy content." But it did not make a difference in political attitudes or polarization levels.
Nor did increasing users' exposure to ideologically diverse views—as another of the experiments did—wind up significantly shifting "affective polarization, ideological extremity, candidate evaluations and belief in false claims."
Taken together, the studies strike a strong blow against the "zombie bite" theory of algorithmic exposure, in which people are passive vessels easily infected by divisive content, fake news, and whatever else social media platforms throw at them.
They’re the latest in a long line of papers and reports casting doubt on the now-conventional wisdom that social media platforms—and particularly their algorithms—are at fault for a range of modern political and cultural problems, from political polarization to extremism to misinformation and much more.