While looking for new ideas on how to perform well in a debate, I came across the videos of Jordan Peterson on YouTube. I found it interesting how this Canadian psychologist defends his ideas in debates on controversial issues, even though I do not necessarily agree with all of them. So I watched some of his videos. YouTube kept suggesting more and more videos of Peterson “destroying feminists”, “sharing thoughts on Hitler, religion, transgender people”, etc. Judging by the comments, it was obvious to me that his videos are popular mainly with a particular segment of society. Then YouTube’s algorithm thought that I might also be interested in Ben Shapiro, an American commentator and fervent advocate of conservative ideas, and even Nigel Farage, leader of the Brexit Party.
There is nothing astonishing about this episode for many of us who discover new things every day thanks to social media algorithms. Still, I reflected on its possible adverse effect, namely reinforcing ideological polarization in society, given that social media plays an increasingly important role in the consumption of news and information. Without any intention of entering the scientific debate on this issue, I considered it through two opposing arguments that explain social media’s role in ideological polarization[1].
The first argument suggests that content proposed to users according to their previous behavior puts them into so-called filter bubbles, exposing them more and more to similar interpretations and ideological biases. In my case, the more I watched the suggested videos, the more I received, following a commercial logic, videos with similar content or videos liked by users with similar profiles.
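This feedback loop can be illustrated with a toy simulation. To be clear, this is not YouTube’s actual algorithm, whose workings are proprietary; it is a minimal sketch, under the assumption that a recommender mostly serves more of whatever category the user already engages with most, with occasional random exploration. All names and parameters below are illustrative.

```python
import random
from collections import Counter

CATEGORIES = ["politics", "music", "sports", "cooking", "science"]

def recommend(history, rng, explore=0.1):
    """Toy recommender: usually push the category the user watches most."""
    if history and rng.random() > explore:
        # "exploit": serve more of the user's dominant category
        return Counter(history).most_common(1)[0][0]
    # "explore": occasionally suggest a random category
    return rng.choice(CATEGORIES)

def simulate(first_watch, steps=200, seed=0):
    """Watch one video, then follow the recommender's suggestions."""
    rng = random.Random(seed)
    history = [first_watch]
    for _ in range(steps):
        history.append(recommend(history, rng))
    return Counter(history)

# One debate video is enough to tilt the whole feed toward politics.
counts = simulate("politics")
print(counts.most_common())
```

Running this, the starting category ends up dominating the viewing history: each recommendation reinforces the behavior that triggered it, which is the narrowing dynamic the filter-bubble argument describes.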
The second argument holds that users are more likely to interact with content that confirms their pre-existing views. Users often watch testimonials of people with whom they already agree. Furthermore, liking or sharing specific content, or following certain people, has become a political act for many users. In the case described above, YouTube’s algorithm certainly suggests videos with similar content, but it cannot prevent me from looking for alternative content or diversifying the information I consume.
A preliminary conclusion would be a synthesis of the two arguments: social media platforms are not the cause of ideological polarization, but they deepen it. Claiming the opposite would mean that without social media society would not be polarized, which appears not to be true. For decades, individuals have deliberately opted for mass-media outlets close to their political and ideological sensibilities.
Traditional media’s accusations that social media is divisive periodically resurface. Facebook seems to acknowledge the issue and has taken a range of measures to alleviate its effects, among them recalibrating the news feed and restricting content recommendations.
[1] See: Dominic Spohr, “Fake news and ideological polarization: Filter bubbles and selective exposure on social media,” Business Information Review, 2017, Vol. 34(3), pp. 150–160.
Photo: Chris Barbalis