The Polarizing Effect Of Social Media Algorithms
Some characteristics of "snowflake" culture include, among many others, strict adherence to "cancel culture": the withdrawal of support from any figure viewed as "problematic" for actions past or present. Another characteristic is the passing of judgment in kangaroo-court style, where verdicts are handed down and the accused are summarily executed, metaphorically of course, with no consideration for either context or their side of the story.
According to rightists and some centrists, close scrutiny of "snowflake" culture reveals that it is rooted in feelings of sanctimony. "Snowflakes" always want to be politically correct and to display the superiority of their moral standing, behavior commonly referred to as "virtue signaling".
Even as a centrist who refrains from using labels like "snowflake", there is no denying that it is next to impossible to have any constructive political discourse on social media platforms like Twitter and Facebook. It is common, in fact practically a given, to find left-leaning and right-leaning groups at each other's throats even on topics where one would expect some commonality of opinion; they always find a way to become completely polarized.
I have written in the past about the filter bubble, a state of intellectual isolation that social media users fall into because of the way the algorithms managing these networks work. The filter bubble creates an "us vs. them" mentality between the two polarized groups, which further intensifies the hostility between them. Could the same filter bubble that is responsible for far-right extremism and rising cases of mental health ailments on social media also be responsible for the propagation of "virtue signaling" and "snowflake" culture? The affirmative seems very likely.
The filter bubble is efficient at creating intellectual isolation because humans are social creatures by nature and by design. We also naturally possess a flaw known as "confirmation bias": as a person, you are more inclined to welcome opinions and perspectives that are in line with those you already hold, and to be hostile to anything else. Social media algorithms amplify our confirmation bias by connecting us only with people who hold the same beliefs as our own. They do this by "suggesting" that you follow people and topics in line with what you already believe or like, consequently leaving you hostile to everything else.
[Image: Examples of confirmation bias]
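The suggestion mechanism described above can be illustrated with a deliberately simplified sketch. This is a hypothetical toy, not the code of any real platform: it ranks candidate accounts purely by how much they overlap with the interests a user has already expressed, so dissimilar voices never make it into the recommendations at all.

```python
def recommend(user_interests, candidates, top_n=2):
    """Rank candidate accounts by overlap with the user's existing interests.

    Accounts sharing no interests with the user are filtered out entirely,
    which is the narrowing effect that feeds the filter bubble.
    """
    def overlap(topics):
        return len(user_interests & topics)

    ranked = sorted(candidates.items(), key=lambda kv: overlap(kv[1]), reverse=True)
    return [name for name, topics in ranked[:top_n] if overlap(topics) > 0]


# Hypothetical user and accounts for illustration only.
user = {"left_politics", "climate"}
accounts = {
    "leftist_pundit":  {"left_politics", "climate"},
    "green_activist":  {"climate"},
    "rightist_pundit": {"right_politics", "markets"},
}

print(recommend(user, accounts))  # ['leftist_pundit', 'green_activist']
```

The account with zero overlap is never suggested, no matter how many slots are available; over time, a user following only these suggestions sees an ever more uniform feed.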
Applied to social media, this means that users inside a filter bubble will always want to conform to the views of the collective bubble even when they privately disagree. In political discussions, a rightist who agrees with some leftist view may ignore that agreement and instead act on the impulses of a group that disagrees with any and every leftist sentiment. The opposite also holds: a "snowflake" might not actually be offended by something, but because the bubble they inhabit on a given network is offended, or "virtue signaling", they come out as offended too in order to "fit in".
The only way social media users can ensure they do not fall victim to the polarizing effect of social media algorithms is by being as open-minded as possible themselves. Follow and interact with topics and people whose views differ from your own. Being attentive to differing opinions does not mean you have to adopt them, but it does give you more knowledge of other people's perspectives and fosters empathy. As the author Mark Manson once wrote, "it is the mark of an educated person to be able to entertain an idea that differs from their own".
The constant squabbling between opposing political camps is just one example of the polarizing effect of social media algorithms. Social media companies, however, care little about educating their users on this effect, because the more hostile people are to each other, the more activity there is and the more money the companies make. The onus therefore falls on users to break the cycle: to understand that they cannot develop empathy by always looking at issues from one perspective, and to ensure some diversity in the social media content they consume, be it politics, sports, or anything else.