
How Social Media Has The Potential To Turn You Into A Fanatic

Representational image.

Have you ever realised that the news, articles and content you see on your social media and news suggestion feeds reinforce a particular belief you already hold or agree with? I have recently noticed that I see little to no content that challenges my beliefs, and I do not like it. I did get enraged at being manipulated by a social media giant, but then I realised I have practically no control over any of my activity online, and if I stayed angry about that, I would be angry forever.

It's no mystery why this happens – social media algorithms are at work studying every move we make on these platforms, harvesting data of all kinds: which ads you click on, your login timings, the locations you log in from, the stickers you send, the apps you use – the list is long.

Heck, Facebook admitted last year that it even tracks mouse movements on the screen to tell bots apart from humans, along with other information about the device being used, to help deliver personalised content. Oh, and these apps can ask for access to your webcam and mic. At this stage, our right to data privacy is as real as India being a Hindu state – it’s a myth. Meanwhile, our data is being distributed like mithai during a festival.

Filter Bubbles

All these reams of data and their harvesting are not in vain, though. Platforms use them to deliver “personalised” content right to your virtual doorstep, i.e. your feed. This activity, known as “algorithmic editing,” has the potential to turn you into a closed-minded extremist with zero tolerance for opposing views.

These “filter bubbles” (I love this term – thank you, Eli Pariser) are “intellectual bubbles” that the algorithm can put you in by showing you the content you are most likely to consume and engage with, based on a number of factors such as your search history, past click records and so on. This isolates you from anything that might oppose your point of view, placing you in an ideological bubble while feeding and nurturing your existing beliefs.

Filter Bubbles. (Photo via spreadprivacy)

For example, I don’t ever remember seeing any articles that support Trump on my Instagram explore page or Facebook feed, because by now “it” knows I don’t care for the man’s politics, or whatever you want to call what he is up to.

At the same time, I see a lot of posts about the brutal annihilation of forests and the extreme human rights violations that take place in India. I rarely see posts about a genuinely helpful government scheme that was implemented successfully over the last five years. It took me a while to believe it, but there are quite a few.
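To make this concrete, here is a minimal, purely illustrative sketch of engagement-based ranking – not any platform’s actual algorithm, and with entirely hypothetical posts, topic tags and click history. It scores candidate posts by how much their topics overlap with what a user has already clicked on, so posts the user has never engaged with quietly sink out of sight.

# A toy feed ranker, for illustration only. It is NOT any platform's real
# algorithm; the posts, tags and click history below are made up.
from collections import Counter

def interest_profile(click_history):
    # Count how often each topic tag appears in the posts the user has clicked.
    profile = Counter()
    for post in click_history:
        profile.update(post["tags"])
    return profile

def score(post, profile):
    # A post scores higher the more its tags overlap with past clicks.
    return sum(profile[tag] for tag in post["tags"])

def rank_feed(candidates, click_history, top_n=2):
    # Keep only the posts the user is "most likely to engage with".
    profile = interest_profile(click_history)
    return sorted(candidates, key=lambda p: score(p, profile), reverse=True)[:top_n]

# Hypothetical user who has only ever clicked on critical coverage.
clicks = [
    {"title": "Forests under threat", "tags": ["environment", "critique"]},
    {"title": "Rights violations report", "tags": ["rights", "critique"]},
]
candidates = [
    {"title": "Another critical investigation", "tags": ["critique", "politics"]},
    {"title": "Deforestation in numbers", "tags": ["environment"]},
    {"title": "Government scheme succeeds", "tags": ["governance", "success"]},
    {"title": "An opposing viewpoint", "tags": ["debate", "opposing-view"]},
]
for post in rank_feed(candidates, clicks):
    print(post["title"])
# Prints the critique and environment posts; the success story and the
# opposing viewpoint score zero and never surface, even though nothing in
# the code explicitly censors them.

Real recommendation systems are vastly more sophisticated, but the feedback loop is the same: what you clicked on yesterday shapes what you are shown today, which shapes what you click on tomorrow.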

Here’s why this is a problem (if you haven’t figured it out already). There is a term in psychology known as “confirmation bias”: a cognitive bias that inclines us to seek out and favour information that affirms our existing beliefs, ideologies and hypotheses. It happens a lot, to all of us, in everyday life. People are biased towards their own beliefs, and even ambiguous events or situations are interpreted to fit and confirm this bias.

No matter what you believe, you will find daily evidence to support your theories to make them stronger. At the same time, any opposition to your views will be labeled as a “special case” or an “exception” and dismissed.


Polarisation

The story of what happens on social media is not very different. The content on our feeds mirrors our thoughts and convictions, and we lap it up, all the while convinced that our point of view is the right one or, worse, the only right one. This turns our digital source of information into an echo chamber, leaving little to no room for our thoughts and opinions to be challenged by alternative views, known as “disconfirming evidence.”

This leads to polarisation: two or more groups of people circle within their own bias bubbles, making no effort to seek information from the other side.


(Technology such as social media) “lets you go off with like-minded people, so you’re not mixing and sharing and understanding other points of view… It’s super important. It’s turned out to be more of a problem than I, or many others, would have expected.”
— Bill Gates in Quartz, 2017

This is dangerous. Do you know what this means? It means that racists find racists, nationalists find nationalists, sexists find sexists and casteists find casteists, and their beliefs get stronger, with very little in the content they consume to dispute those beliefs. Even though such beliefs may be formed offline, they are fostered online. Phenomena like collective cyber-racism are on the rise, and there is also a school of thought which proposes that social media has led to a rise in hate crime.

What happens if we don’t recognise this and work on it? Long story short, you become a fanatic, or worse, an extremist.

Fake News

This hyper-personalisation of content, paired with the father of all biases, confirmation bias, leads people to believe absolutely anything that fits their thought moulds, even fake news.

Fake news in India is so pervasive that WhatsApp is being tweaked to counter it, Facebook is working on a feature that will alert users to it, and politicians do… well, whatever they do, like spend millions of rupees spreading fake news to win elections.

Most people, especially while scrolling through a news feed, process information in the fastest way possible. We use mental shortcuts to do so; without them, we wouldn’t get through a single day without losing it, considering the amount of information thrown our way every minute.

Some of these shortcuts are biases like confirmation bias, and relying on them can create “blind spots,” where we can’t rationally and accurately perceive or process information. We don’t even bother to verify information that fits our views and buy into it without questioning it, while being far less likely to believe information that doesn’t agree with us. What even, brain?

Fake news and nationalism. (Photo: Statista)

Threat To Democracy

Some rightly argue that the mixture of algorithmic editing and confirmation bias is a threat to any democracy. They say it removes alternative and diverse viewpoints, leaving people unable to make fully informed political decisions. Although users also choose to block out content they don’t want to see, the narrowing of content by algorithms is a major culprit. Then again, we believe all the lies politicians tell us to our faces; they don’t even need social media for that, so…

What’s The Solution?

There is, however, a way to combat this bias. We’re not all lost causes, not yet. The key is to seek out disconfirming evidence, or beliefs that oppose our own, and actually learn more about them. It is very possible that some of our perspectives are not exactly rational. Once that awareness comes, there is a way to do something about it, and do something we must, whether the biases in question are personal or political.

India currently runs on this fuel of hatred, casteism, communalism, and polarisation. Don’t let social media and your biases make you believe this is okay. Don’t let it convince you that this is all that is happening, either – yes, it works both ways.

Confirmation bias, on social media and beyond it, also affects your mental health and can lead to negativity, anxiety and depression, so curate your social media in a way that gives you a realistic world view, not a one-sided one.

Bias begets bias, and the cycle is never-ending; it has to be broken, and soon. We don’t need any more fanatics, especially not judges who think peacocks reproduce through tears. I wonder what that judge’s social media feed looks like.
