Facebook Manipulated Our Emotions, And We Didn’t Even Know About It!

Posted on July 14, 2014 in Specials

By Nishant Chhinkwani:

Friends, Indians, netizens – lend me your ears. I have come here to praise Facebook, not to slander it. Since Facebook was conceived, like a dream within a dream within a dream (remember, the idea originally belonged to Divya Narendra, who brought in the six-foot-four, massively buffed-up peas in a pod, the Winklevoss twins, who then brought in Mark Zuckerberg as the programmer; Zuckerberg wasted no time in snatching away that potential billion-dollar idea and using it as his own, and today that idea has a market cap of $170 billion), it has captured the imagination of young users and old alike, taking one's entire life experience and putting it online for others to see. It took Harvard by storm, where it played its exclusivity card oh so cunningly, luring students to beg their friends for an invite to sign up. Soon it spread like the plague through schools across North America and Europe, and everyone wanted to be infected. Asia and Latin America did not offer much resistance, and pretty soon Facebook was everywhere. Today it has a staggering 1.28 billion monthly users, and it watches the activities of us willing fools like a towering Big Brother – always behind the scenes, but eerily omnipresent.


But Facebook is a social ‘best’; it helps connect over a billion people on a common platform, with so much to choose from and so much to gain!

One out of six people in the world uses Facebook – an enviably vast pool of subjects to choose from; a social researcher’s veritable wet dream.

If you thought only the members of the opposite sex (or of the same sex or whoever your preference might be) could play with your emotions, think again.

Recently, it has come to light that Facebook manipulated the newsfeeds of over half a million randomly chosen users, changing the number of positive and negative posts that they saw. All this was part of a psychological ‘academic’ study to examine the spread of emotions on social media. And all of it was done without the informed consent of the users, who had no clue whatsoever that their newsfeeds were being manipulated to gauge their emotional responses.

But Facebook is a social ‘best’; it helps connect over a billion people on a common platform, with so much to choose from and so much to gain!

Facebook has defended itself by pointing out that its terms of use state that it can ‘modify’ the newsfeeds of its users. It says these tests were carried out to understand people’s emotional responses on social media, so that it could understand its users better and improve the quality of the service it provides.

Improve the quality of its service at the cost of manipulating someone’s mood by incessantly bombarding their newsfeed with posts tailored for extreme happiness, sadness, or listlessness – just because it can?

Legally – borderline correct.

Ethically – completely wrong.

But Facebook is a social ‘best’; it helps connect over a billion people on a common platform, with so much to choose from and so much to gain!

I’ll be honest: I am hooked on Facebook. You could even say I’m addicted to social media and that I live a substantial part of my life on it. I have trusted Facebook with my personal information, photographs, memories, and emotional responses, only to find out that I may have been manipulated by it – that it might be responsible for those horrible days when my newsfeed completely ruined my mood and neutered my creativity, all without my consent or permission. It is a subtle but sure violation of human rights. Even scarier is the fact that Facebook has defended itself and will probably get away with it. Because the user is still going to continue using Facebook.

Because Facebook is a social ‘best’; it helps connect over a billion people on a common platform, with so much to choose from and so much to gain!

Youth Ki Awaaz is an open platform where anybody can publish. This post does not necessarily represent the platform's views and opinions.