Facebook released the results of a social experiment in which its data scientists manipulated the positive and negative emotional content appearing in the news feeds of nearly 700,000 users over the course of a week. The experiment is facing a global outcry for invading users' privacy and manipulating their emotions.
Facebook did this in order to study users' reactions to positive and negative content. The study found evidence of "emotional contagion" — in other words, the emotional tone of the posts users saw influenced what they subsequently posted themselves.
Facebook, as we know, is a social networking website headquartered in Menlo Park, California. Its name comes from an idiom for the directory given to students at some American universities. Facebook was founded on February 4, 2004, by Mark Zuckerberg along with his college roommates and fellow Harvard University students Eduardo Saverin, Andrew McCollum, Dustin Moskovitz, and Chris Hughes, whom Zuckerberg later bought out to gain control of the company.
Facebook is being criticized worldwide for running a social experiment on people as if they were rats in a lab. Facebook is learning the hard way that with great data comes great responsibility: such experiments should either be avoided entirely or not be revealed publicly, or the company will face an outcry from its users.
What Facebook actually did in the social experiment — Facebook removed emotional messages for some users. It did not, as many people seem to be assuming, add content specifically intended to induce specific emotions. Some users were shown more positive news — births, anniversaries, happy moments of life — while others were fed more negative events — deaths, break-ups, accidents — drawn from their friends lists and groups/pages. Facebook did not change any content; it only tweaked the algorithm for 700,000 users so that some people saw more positive and some more negative content.
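The mechanism described above — dropping a fraction of emotional posts rather than injecting new ones — can be sketched in a few lines. This is a purely hypothetical illustration, not Facebook's actual code; the word lists, function names, and the 50% omission probability are all assumptions made for the example.

```python
import random

# Hypothetical sentiment word lists for the illustration only.
POSITIVE_WORDS = {"happy", "birth", "anniversary", "love"}
NEGATIVE_WORDS = {"death", "breakup", "accident", "sad"}

def tone(post):
    """Crudely classify a post as positive, negative, or neutral."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress="positive", omit_prob=0.5, rng=None):
    """Drop each post of the suppressed tone with probability omit_prob.

    Nothing is added to the feed; posts of other tones pass through
    unchanged, mirroring the removal-only design of the experiment.
    """
    rng = rng or random.Random()
    return [p for p in posts
            if tone(p) != suppress or rng.random() >= omit_prob]
```

The key point the sketch captures is that the manipulation is subtractive: a user in the "suppress positive" condition simply sees fewer happy posts, never fabricated sad ones.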
But it is hardly credible to suggest that removing 10%–90% of emotional content in favor of neutral content constitutes a potentially dangerous manipulation of people's subjective experience.
It is shocking to note that Facebook did this secretly. It may be legal, but it is not ethical to make people sad by feeding them more negative news, and it could also harm oversensitive individuals, who may become depressed as a result.
Facebook defends the study on two grounds. First, tech companies are constantly making changes that apply to some people and not others. Second, Facebook's primary business is advertising; the very point of ads is to influence emotions and behavior, and they often use data and psychological insights to do so. The news feed isn't a product that's somehow "neutral" — it's constructed with a particular goal in mind.
It is interesting to note that Facebook routinely adjusts its users' news feeds — testing the number of ads they see or the size of the photos, links, and videos that appear — often without their knowledge. It is all, the company says, for the purpose of creating a more alluring and useful product.
Sociologist Elizabeth Popp Berman writes: "FB experiments with what it shows you in order to understand how you will react. That is how they stay in business."
This is ultimately an ethical question: conducting research carries ethical obligations that go beyond what we would normally expect of a company.
Facebook's sheer size and ubiquity mean that it has access to more data, and that any changes it makes have far more impact, than just about any other researcher could imagine. The intimacy and scale of that relationship are among the reasons the reaction has been so intense.
Facebook's response to the global outcry: "This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."
How Facebook Actually Works & What Happened in the Experiment
The researchers found that moods were contagious. The people who saw more positive posts responded by writing more positive posts. Similarly, seeing more negative content prompted the viewers to be more negative in their own posts.
Although academic protocols generally call for getting people’s consent before psychological research is conducted on them, Facebook didn’t ask for explicit permission from those it selected for the experiment. It argued that its 1.28 billion monthly users gave blanket consent to the company’s research as a condition of using the service.
What you see is chosen by a mysterious algorithm that takes into account hundreds of factors, such as how often you comment on your Aunt Sally’s photos, how much your friends are talking about a colleague’s post about her new job, and whether you always watch those cat videos.
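The weighted-factor idea described above can be sketched as a toy scoring function. The real algorithm and its weights are proprietary and far more complex; the factor names and numbers below are illustrative assumptions only.

```python
# Hypothetical sketch of feed ranking: each post gets a score from a
# weighted sum of engagement signals, and the feed is sorted by score.

def rank_feed(posts):
    """Order posts by a weighted score of illustrative engagement signals."""
    def score(post):
        return (3.0 * post.get("comments_from_you", 0)    # you often comment on this friend's posts
                + 2.0 * post.get("friend_engagement", 0)  # your friends are talking about it
                + 1.5 * post.get("media_affinity", 0))    # you tend to watch this type of content
    return sorted(posts, key=score, reverse=True)
```

The point of the sketch is that tweaking even one weight silently changes what everyone sees — which is exactly the kind of lever the experiment pulled.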
Facebook also solicits direct feedback. On the desktop version of the site, for example, clicking the arrow at the top-right corner of any post reveals an option to "Make News Feed better" by rating your satisfaction with various posts, so that users see more of what they like in their feeds.
In the end, the question arises: people want Facebook to be an unrestricted, free platform for sharing with their loved ones, not a manipulated place where they are puppets of Facebook's algorithm. The more restricted and controlled Facebook becomes in pursuit of higher revenues, the more people will move to platforms where they can express themselves without being controlled.