The Facebook's Emotional Contagion Study

Date: 2021-03-06

This essay has been submitted by a student. This is not an example of the work written by our professional essay writers.

The emotional contagion study was carried out in 2012 by Facebook and Cornell University to examine how positive and negative content in the News Feed affects users' emotions. The study ran for one week and revealed that the posts in a user's News Feed influence that user's emotions, which in turn affect what the user posts (Kramer et al., para. 1-17). The study, conducted on a sample of 689,003 users, manipulated the feeds in users' accounts without their knowledge. It raised several ethical issues and became a subject of debate among researchers: some considered the study unethical, while others held that it was ethical.

Some argue that Facebook breached the professional ethics it owes to the public by inducing feelings contrary to what its users had anticipated. According to these researchers, the study was morally wrong and socially unacceptable. Facebook's management did not inform the 689,003 users that they were being used as a sample population for the study (Tufekci, para. 1-14). The management allowed the researchers to presume consent without considering the views and perceptions of the users. Professionally, the proponents claim, this is unethical; socially, it degrades the importance of the consumer and the autonomy one has over one's own feelings. While these objections carry some weight, this essay will argue that the study was ethical, focusing on the importance of algorithms in the lives of Facebook users.

To begin with, the algorithm on Facebook is an important tool for the Facebook user. Its purpose is to control the information displayed and the updates on the user's News Feed page. It also relates to users' emotions by filtering out information that runs contrary to the user's beliefs (Kramer et al., para. 1-17). It tracks the posts that users publish on their pages, the posts they like among their friends' posts, and the comments made by other users. By doing so, the Facebook algorithm controls the information that users use to socialize, either negatively or positively. This is precisely what prompted the researchers to carry out the study: to find out the effects of negative and positive content on what users subsequently posted.
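The filtering described above can be pictured with a minimal sketch. Everything here is an illustrative assumption: the function names, weights, and the idea of a numeric "valence" score are invented for clarity and do not represent Facebook's actual algorithm.

```python
# Hypothetical sketch of a News Feed filter: rank posts by simple engagement
# signals (prior likes and comments on the author) plus an emotional-valence
# term, then keep only the top-scoring posts. All weights are illustrative.

def score_post(post, user_likes, user_comments):
    """Score a post from engagement signals plus an emotional-valence term."""
    score = 0.0
    if post["author"] in user_likes:      # the user has liked this friend before
        score += 2.0
    if post["author"] in user_comments:   # the user has commented on this friend
        score += 1.5
    score += post["valence"]              # favor posts with a congenial tone
    return score

def build_feed(posts, user_likes, user_comments, limit=2):
    """Return the top-scoring posts; the rest are filtered out of the feed."""
    ranked = sorted(
        posts,
        key=lambda p: score_post(p, user_likes, user_comments),
        reverse=True,
    )
    return ranked[:limit]

posts = [
    {"author": "alice", "valence": 0.8},
    {"author": "bob",   "valence": -0.5},
    {"author": "carol", "valence": 0.1},
]
feed = build_feed(posts, user_likes={"alice"}, user_comments={"carol"})
# "bob", whose post carries negative valence and no prior engagement,
# is filtered out of the two-slot feed.
```

The point of the sketch is only that a small change to the valence weight shifts which posts survive the cut, which is exactly the lever the 2012 study manipulated.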

Secondly, research done in 2004 showed that News Feed algorithms improve the experience of new and inactive members who have few posts on their Facebook pages (Forlani, para. 1-20). The rule previously in use held that the more posts a user added to their profile, the more of other subscribers' posts appeared in their News Feed. It is therefore clear that, despite the criticisms, the study was beneficial: the improvements made to the algorithm helped the social lives of inactive users, since less of their information is filtered out.

The algorithm also determines who will view a user's post. It does this by tracking the friends on the user's page the instant a status is posted. For example, when a user posts on their page, all friends, followers, and accounts the user follows can see the information posted. When a user has set their account to private, the algorithm asks the user to confirm whether their posts should be visible to all friends or only to specific friends. This is advantageous in that it ensures the privacy the user expects is maintained.

Researchers have over time concluded that the algorithm diminishes the diversity of ideas and the amount of cross-cutting information that people see on their Facebook pages. A user, for example, will see few items they hold an opinion against and more of the news they support. According to Christian Sandvig, the algorithm filters out roughly one out of twenty cross-cutting hard-news stories for conservative users and one in thirteen for liberal users. The effect on users' social lives is that information they might have considered important is withheld. This filtering makes users miss out on diverse information that could have benefited their social lives (Kramer et al., para. 1-17).
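The suppression rates above can be made concrete with a small arithmetic sketch. The helper function and the sample total of 260 stories are assumptions chosen only to make the "one in twenty" versus "one in thirteen" figures reported in the essay easy to compare.

```python
# Illustrative arithmetic for the reported suppression rates: roughly one in
# twenty cross-cutting hard-news stories hidden for conservative users, and
# one in thirteen for liberal users. The total of 260 stories is hypothetical.

def visible_stories(total, suppress_one_in):
    """Stories left visible after hiding one in every `suppress_one_in`."""
    suppressed = total // suppress_one_in
    return total - suppressed

conservative_seen = visible_stories(260, 20)  # 260 - 13 = 247 stories visible
liberal_seen = visible_stories(260, 13)       # 260 - 20 = 240 stories visible
```

Even on these rates the filtering is modest, which is consistent with the essay's claim that the algorithm's drawbacks do not outweigh its benefits.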

The arguments of those who consider the study unethical seem wanting, far-fetched, and an impediment to technological progress, especially to developments in social networking. First, the proponents' sentiment regarding consent seems utopian: realistically, hardly anyone would knowingly volunteer to be a guinea pig, whether the experiment were free or otherwise. Moreover, for 689,003 people to serve as subjects in the emotional contagion experiment, Facebook's management would have been expected to notice and curb any serious irregularities.

Moreover, researchers have pointed out that most online companies, including Facebook, routinely run such studies both for their own benefit and as social experiments. Requiring a contract for each study, for example, would have incurred legal fees and consumed a great deal of time, even though the main agenda of the research was simply to improve the quality of service given to customers. These researchers hold that for such studies online companies do not need the users' consent; studies of this kind have been done for years, and no one objected until Facebook conducted its study and published it in a scientific journal. They also believe that by involving scientists from Cornell University, Facebook made the study public and transparent as well as beneficial for future research, and they warn that a backlash against the study would dissuade online companies from involving scientists in their future studies.

In summary, it is incorrect to call the one-week 2012 study unethical, given the positive roles of the News Feed algorithms explained above. Accordingly, it is sensible to offer appraisal where it is due rather than point fingers on the basis of such weak objections as the supposed wrongness of manipulating people's feelings. Whatever the weight of the criticism, the numbers do not lie, and the results of a project validate its operations. Since the project in question concerns social networking, the question of consent loses its force when the consumer is the main unit of analysis for examining the quality of the services offered.

Works Cited

Forlani, Christina. 'Three Changes to Facebook's Algorithm.' We Are Social, 2015. Available at <http://wearesocial.net/blog/2015/05/facebooks-algorithm/>. Accessed 2 Oct. 2015.

Kramer, A. D. I., J. E. Guillory, and J. T. Hancock. 'Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks.' Proceedings of the National Academy of Sciences 111.24 (2014): 8788-8790. Available at <http://www.pnas.org/content/111/24/8788.full#xref-ref-2-1>. Accessed 2 Oct. 2015.

Tufekci, Zeynep. 'How Facebook's Algorithm Suppresses Content Diversity (Modestly) & How the Newsfeed Rules the Clicks the Message.' Medium, 2015. Available at <https://medium.com/message/how-facebook-s-algorithm-suppresses-content-diversity-modestly-how-the-newsfeed-rules-the-clicks-b5f8a4bb7bab>. Accessed 2 Oct. 2015.

 
