IS 677 Group 1.

Did Facebook and PNAS violate human research protections in an unethical experiment?
Facebook's experiment made news feeds happier or sadder to manipulate people's emotions. Facebook has been experimenting on us.
A new paper in the Proceedings of the National Academy of Sciences reveals that Facebook intentionally manipulated the news feeds of almost 700,000 users in order to study "emotional contagion through social networks," writes Slate's Katy Waldman. The researchers, who are affiliated with Facebook, Cornell, and the University of California–San Francisco, tested whether reducing the number of positive messages people saw made those people less likely to post positive content themselves.
The same went for negative messages: would scrubbing posts with sad or angry words from someone's Facebook feed make that person write fewer gloomy updates? They tweaked the algorithm by which Facebook sweeps posts into members' news feeds, using a program to analyze whether any given textual snippet contained positive or negative words. The upshot?

Facebook emotion study breached ethical guidelines, researchers say.
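The word-level check described in the Slate excerpt — a program that decides whether a post contains positive or negative words — can be sketched as follows. This is a minimal illustration, not the study's actual code; the tiny word lists are hypothetical stand-ins for the much larger sentiment dictionaries such a system would use.

```python
import re

# Illustrative stand-in word lists; the real study relied on large
# sentiment dictionaries, not these few hypothetical examples.
POSITIVE = {"happy", "love", "great", "wonderful", "nice"}
NEGATIVE = {"sad", "angry", "hate", "awful", "gloomy"}

def classify(post):
    """Label a post by whether it contains positive and/or negative words."""
    words = set(re.findall(r"[a-z']+", post.lower()))
    labels = set()
    if words & POSITIVE:
        labels.add("positive")
    if words & NEGATIVE:
        labels.add("negative")
    return labels

print(classify("What a wonderful day!"))   # -> {'positive'}
print(classify("Feeling sad and gloomy"))  # -> {'negative'}
```

A filter built on such labels could then down-weight posts in one category when assembling a user's news feed, which is the kind of manipulation the study describes.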
Researchers have roundly condemned Facebook's experiment, in which it manipulated nearly 700,000 users' news feeds to see whether it would affect their emotions, saying it breaches ethical guidelines for "informed consent".
James Grimmelmann, professor of law at the University of Maryland, points out in an extensive blog post that "Facebook didn't give users informed consent" to decide whether to take part in the study, as required under US human subjects research rules. "The study harmed participants," because it changed their mood, Grimmelmann comments, adding, "This is bad, even for Facebook."

Was the Facebook emotion experiment unethical?
By Bethany Albertson and Shana Gadarian, July 1
There has been extensive news coverage of Facebook's manipulation of news feed content to alter the balance of positive and negative words, and of the accompanying academic paper published in the Proceedings of the National Academy of Sciences.
As two researchers who study emotions, we have watched the fallout from this study with great interest. While many critics have wondered whether the study's researchers had approval from a university Institutional Review Board (IRB), the watchdog body that oversees faculty research involving human subjects (the short answer is: sort of), this focus is misguided. IRBs are not guardians of ethics; they are protectors of institutional legal liability. If we can agree that evaluating the Facebook research isn't as simple as determining whether the researchers had IRB approval, then what else might we need to consider?

Was Facebook's 'Emotional Contagion' Experiment Ethical?
An academic study has come under criticism because its authors manipulated Facebook users' news feeds in order to gather data.
The researchers, including one who worked for Facebook, acknowledged last week that they studied the parallel between an individual's emotions and the emotions portrayed in a news feed by manipulating the feeds of about 700,000 users. Over one week in January 2012, researchers removed "positive" posts from some users' news feeds and "negative" posts from others, to see whether doing so had an effect on the users' moods. The authors of the study have drawn criticism for failing to ensure that the study was consensual, for violating users' privacy, and for manipulating users' lives. The authors defend themselves, saying the method is permitted by Facebook's Data Use Policy.
The study found that "emotional contagion occurs without direct interaction between people ... and in the complete absence of nonverbal cues."

Four Ethical Issues of the Information Age
By Richard O. Mason
Today in western societies more people are employed collecting, handling and distributing information than in any other occupation. Millions of computers inhabit the earth, and many millions of miles of optical fiber, wire and airwaves link people, their computers and the vast array of information-handling devices together. Our society is truly an information society, our time an information age. The question before us now is whether the kind of society being created is the one we want.

Facebook could face €100,000 fine for holding data that users have deleted.
Facebook could face a fine of up to €100,000 (£87,000) after an Austrian law student discovered the social networking site held 1,200 pages of personal data about him, much of which he had deleted.
Max Schrems, 24, decided to ask Facebook for a copy of his data in June after attending a lecture by a Facebook executive while on an exchange programme at Santa Clara University in California. Schrems was shocked when he eventually received a CD from California containing messages and information he says he had deleted from his profile in the three years since he joined the site. After receiving the data, Schrems decided to log a list of 22 separate complaints with the Irish data protection commissioner, which next week is to carry out its first audit of Facebook.
He wrote to Ireland after discovering that European users' accounts are administered by Facebook's Irish subsidiary.

Facebook Faces Investigation and £500,000 Penalty in UK for Secret Manipulation of News Feeds
Social networking giant Facebook's secret study to analyse users' emotional responses could face investigation in the United Kingdom for breaching data privacy laws.
According to a Financial Times report, the Information Commissioner's Office (ICO) in the UK plans to quiz Facebook about the 2012 experiment in which the news feeds of nearly 700,000 users were manipulated by engineers. According to an ICO spokesperson, it is too early to say which part of data protection law Facebook may have infringed by conducting its experiment. The ICO can conduct independent investigations of individuals or organisations involved in data or privacy law breaches, and the regulator can also impose fines of up to £500,000, reports Reuters. Companies found in breach during the probe would also need to change their data protection and privacy policies.

Facebook Admits Social Experiment Should Have Been 'Done Differently'
Facebook announces new guidelines for psych experiments
Facebook has admitted that the 2012 psychological experiment it conducted on its users without their knowledge should have been "done differently", but added that it will continue to use the platform for research.
In a blog post Thursday, Facebook CTO Mike Schroepfer said, with regard to the infamous happiness experiment, that future social science research would be governed by new guidelines. In June of this year, Facebook published the results of a 2012 study in which the social network altered the news feeds of nearly 700,000 users, showing them alternately more positive or more negative posts to see what emotional effect that had. It turned out that people who were shown more negative content posted more negative comments, while viewing positive posts elicited positive comments.
Schroepfer said the company was "unprepared" for the backlash that followed.

Experimental evidence of massive-scale emotional contagion through social networks
Edited by Susan T. Fiske, Princeton University, Princeton, NJ, and approved March 25, 2014 (received for review October 23, 2013)

Significance
We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.
Abstract
Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.
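The contagion effect the abstract summarizes was assessed through the words users themselves produced — whether seeing fewer positive (or negative) posts changed the share of emotional words in their own status updates. A minimal sketch of that per-user outcome metric, using hypothetical word lists and example posts rather than the study's actual data:

```python
import re

# Hypothetical word lists and posts, for illustration only.
POSITIVE = {"happy", "love", "great"}
NEGATIVE = {"sad", "angry", "hate"}

def emotion_word_rates(posts):
    """Return the percentage of a user's words that are positive and negative."""
    words = [w for p in posts for w in re.findall(r"[a-z']+", p.lower())]
    if not words:
        return 0.0, 0.0
    pos = 100 * sum(w in POSITIVE for w in words) / len(words)
    neg = 100 * sum(w in NEGATIVE for w in words) / len(words)
    return pos, neg

pos_rate, neg_rate = emotion_word_rates(["so happy today", "i love this"])
print(round(pos_rate, 1), round(neg_rate, 1))  # -> 33.3 0.0
```

Comparing these rates between users whose feeds were filtered and users whose feeds were not is, in outline, how an experiment of this design would detect contagion.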