Facebook Proves Itself To Be Another CIA-Managed Media Experiment


Facebook let shrinks MESS WITH YOUR HEAD, sans permission

T&C click allowed boffins to re-order news feed for emotional response experiment

By Richard Chirgwin, 29 Jun 2014

Facebook has let psychology researchers mess with users’ feeds to deliberately manipulate their emotions – along the way letting loose the suggestion that such behaviour is routine, which is how the project got through ethics committees.

In 2012, the researchers, led by Facebook data scientist Adam Kramer, manipulated which of their friends’ posts a sample of nearly 700,000 users could see in their News Feed, suppressing either positive or negative posts to test whether cheerful or downbeat posts were emotionally contagious.

With only a user’s “agree click” on Facebook’s terms and conditions document to provide a fig-leaf of consent to the creepy experiment, researchers from Cornell University and the University of California, San Francisco manipulated users’ news feeds.

Let’s hear from the university:

“The researchers reduced the amount of either positive or negative stories that appeared in the news feed of 689,003 randomly selected Facebook users, and found that the so-called ‘emotional contagion’ effect worked both ways.”

Cornell Social Media Lab professor Jeff Hancock reports the simple correlation that turned up: “People who had positive content experimentally reduced on their Facebook news feed, for one week, used more negative words in their status updates. When news feed negativity was reduced, the opposite pattern occurred: Significantly more positive words were used in people’s status updates.”

The study, which was published by PNAS here (the US National Academy of Sciences seems similarly untroubled by the ethical concerns surrounding the experiment), describes the methodology like this in its abstract:

“We test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks” (emphasis added).
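For the curious, here is roughly what that manipulation amounts to mechanically. The sketch below is ours, not Facebook’s: the paper says only that posts containing emotional words were probabilistically omitted from the News Feed, so the word lists, the omission probability and the function names are all illustrative assumptions.

    import random

    # Stand-in word lists. The study used the LIWC dictionaries to label
    # posts as emotional; these tiny sets are purely illustrative.
    POSITIVE_WORDS = {"happy", "great", "love"}
    NEGATIVE_WORDS = {"sad", "awful", "hate"}

    def post_sentiment(text):
        """Label a post positive, negative or neutral by word matching."""
        words = set(text.lower().split())
        if words & POSITIVE_WORDS:
            return "positive"
        if words & NEGATIVE_WORDS:
            return "negative"
        return "neutral"

    def filter_feed(posts, condition, omit_probability=0.5):
        """Probabilistically drop posts matching the suppressed emotion.

        condition is "reduce_positive" or "reduce_negative". The real
        omission rates were not disclosed in the coverage quoted here,
        so 0.5 is an arbitrary placeholder.
        """
        target = "positive" if condition == "reduce_positive" else "negative"
        return [p for p in posts
                if post_sentiment(p) != target
                or random.random() >= omit_probability]

    feed = ["I love this!", "Feeling sad today", "Meeting at 3pm"]
    print(filter_feed(feed, "reduce_positive"))

Run against a sample feed, a user in the hypothetical “reduce_positive” condition sees the cheerful post vanish roughly half the time, while neutral and negative posts always get through.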

The researchers state clearly that the experiment “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research” (emphasis added).

Media reception of the research has been almost universally hostile – put “Facebook” together with “creepy” into Google News for a sample. Noting that the research “seems cruel”, The Atlantic says even PNAS editor Susan Fiske found the study “creepy”, but let the study pass because “Facebook apparently manipulates people’s News Feeds all the time”.

Vulture South isn’t so sure the researchers can give themselves a clean bill of health. In particular, a data use policy most users don’t read, and which would confuse many, seems inadequate as a definition of “informed consent”.

Moreover, while the data use policy does, as the researchers state, permit the use of data for “research”, the context doesn’t seem in full agreement with the researchers’ interpretation: “we may use the information we receive about you”, it states, “for internal operations, including troubleshooting, data analysis, testing, research and service improvement”.

In other words, “research” is offered not as a blanket permission, but as an example of “internal operations”.

The Atlantic notes that, as a private company, Facebook isn’t bound by the ethical rules that constrain academic research – a latitude that does not extend to the two universities involved, nor to the National Academy of Sciences.

Kramer has sought to explain himself in this post. He reassuringly claims that “our goal was never to upset anyone”.

Kramer adds that “the research benefits of the paper may not have justified all of this anxiety”, and says the experiment only adjusted the emotional content of News Feeds in the positive direction.

He said posts weren’t technically “hidden”, since users could still find friends’ posts by visiting those friends’ pages. The issue of consent, however, remains unaddressed by Kramer.

Even the methodology of the study has come in for criticism. In this summary of the story, The Atlantic points to a critique by John Grohol at Psych Central.

Grohol says the research used textual analysis tools that were unsuited to the task set for them – discerning mood by counting emotional words extracted from posts – and describes the results as a “statistical blip that has no real-world meaning”.
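To see why such tools struggle with status updates, consider a crude word-count mood score of the kind Grohol describes. This sketch is ours, not the study’s code; the word lists and the scoring rule are illustrative assumptions, but the failure modes they expose are exactly his point.

    # A deliberately naive word-count mood score: emotional-word hits
    # as a fraction of all words in the post.
    POSITIVE_WORDS = {"happy", "great", "love"}
    NEGATIVE_WORDS = {"sad", "awful", "hate"}

    def mood_score(text):
        """Return (positive hits - negative hits) / word count."""
        words = text.lower().split()
        pos = sum(w in POSITIVE_WORDS for w in words)
        neg = sum(w in NEGATIVE_WORDS for w in words)
        return (pos - neg) / max(len(words), 1)

    # Negation defeats the counter: an unhappy post scores as positive.
    print(mood_score("I am not happy"))        # 0.25
    # Most short updates contain no dictionary words at all and score 0,
    # so tiny average shifts across millions of posts say little about
    # any individual's mood.
    print(mood_score("Meeting moved to 3pm"))  # 0.0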
