
Facebook hit by privacy furore over emotion experiment

Almost 700,000 Facebook users had their News Feeds modified without their knowledge or consent in 2012 as part of an experiment into emotional contagion published in the PNAS journal.

Facebook has admitted to altering page content for hundreds of thousands of its users in 2012 as part of a psychology experiment, the results of which were published in the Proceedings of the National Academy of Sciences journal.

Titled 'Experimental evidence of massive-scale emotional contagion through social networks,' the research was carried out by Facebook employee Adam Kramer, with the paper co-written by Kramer, Jamie Guillory of the Center for Tobacco Control Research and Education at the University of California San Francisco, and Jeffrey Hancock of the departments of Communication and Information Science at Cornell University. The experiment sought to prove that emotional states can be transferred from person to person via what the researchers term 'emotional contagion' - but the research was carried out without the knowledge of those being experimented upon.

During 2012, the team altered the Facebook News Feeds of 0.4 per cent of Facebook users to see 'whether exposure to emotions led people to change their own posting behaviours, in particular whether exposure to emotional content led people to post content that was consistent with the exposure - thereby testing whether exposure to verbal affective expressions leads to similar verbal expressions, a form of emotional contagion.' Although 0.4 per cent may not sound like much, Facebook's massive user-base meant that 689,003 individuals - all drawn from the company's English-language users - were affected during the experiment.

The modifications made by Kramer during the research saw content rated as emotionally negative pushed down the News Feed in favour of more positive content. Although no posts were deleted, this altered prioritisation made it significantly less likely that a user unknowingly participating in the trial would see posts with negative emotional content.
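
In practical terms, the manipulation amounts to a re-ranking pass over each affected user's feed rather than the removal of anything. The paper does not publish Facebook's actual code, so the short Python sketch below is purely illustrative: the Post structure, the word lists and the score_sentiment() heuristic are invented stand-ins for whatever classifier and ranking system were really used.

# Illustrative sketch only - not Facebook's code; every name here is invented for the example.
from dataclasses import dataclass

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

@dataclass
class Post:
    author: str
    text: str
    timestamp: float  # larger value = more recent post

def score_sentiment(text: str) -> int:
    """Crude word-count sentiment: +1 per positive word, -1 per negative word."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    return sum(w in POSITIVE_WORDS for w in words) - sum(w in NEGATIVE_WORDS for w in words)

def rerank_feed(posts: list[Post]) -> list[Post]:
    """Keep every post, but demote those scored as emotionally negative."""
    # Negative posts (True sorts after False) drop below the rest;
    # within each group, newer posts still come first.
    return sorted(posts, key=lambda p: (score_sentiment(p.text) < 0, -p.timestamp))

feed = [
    Post("alice", "Had a terrible, awful day", 3.0),
    Post("bob", "Such a great weekend, so happy", 2.0),
    Post("carol", "Lunch was fine", 1.0),
]
for post in rerank_feed(feed):
    print(post.author, "->", post.text)

Run as written, the two non-negative posts print first and the negative one last; nothing is dropped, which mirrors the point above that negative posts were demoted rather than deleted.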

Although the research, which can be read in full via a PDF download, appears largely benign, many of the company's users are up in arms about being used as guinea pigs. While Facebook's policies on data usage allow for research, they specifically restrict it to internal operations - something a publicly published paper co-written with two non-Facebook researchers clearly is not.

Kramer has apologised for 'the way the paper described the research and any anxiety it caused,' but has not apologised for carrying out the research without giving the selected users the chance to opt out, nor for sharing the results outside Facebook in apparent breach of the company's data usage policies. Facebook itself has issued a statement claiming that 'we carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely,' while also pointing out that the data used in the study was not personally identifiable.

13 Comments

Corky42 30th June 2014, 11:25 Quote
I don't see why people are up in arms about being used as guinea pigs, it's not like names have been named, it doesn't seem any harm was done by tinkering with the news feeds.

I would be more concerned with the government monitoring their own citizens on social media, or "research" companies using our tax and medical records for financial gain.
GreekUser 30th June 2014, 13:14 Quote
What is really annoying about it is that we never find out about all the other experiments that Facebook does for itself. And even more that none seem to care about being a guinea pig for that multinational, personal-information-collecting company...
Impatience 30th June 2014, 13:20 Quote
I'd prefer to know what i'm taking part in.. Possibly. At least i'd know what I may be helping contribute towards!
ArcAngeL 30th June 2014, 13:21 Quote
although i don't particularly care... i can see how this is not cool.

Scenario: Fiona has depression; her Facebook feed is then targeted to potentially bring on further depression, becoming the icing on the cake to her potential self-harm.
nightblade628 30th June 2014, 13:59 Quote
Personally, I could do with a feature that drops negative posts further down my News Feed. In fact, if they could find a way to completely eliminate the "OMG my life iz so unfare mum buyed me da mercades an i wanted da bmW FML #whatislife #the99% #YOLO" posts I'd pay Facebook myself. The day they allow me to press a button and have those people punched with a deck chair that flies out of their computer screen is the day I can die laughing a happy man.
Corky42 30th June 2014, 14:00 Quote
Quote:
Originally Posted by Impatience
I'd prefer to know what i'm taking part in.. Possibly. At least i'd know what I may be helping contribute towards!

But then that could have made the results invalid.
Quote:
Originally Posted by ArcAngeL
Scenario: Fiona has depression; her Facebook feed is then targeted to potentially bring on further depression, becoming the icing on the cake to her potential self-harm.

If they had pushed negative results to the top you would have a point, but as they did the opposite they made the world a happier place.
Anfield 30th June 2014, 14:35 Quote
Quote:
Originally Posted by nightblade628
In fact, if they could find a way to completely eliminate the "OMG my life iz so unfare mum buyed me da mercades an i wanted da bmW FML #whatislife #the99% #YOLO" posts I'd pay Facebook myself.

Just be more selective who you add as a friend and refuse anyone you haven't known for a long time in real life.
Stanley Tweedle 30th June 2014, 14:37 Quote
Or request your facebook account be deleted and do other stuff instead like ride bicycle and grow watermelon.
Cheapskate 30th June 2014, 19:34 Quote
[foil hat] Anyone else troubled that a group that likely had a government grant is studying ways to manipulate the population?[/foil hat]
mi1ez 1st July 2014, 00:30 Quote
Quote:
Originally Posted by Corky42
I don't see why people are up in arms about being used as guinea pigs, it's not like names have been named, it doesn't seem any harm was done by tinkering with the news feeds.

This is pretty much exactly my thinking. They're constantly manipulating newsfeeds for their own gain, so the fact they're doing it for a social experiment is, if anything, an improvement!
mi1ez 1st July 2014, 00:31 Quote
Quote:
Originally Posted by Anfield
Quote:
Originally Posted by nightblade628
In fact, if they could find a way to completely eliminate the "OMG my life iz so unfare mum buyed me da mercades an i wanted da bmW FML #whatislife #the99% #YOLO" posts I'd pay Facebook myself.

Just be more selective who you add as a friend and refuse anyone you haven't known for a long time in real life.

But she promised me naked pictures!
debs3759 1st July 2014, 20:57 Quote
Quote:
Originally Posted by Corky42
I don't see why people are up in arms about being used as guinea pigs, it's not like names have been named, it doesn't seem any harm was done by tinkering with the news feeds.

I don't care whether they have named any of their guinea pigs. The fact is they decided what news people were allowed to view for the duration of the experiment. I do not believe anybody has the right to tell me what I can see on my newsfeed. My Facebook friends are real life friends. Pages I like and groups I subscribe to I do so because I want the information they share. What gives anyone the right to tell me which posts I can view?

It's bad enough that they keep changing my news view to what they call the top stories. Surely they don't think I am incapable of deciding for myself which are the best stories?
Quote:
Originally Posted by Impatience
I'd prefer to know what i'm taking part in.. Possibly. At least i'd know what I may be helping contribute towards!

Likewise. I do not want to be used in any experiment that I did not give permission for, except perhaps something psychopharmacological :)
Corky42 2nd July 2014, 07:13 Quote
Quote:
Originally Posted by debs3759
I don't care whether they have named any of their guinea pigs. The fact is they decided what news people were allowed to view for the duration of the experiment. I do not believe anybody has the right to tell me what I can see on my newsfeed. My Facebook friends are real life friends. Pages I like and groups I subscribe to I do so because I want the information they share. What gives anyone the right to tell me which posts I can view?

It's bad enough that they keep changing my news view to what they call the top stories. Surely they don't think I am incapable of deciding for myself which are the best stories?

But they did not decide what news people could view; they just pushed the positive news stories to the top of the list. As it says in the article, "content rated as emotionally negative pushed down the News Feed in favour of more positive content."

Unlike some governments, they did not dictate what you were allowed to view; they didn't block any content, it was all still there, just lower down the list.