Facebook has admitted to altering page content for hundreds of thousands of its users in 2012 as part of a psychology experiment, the results of which were published in the journal Proceedings of the National Academy of Sciences.
Dubbed 'Experimental evidence of massive-scale emotional contagion through social networks,' the research was carried out by Facebook employee Adam Kramer, with the paper co-written by Kramer, Jamie Guillory of the Center for Tobacco Control Research and Education at the University of California, San Francisco, and Jeffrey Hancock of the departments of Communication and Information Science at Cornell University. The experiment sought to prove that emotional states can be transferred from person to person via what the researchers term 'emotional contagion' - but the research was carried out without the knowledge of those being experimented upon.
During 2012, the team altered the Facebook News Feed of 0.4 per cent of Facebook users to see 'whether exposure to emotions led people to change their own posting behaviours, in particular whether exposure to emotional content led people to post content that was consistent with the exposure - thereby testing whether exposure to verbal affective expressions leads to similar verbal expressions, a form of emotional contagion.' Although 0.4 per cent may not sound like much, Facebook's massive user base meant that 689,003 individual users - all drawn from the company's English-language audience - were affected during the experiment.
The modifications made by Kramer during the research saw content rated as emotionally negative pushed down the News Feed in favour of more positive content. Although no posts were deleted, the altered prioritisation made it significantly less likely that a user unknowingly participating in the trial would see posts with negative emotional content.
Although the research, which can be read in full via a PDF download, appears largely benign, many of the company's users are up in arms about being used as guinea pigs. While Facebook's policies on data usage allow for research, they specifically restrict this to internal operations - something a publicly-published paper co-written with two non-Facebook researchers clearly is not.
Kramer has apologised for 'the way the paper described the research and any anxiety it caused,' but has not apologised for carrying out the research without giving selected users the chance to opt out, nor for sharing the results outside Facebook in apparent breach of the company's data usage policies. Facebook itself has issued a statement claiming that 'we carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely,' while also pointing out that the data used in the study was not personally identifiable.