By Tim Bradshaw in San Francisco
Facebook has angered users after it emerged that a psychology experiment was conducted on hundreds of thousands of the social network’s members without their awareness or consent.
A week-long study of more than 689,000 Facebook users in 2012 found that those who were exposed to fewer positive stories when they visited the site were more likely to write negative posts, and vice versa.
“The experiment manipulated the extent to which people were exposed to emotional expressions in their news feed,” the researchers wrote in their report. “These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
The research, which was published earlier this year, was jointly authored by Adam Kramer, who works in Facebook’s core data science team, Jamie Guillory, a postdoctoral fellow at the University of California, San Francisco, and Jeffrey Hancock, a professor at Cornell University.
“Given the massive scale of social networks such as Facebook, even small effects can have large aggregated consequences,” the researchers concluded. “Online messages influence our experience of emotions, which may affect a variety of offline behaviours.”
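The mechanics described in the paper are simple to sketch: posts were classified as positive or negative using a word-count tool (the study used LIWC), and a random share of one emotional category was then withheld from treated users’ feeds. The Python sketch below is purely illustrative; the toy lexicons, function names and probabilities are hypothetical stand-ins, not Facebook’s code.

    # Illustrative sketch, not Facebook's implementation: classify each
    # post's emotional tone with a crude word-count heuristic, then omit
    # a random fraction of posts of one tone from a treated user's feed.
    import random

    POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}   # toy lexicon
    NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate"}      # toy lexicon

    def tone(post_text):
        """Label a post positive, negative or neutral by word lookup."""
        words = set(post_text.lower().split())
        if words & POSITIVE_WORDS:
            return "positive"
        if words & NEGATIVE_WORDS:
            return "negative"
        return "neutral"

    def filtered_feed(posts, omit_tone="positive", omit_prob=0.5, seed=None):
        """Return a feed with a random share of one emotional tone withheld."""
        rng = random.Random(seed)
        return [p for p in posts
                if tone(p) != omit_tone or rng.random() >= omit_prob]

    posts = ["What a wonderful day", "Traffic was terrible", "Meeting at 3pm"]
    print(filtered_feed(posts, omit_tone="positive", omit_prob=1.0, seed=42))
    # -> ['Traffic was terrible', 'Meeting at 3pm']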
A report about the research in New Scientist magazine prompted a backlash this weekend, with most of the complaints centred on the fact that participants in the study did not give informed consent.
“Facebook manipulated the emotions of its users. Unethical? Yes. 1984? Yes,” tweeted Jacob Shiach, founder of Brightwork CoResearch, a research space in Houston, Texas, referencing the George Orwell novel about an oppressive totalitarian government.
Susan Fiske, a professor of psychology at Princeton University who edited the study for publication in the Proceedings of the National Academy of Sciences, said that she initially had ethical concerns. “I think part of what’s disturbing for some people about this particular research is you think of your news feed as something personal,” she told The Atlantic magazine. “I had not seen before, personally, something in which the researchers had the co-operation of Facebook to manipulate people.”
Facebook defended the study, saying “none of the data used was associated with a specific person’s Facebook account” and there is “no unnecessary collection of people’s data” in such initiatives. “We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible,” Facebook said. “A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process.”
The researchers said the study was just one of many tests Facebook conducts to improve the “ranking algorithm” that determines what content is shown in the news feed. This is not the first time Facebook has faced complaints over how it filters content in the news feed, most users’ main channel for viewing updates from friends and from brands’ fan pages.
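A feed “ranking algorithm” of this kind generally scores each candidate post on a handful of signals and sorts by the result. The sketch below shows that general score-and-sort shape; every feature and weight in it is a hypothetical illustration, not a description of Facebook’s actual system.

    # Hedged sketch of a score-and-sort feed ranker. The signals and
    # weights are invented for illustration only.
    from dataclasses import dataclass

    @dataclass
    class Post:
        author_affinity: float   # how close the viewer is to the author
        engagement: float        # likes/comments the post has drawn
        age_hours: float         # time since posting

    def score(post):
        recency = 1.0 / (1.0 + post.age_hours)   # newer posts score higher
        return (0.5 * post.author_affinity
                + 0.3 * post.engagement
                + 0.2 * recency)

    def rank_feed(posts):
        """Order candidate posts from highest score to lowest."""
        return sorted(posts, key=score, reverse=True)

    feed = rank_feed([
        Post(author_affinity=0.9, engagement=0.1, age_hours=2.0),
        Post(author_affinity=0.2, engagement=0.8, age_hours=12.0),
    ])
    print(feed[0])   # the close friend's recent post ranks first here

Tuning such weights, or filtering by emotional tone as in the study, changes what users see without any visible signal that the feed has been altered, which is why the consent question dominated the backlash.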
Advertisers have protested against recent moves by Facebook to prioritise individual users’ posts over those from their brand pages, which means they have to buy advertisements to promote their material, even to existing customers or “fans” who have “liked” their products. Privacy advocates have frequently complained over the past few years that Facebook has automatically opted users into new features that might share more information than they realised, rather than asking permission upfront.
But the latest backlash comes at a time when Facebook has been working to improve its reputation for handling personal information. Last month, it began to offer users a “privacy check-up”, through online prompts featuring a blue cartoon dinosaur, and changed the default setting for a user’s first post to be seen only by friends, rather than making it open to the public. Marc Andreessen, a Silicon Valley investor who sits on Facebook’s board, insisted that what the social network was doing was not unusual.
“Helpful hint: whenever you watch TV, read a book, open a newspaper, or talk to another person, someone’s manipulating your emotions!” he tweeted. “The entire Facebook system is designed to lead to positive posts and interactions.”
Additional reporting by Hannah Kuchler
This is, and has been since its inception, one of the creepiest companies, with one of the creepiest CEOs, in the world. And it just managed to get creepier. The only innovation here is marketing hype. New levels of it. What a phenomenal load of steaming horseshit. As Orwell wrote: “Advertising is the rattling of a stick inside a swill bucket.” People really should boycott these companies. Don’t be digital sheep; it’s your dollars that they want (or your personal, saleable, actionable information). Don’t give it to them. Find a competing product with good management and a value system that isn’t completely off the rails.
Start here: These guys suck. Burn them down.
All:
God save your majesty!
Cade:
I thank you, good people—there shall be no money; all shall eat
and drink on my score, and I will apparel them all in one livery,
that they may agree like brothers, and worship me their lord.
Dick:
The first thing we do, let’s kill all the lawyers.
Cade:
Nay, that I mean to do.
From: Henry VI, Part 2.
William Shakespeare, Writer