Data Policy Implicates Unethical Facebook Emotion Experiment


There is a new development in the debate over the legality and ethics of Facebook's (NASDAQ:FB) News Feed experiment, which Wall St. Cheat Sheet reported on earlier this week. Compounding the ethical problems with Facebook's experiment is a new discovery that makes it difficult to defend the research as strictly legal: Forbes's Kashmir Hill reports that Facebook didn't add the term "research" to the official data use policy that users agree to until four months after the experiment was completed.

A team of researchers, led by Facebook data scientist Adam Kramer, conducted a week-long experiment early in 2012 to manipulate the posts that appeared in the News Feeds of 689,003 Facebook users, none of whom were aware that they were part of an experiment. The researchers altered the algorithm that chooses which statuses, photos, and activities to display so that it showed some users fewer posts containing negative words and showed others fewer posts containing positive words. The objective was to learn whether emotions could spread via social media, and the study demonstrated that they can: users who saw fewer negative words were more likely to use positive language in their own posts.
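To make the mechanism concrete, here is a minimal, purely illustrative sketch of how a feed-ranking step could probabilistically omit posts containing words of one emotional category for users in a given treatment group. The word lists, function names, and omission rate below are placeholders chosen for illustration; this is not Facebook's actual implementation, which relied on a much larger sentiment lexicon and its own ranking infrastructure.

```python
import random

# Placeholder word lists standing in for a real sentiment lexicon.
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}


def contains_word(post_text, words):
    """Return True if the post contains any word from the given set."""
    tokens = post_text.lower().split()
    return any(token.strip(".,!?") in words for token in tokens)


def filter_feed(posts, suppress="negative", omission_rate=0.5, rng=random):
    """Probabilistically drop posts that contain words of the suppressed
    emotional category from one rendering of a user's feed.

    posts         -- list of post strings eligible for the user's News Feed
    suppress      -- "negative" or "positive": which category to reduce
    omission_rate -- chance that a matching post is omitted from this view
    """
    target = NEGATIVE_WORDS if suppress == "negative" else POSITIVE_WORDS
    filtered = []
    for post in posts:
        if contains_word(post, target) and rng.random() < omission_rate:
            continue  # omit this emotional post from the rendered feed
        filtered.append(post)
    return filtered


# Example: a user assigned to the "reduced negativity" condition
feed = [
    "Feeling awful about the news today",
    "Had a great weekend with friends!",
    "Just a regular Tuesday",
]
print(filter_feed(feed, suppress="negative", omission_rate=1.0))
```

The sketch treats the filtering as probabilistic and per-view rather than deleting posts outright, which is one plausible reading of how such an experiment could be layered on top of an existing ranking algorithm; the exact mechanics of Facebook's system are not described in the article.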

However, debate has sprung up around the publication of the study's results, with critics saying that Facebook should have gotten users' consent, and supporters pointing out that the News Feed algorithm and the content it curates are manipulated constantly. A post on Animal New York took the side of the research's critics and characterized Facebook's experimenting on users as the company "using us as lab rats."