Former Facebook data scientist confirms: If you use Facebook, you’ve been experimented on

For a small study that tweaked the newsfeeds of less than 1 percent of the Facebook population, the so-called “emotional contagion” experiment has triggered a wave of attention — most of it negative — aimed at the giant social network and how it may or may not be affecting its users. But if nothing else, the research has at least shone a spotlight on the potential risks of that kind of experimentation. In one of the more recent developments, a former Facebook data scientist has confirmed what many users suspected all along: if you have ever used Facebook, you have probably been experimented on. Whether that’s a bad thing is still up for debate.

Andrew Ledvina was quoted in a Wall Street Journal follow-up on the emotional contagion study, which removed posts containing certain emotional words from the newsfeeds of about 700,000 users to see whether doing so changed their behavior (there’s more background on the study in our “what you need to know” post and in a similar roundup of facts by The Atlantic). The Journal story detailed how Facebook routinely performs similar research, with little or no oversight from ethics review boards or similar bodies. It quoted Ledvina as saying: “There’s no review process, per se. Anyone on that team could run a test. They’re always trying to alter people’s behavior.”

A constant state of experimentation

According to a blog post he published late last week, the former data scientist — who says he left the social network last year for unrelated reasons — was unhappy with the way his comments were portrayed in the newspaper, so he decided to expand on them in the hope of clarifying the issue. In the process, he confirmed that Facebook is more or less in a constant state of experimenting on its users, in thousands of different ways, and that, for better or worse, very little of that work is subject to any kind of review or approval process:

“If you want to run a test to see if people will click on a green button instead of a blue button you don’t need approval. In the same way, if you want to test a new ad targeting system to see if people click on more ads and revenue goes up, you don’t need institutional approval. Further, if you want to see how people react to changes in certain systems like the content in news feed, you don’t need approval for that experiment, even if it is just to help inform an internal ranking system.”
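For readers curious about what the kind of test Ledvina describes looks like in practice, here is a minimal, hypothetical sketch of how a green-button-versus-blue-button experiment might split users into stable buckets. The function and names are purely illustrative assumptions and have nothing to do with Facebook’s actual experimentation systems.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("blue", "green")) -> str:
    """Deterministically assign a user to one variant of an experiment.

    Hashing the user ID together with the experiment name keeps each user's
    assignment stable across visits without storing any per-user state.
    This is a generic illustration, not Facebook's real framework.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Example: two hypothetical users land in stable, pseudo-random buckets.
for uid in ("user_123", "user_456"):
    print(uid, assign_variant(uid, "signup_button_color"))
```

The point of a sketch like this is simply that running such a test is technically trivial — which is why, as Ledvina notes, the only real constraint is whatever review process a company chooses to impose.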

Image courtesy of xkcd, http://xkcd.com/1390/

Ledvina goes on to say that the reason for this kind of experimentation is fairly obvious: to make the site better for users, to increase their engagement with the content, to convince them to spend more time on the site, to get them to click on ads, and so on. Like many of those who have supported the social network’s decision to do research like the emotional contagion study — or the similar voting behavior study that some find even more disturbing — Ledvina argues that experimentation is not only necessary but good.

“Experiments are run on every user at some point in their tenure on the site. Whether that is seeing different size ad copy, or different marketing messages, or different call to action buttons, or having their feeds generated by different ranking algorithms, etc. The fundamental purpose of most people at Facebook working on data is to influence and alter people’s moods and behaviour. They are doing it all the time to make you like stories more, to click on more ads, to spend more time on the site.”

Everyone is toying with your emotions

As my colleague Tom Krazit has pointed out, one of the tangible effects of this kind of experimentation is that Facebook users can no longer count on the idea that “the news will find me,” because the newsfeed algorithm is constantly being revised. In a sense, Facebook is doing the same thing that media companies like the New York Times or BuzzFeed are doing: trying to find the right combination of content and social triggers that will encourage users to read and/or share (the NYT even has a chief data scientist whose job consists of helping with that kind of research).

In a post about the ramifications of the emotional-contagion study, sociologist and Microsoft Research fellow danah boyd (who spells her name without upper-case letters) admits that she dislikes the way Facebook manipulates the newsfeed, saying: “I hate the fact that Facebook thinks it’s better than me at deciding which of my friends’ posts I should see. I hate that I have no meaningful mechanism of control on the site.” But she also defends the network, arguing that if we are going to get upset about Facebook toying with our emotions, then we should be upset about others doing so as well, including the mainstream media:

“We as a society believe that it’s totally acceptable for news media — and its clickbait brethren — to manipulate people’s emotions through the headlines they produce and the content they cover. And we generally accept that algorithmic curators are perfectly well within their right to prioritize that heavily clicked content over others, regardless of the psychological toll on individuals or the society. What makes their practice different?”

In the end, boyd and others — including fellow Microsoft researcher Duncan Watts — argue that the best outcome of the recent controversy would be not to stop sites like Facebook from doing research into human behavior, but to encourage them to make more of it public so that we can all benefit. As Watts points out, the ability to use up to a billion people as potential test subjects is unprecedented. The only downside of the current state of affairs, he and others argue, is that Facebook and other networks keep the vast majority of that information to themselves.

Post and thumbnail images courtesy of Thinkstock/ Stephen Lam and XKCD
