
Summary:

Facebook is constantly experimenting on its users by tweaking the newsfeed in both large and small ways, a former member of the company's data science team confirms, and some of the social network's defenders argue that it isn't doing anything media companies don't also do.

For a small study that tweaked the newsfeeds of less than 1 percent of the Facebook population, the so-called "emotional contagion" experiment has triggered a wave of attention — most of it negative — aimed at the giant social network and how it may or may not be affecting its users. But if nothing else, the research has at least shone a spotlight on the potential risks of that kind of behavior. In one of the more recent developments, a former Facebook data scientist confirms what many users suspected all along: if you have ever used Facebook, then you have probably been experimented on. Whether that's a bad thing is still up for debate.

Andrew Ledvina was quoted in a Wall Street Journal followup on the emotional contagion study, which filtered posts containing certain emotional words from the newsfeeds of about 700,000 users to see if it changed their behavior (there's more background on the study in our "what you need to know" post and in a similar roundup of facts by The Atlantic). The Journal story detailed how Facebook routinely performs similar research, with little or no oversight in the form of ethics advisory boards and the like. It quoted Ledvina as saying: "There's no review process, per se. Anyone on that team could run a test. They're always trying to alter people's behavior."

A constant state of experimentation

According to a blog post he published late last week, the former data scientist — who says he left the social network last year for unrelated reasons — was unhappy with the way his comments were portrayed in the newspaper, so he decided to expand on them in the hope of explaining the issue. In the process, he confirmed that Facebook is more or less in a constant state of experimenting on its users in thousands of different ways, and for better or worse, very little of that is subject to any kind of review or approval process:

“If you want to run a test to see if people will click on a green button instead of a blue button you don’t need approval. In the same way, if you want to test a new ad targeting system to see if people click on more ads and revenue goes up, you don’t need institutional approval. Further, if you want to see how people react to changes in certain systems like the content in news feed, you don’t need approval for that experiment, even if it is just to help inform an internal ranking system.”
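To make that concrete, here is a minimal sketch of the hash-based bucketing this kind of internal A/B test typically relies on; the function and experiment names are hypothetical illustrations, not Facebook's actual code.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants: list) -> str:
    """Deterministically bucket a user into one arm of an experiment.

    Hashing the experiment name together with the user ID means a given
    user always sees the same variant, while different experiments split
    the population independently of one another.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The green-vs-blue button test Ledvina describes: no approval step,
# just ship the bucketing code and start logging clicks per variant.
color = assign_variant("user-12345", "signup_button_color", ["blue", "green"])
```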


Image courtesy of xkcd, http://xkcd.com/1390/

Ledvina goes on to say that the reason for this kind of experimentation is fairly obvious: to make the site better for users, to increase their engagement with the content, to convince them to spend more time on the site, to get them to click on ads, and so on. Like many of those who have supported the social network’s decision to do research like the emotional contagion study — or the similar voting behavior study that some find even more disturbing — Ledvina argues that experimentation is not only necessary but good.

“Experiments are run on every user at some point in their tenure on the site. Whether that is seeing different size ad copy, or different marketing messages, or different call to action buttons, or having their feeds generated by different ranking algorithms, etc. The fundamental purpose of most people at Facebook working on data is to influence and alter people’s moods and behaviour. They are doing it all the time to make you like stories more, to click on more ads, to spend more time on the site.”

Everyone is toying with your emotions

As my colleague Tom Krazit has pointed out, one of the tangible effects of this kind of experimentation is that Facebook users can no longer count on the idea that “the news will find me,” because the newsfeed algorithm is constantly being revised. In a sense, Facebook is doing the same thing that media companies like the New York Times or BuzzFeed are doing: trying to find the right combination of content and social triggers that will encourage users to read and/or share (the NYT even has a chief data scientist whose job consists of helping with that kind of research).

In a post about the ramifications of the emotional-contagion study, sociologist and Microsoft Research fellow danah boyd (who spells her name without upper-case letters) admitted that she dislikes the way Facebook manipulates the newsfeed, saying: “I hate the fact that Facebook thinks it’s better than me at deciding which of my friends’ posts I should see. I hate that I have no meaningful mechanism of control on the site.” But she also defends the network, arguing that if we are going to get upset about Facebook toying with our emotions, then we should be upset about others doing so as well, including the mainstream media:

“We as a society believe that it’s totally acceptable for news media — and its click bait brethren — to manipulate people’s emotions through the headlines they produce and the content they cover. And we generally accept that algorithmic curators are perfectly well within their right to prioritize that heavily clicked content over others, regardless of the psychological toll on individuals or the society. What makes their practice different?”

In the end, boyd and others — including fellow Microsoft researcher Duncan Watts — argue that the best outcome of the recent controversy would be not to stop sites like Facebook from doing research into human behavior, but to encourage them to make more of it public so that we can all benefit. As Watts points out, the ability to use up to a billion people as potential test subjects is unprecedented. The only downside of the current state of affairs, he and others argue, is that Facebook and other networks keep the vast majority of that information to themselves.

Post and thumbnail images courtesy of Thinkstock/Stephen Lam and XKCD

  1. Momentum IM Monday, July 7, 2014

    Many Facebook users (let's say 10 percent, which is still millions of people) do not understand that the feed they see is not simply what their friends share but what Facebook decides to show them.
    So they don't even realize there is an algorithm making those decisions. This has profound implications, since it transforms Facebook from a friend-communication tool (the way most users treat it) into a curated stream filled with ads and content their friends never published.
    In the news business, news and ads must be clearly separated. Not so in social network news feeds, it seems.
    Furthermore, we will see ad content increasingly blended in so that it resembles friends' postings, in order to boost click rates.

    But this is what happens when you use free services: you are the product, and the ad industry is the customer!

  2. Momentum IM Monday, July 7, 2014

    And if I may expand: the fallacy of the social network sites is that they want to make you believe you are the customer, when you are only the product.
    And also that you and your friends somehow own the news stream. They have imposed a business model that treats people as ad-clicking, data-generating, data-mined bots.
    And what they really want from this experimentation is "to make the site better for advertisers, to curate content so as to increase engagement with ads, to convince people to spend more time clicking on sponsored news feeds."

    The feed is not yours. Most users are still unaware of that. And the flattery of the "user" continues in surprising doublespeak.

    In the end, all the social network giants speak of and seem to care about is "their users," not their customers, which is surprising for a business.
    This doublespeak is simply irritating!

    We should recognize this truth and try for something better; some of us do: http://goo.gl/S7F3AV

  3. you’re not gonna credit xkcd for that picture?

    1. Hi — looks like the credit appeared at the bottom of the post, but I’ve also added one as a caption to the actual image. Thanks.

  4. Warren Whitlock Monday, July 7, 2014

    The @Mims quote summed this up nicely.

    The bulk of what we hear about the Facebook News Feed is as reliable as the myths about slot payoffs.

    People base their opinions on what the newsfeed should be, with all the baggage of how they view the world. Facebook has never implied that it knows what you want to see better than you know yourself, and it offers a reverse-chronological option on every feed page.

    I try it every month or so for a few hours or days and access some of the hidden content. It’s not better or worse, it’s just not what I am looking for.

    I am positive that Facebook's intentions are never going to be the same as mine or yours. As long as I have the option to read Facebook or not, and they keep trying to inform me about what they do, I'm glad they are doing whatever got all my friends to join up. I have no illusions that they are offering me a free service with no strings attached.

  5. I wonder if MySpace ever experimented with anything like this? My guess is: probably not. And it turned into a junk hole. I link to this post from one I did recently: If Facebook Can Experiment, So Can You.
