Hot on the heels of its sweeping apology to the LGBT community, Facebook has apologized to its users for conducting emotional experiments on them. You may recall that back in June, the social media site found itself in hot water for conducting a study in which it filled some users’ news feeds with positive posts and others’ with negative posts to see whether the tone of those feeds affected the tone of users’ subsequent posts. It was essentially a test of emotional virality.
The backlash from that study caught Facebook by surprise and it scrambled to respond to its users’ anger at becoming de facto guinea pigs. It was an important learning moment for the company. “We were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism,” Facebook CTO Mike Schroepfer wrote Thursday. “It is clear now that there are things we should have done differently.”
Along with its apology, Facebook elaborated on how it will conduct such research in the future. It has introduced guidelines for how user studies will occur, a formal internal review process for studies of sensitive topics like emotions, and a public-facing website to keep people informed about the experiments Facebook is running.
It’s not clear from the post whether there will be any external policing of Facebook’s tests by unbiased third parties. When I reached out to Facebook for comment, a spokesperson responded with the following statement, “We designed the training and the review process to help everyone make good decisions when designing tests or research. We also work with a number of outside academics in areas like computational social science and privacy, so we may consult with these people if we face particularly challenging reviews.”
Facebook says its internal study review team will be made up of lawyers, engineers, and privacy experts. That sounds hunky-dory, but outside accountability matters, since anyone who works for Facebook may be too close to the company to accurately judge ethical practices. For academic researchers, there are clear ways of doing things determined by industry organizations, external review boards, and even federal law.
It’s a step in the right direction for the company, but without additional systems to hold Facebook accountable, it’s not quite enough.
This post has been updated with Facebook’s comment.