Summary:

The more Facebook tries to control the News Feed to ensure its content is “high quality,” the more likely it is to irritate users who actually want to see or share the things Facebook defines as spam.

Facebook has been trying for some time now to clean up or improve the News Feed by removing things it defines as “low quality,” and it announced another effort along those lines on Thursday — saying it will reduce the visibility of “like bait” and content that gets posted too often. But all of these efforts have a dilemma at their core: namely, how will Facebook differentiate between what it calls low-quality content and what users really want to see?

In its blog post on the announcement, Facebook says that like-bait is content that “explicitly asks News Feed readers to like, comment or share the post in order to get additional distribution beyond what the post would normally receive.” And how do we know whether those likes actually generate more sharing of that content than a post would otherwise receive? The short answer is we don’t. Only Facebook knows that, based on its black-box algorithms.

The network posted an example of what it means by like-bait: photos of a baby rabbit, a kitten, dolphins and a mosquito, posted by an account whose name is “When your teacher accidentally scrapes her nails on the chalkboard and you’re like whaaaaaat” (which would seem to break Facebook’s rules on real names, if nothing else). It asks users to like, share or comment — or ignore.

[Screenshot: Facebook’s example of a like-baiting post]

There’s no question that many, perhaps even most, Facebook users would dislike this content intensely and vote to have it removed from their News Feed — except perhaps for younger users, who often enjoy that sort of thing, in part because it irritates adults. But I can think of other examples of content that might be considered like-bait that I saw friends willingly share, including photos of people fighting cancer who were trying to get a certain number of likes, and so on.

It might be spam, but I still like it

That kind of thing may not be “high quality” content, but some people clearly enjoy it. Part of Facebook’s dilemma can be seen in the blog post itself, where the company describes the gap between what people say when they fill out a survey and what they actually do when they use the site: they click, share and comment, but when asked, they say they don’t like that kind of content.

“People often respond to posts asking them to take an action, and this means that these posts get shown to more people, and get shown higher up in News Feed. However, when we survey people and ask them to rate the quality of these stories, they report that like-baiting stories are, on average, 15% less relevant than other stories with a comparable number of likes, comments and shares.”

This is a little like the old days of TV analytics, where people would tell Nielsen that they only watched PBS and nature shows — but when Nielsen switched from surveys to actual monitoring software that tracked what people watched, it found that people’s viewing habits were dramatically different. As it turned out, many watched the same brainless sitcoms and goofy specials they claimed to have no interest in when they were filling out the survey.

[Photo: Facebook executives reveal new features for the social network]

Facebook says that the changes won’t impact pages that are “genuinely trying to encourage discussion among their fans.” But how will it distinguish between those pages and the ones that are just posting like-bait? That’s not an easy question to answer, even if you have the click habits of a billion users to study. And Facebook is essentially saying: “We’re not going to pay attention to what you do — we’re going to purify your News Feed for your own good.”

As I’ve tried to point out before, this is part of what makes life a lot harder for Facebook than it is for Twitter. The latter might get complaints about the stream being too noisy, but users know that for the most part, they are seeing the content they choose to see from the users they choose to follow. Not so on Facebook.

Facebook is much more interventionist, because it is trying to create this Platonic ideal of a “digital newspaper” that CEO Mark Zuckerberg seems to have in mind. And so it removes content it thinks might bother you (whether it’s photos of violence in Syria or breastfeeding) and chooses the rest of your content based on secret algorithms that you can only guess at — and ones that content owners criticize for being a bait-and-switch. And that is a much harder job.

Post and photo thumbnails courtesy of Thinkstock / jurgenfr and Thinkstock / Justin Sullivan

  1. Amanda Schroter Thursday, April 10, 2014

    Thank you for the info!

  2. They should really try fixing the issue with click farms first and leave the business of what people see on their feeds to them. For one thing, their data would be completely skewed when determining how many pages a normal user “likes,” so they may be overcorrecting — trying to answer the question “how do we make the feed relevant and personal when someone likes several thousand pages?” when almost no one but fake click-farm profiles does that. I was recently shown this video while kvetching about how low a percentage of our business page subscribers are shown our posts organically: http://www.youtube.com/watch?v=oVfHeWTKjag

    Eye opening and we certainly won’t bother continuing to spend to advertise on a broken platform until that issue is fixed.

  3. Personally, I’m 100% in elitist agreement with FB here. I’m even cynically thinking that more than 99.42% of the “ZOMG, this poor kid has cancer, please Like this!!!!!1” posts are fakes, but perhaps I’ve been reading Snopes for too long.

    With that said, wouldn’t hyper-personalization solve most of these issues? If my friends want to see “This Panda Wants Likes… Please Forward To All Your Friends Or He Will Eat You!!!!1” posts, let ’em see it, but keep the post out of MY newsfeed.

    Then again, I’m guessing that then we’d start seeing whining about a filter bubble. You know, I actually work for a FB competitor, but sometimes I really feel bad for those folks; no matter what they do, they’re gonna piss off some vocal constituency.

Comments have been disabled for this post