The internet practically runs on the recommendations of others. Whether it’s finding the right rug for your living room, the best Chinese restaurant in town or the smartest take on a news article, users are naturally more likely to seek out things that other people like.
But what if the best wasn’t really the best, just the result of some clever manipulation?
MIT professor Sinan Aral and his team of researchers created an experiment to test how much a single positive or negative review swayed an entire group of commenters — a follow-up to his 2011 study on the impact of crowdsourced movie reviews on others’ opinions. What the team found indicated that we’re all a little more like lemmings than we care to admit.
The experiment used a news aggregator site “similar to Reddit” (the team won’t give out the name of the network used) that tracked responses to upvotes and downvotes. Over the course of five months, and across roughly 100,000 posts, the researchers gave each post an immediate, randomized upvote or downvote as its first rating — with no vote at all serving as the control.
The results showed that those single “seed” votes can influence everything that comes after them. Posts seeded with an upvote were 32% more likely to receive positive ratings, and on average accumulated 25% more upvotes than the control group.
Downvotes didn’t trigger the same sheep-like response: In fact, posts seeded with a downvote were actually likely to attract a corrective upvote in response.
Think about that the next time someone leaves a gushy review of a book or movie or pair of pants.