FOX News and Prismatic might have more in common than meets the eye. From politics to products, our innate biases affect the way we view the information with which we’re presented, which means anyone trying to spread a message or effect change via content must do more than just crunch some data.
Aiming to figure out why America is becoming more politically polarized despite traditional beliefs that societies naturally move toward the middle, a group of Stanford researchers considered how our natural biases affect the way we interpret information. What they found is that people tend to view the world through red- or blue-colored glasses: when we see inconclusive information, we interpret it in ways that support our natural political biases and ignore the aspects that don’t. So if you show the exact same piece of inconclusive information to a group of people, it will likely lead to more polarization rather than to general consensus on its meaning.
It turns out this phenomenon extends beyond clearly biased media such as FOX or MSNBC and into more objective content sources on the web. When the researchers applied their model to online recommendation engines, they found that the pieces of content most relevant to users are “always polarizing,” whereas pieces of information that are merely similar to something someone already likes are only polarizing if the person is already biased. In short: while they’re able to ignore, or at least view objectively, less-important stuff, even pretty middle-of-the-road people will take a hard stance on stuff that matters to them.
Of course, how one reacts to research like this largely depends on what one is trying to accomplish. The researchers involved appear to be all about moving people toward the middle on some issues, which is why they created a federal-budget app called Widescope that lets people configure their own budgets and then shows them the similarities with the various budget proposals floating around Washington, D.C. They’ve also looked into creating social systems that counteract polarization by using trusted information sources (a press release explaining the research suggests Rush Limbaugh or Rachel Maddow) to present information that biased individuals might otherwise be inclined to dismiss.
Applied generally to the web, this approach might help mitigate some of the effects of the hyper-personalized experience that’s now possible. You know, the kind of thing that happens when you fill up RSS readers with sources you like, follow like-minded people on Twitter, and sign up for services that use machine learning to surface even more of the same content based on that homogeneous reading activity. Or when you keep searching for the same stuff on Amazon or viewing the same types of movies on Netflix.
Services that go beyond “injecting serendipity” into their content feeds could actually try to broaden users’ minds by surfacing content that’s in some ways very different from, or counterintuitive to, what a simple interest graph might suggest. I’m not sure how this would look algorithmically, but I’m envisioning, for example, a semi-regular insertion of content from sources or genres considered the opposite of a reader’s norms but that touch upon topics they’re interested in. Or vice versa.
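One minimal sketch of what that might look like: a feed builder that usually picks items matching the user’s interest graph, but semi-regularly slots in items on the same topics from sources the user doesn’t normally read. Everything here is hypothetical — the `interest_graph` shape, the `counter_rate` knob, and the item fields are all assumptions for illustration, not any real recommender’s API.

```python
import random

def build_feed(interest_graph, catalog, n=10, counter_rate=0.2, seed=None):
    """Hypothetical feed builder. interest_graph is assumed to look like
    {"sources": {...}, "topics": {...}}; each catalog item like
    {"source": ..., "topic": ...}. counter_rate is the fraction of slots
    semi-regularly given to opposite-source items on familiar topics."""
    rng = random.Random(seed)
    liked_sources = interest_graph["sources"]
    topics = interest_graph["topics"]

    # Items the interest graph would normally surface.
    familiar = [it for it in catalog
                if it["source"] in liked_sources and it["topic"] in topics]
    # Same topics the reader cares about, but from sources outside their norm.
    counter = [it for it in catalog
               if it["source"] not in liked_sources and it["topic"] in topics]

    feed = []
    for _ in range(n):
        # Semi-regularly insert a counter-bias item instead of a familiar one.
        pool = counter if (counter and rng.random() < counter_rate) else familiar
        if not pool:
            break
        feed.append(pool.pop(rng.randrange(len(pool))))
    return feed
```

With `counter_rate=0` this degrades into the usual filter bubble; cranking it toward 1 turns the feed into pure opposition reading, so the interesting design question is where in between the “broadening” effect outweighs the annoyance.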
I genuinely believe most web startups trying to tackle the problem of content curation want to be as helpful as possible, are aware of issues such as biased assimilation, and are at least considering methods for counteracting them in order to give users a broader view beyond just what those users think they want to see.
On the other hand, if you want to lock people into their current beliefs or their current content-consumption habits, that’s probably a lot easier to do. And sadly, that probably suits some politicians and special interest groups just fine.
Feature image courtesy of Shutterstock user Kutlayev Dmitry.