With all the real-time information networks, publishing tools and multimedia devices at our disposal today, many of us struggle just to keep our heads above the rising tide of information we’re subjected to. Media guru Clay Shirky once famously said the problem of the modern age isn’t information overload at all, it’s “filter failure” — and many companies and services have been built to help with that problem. But author Eli Pariser is afraid too many filters could be a cure worse than the disease.
Pariser, board president of the advocacy group MoveOn.org, is also the author of a new book, The Filter Bubble: What the Internet Is Hiding From You. In a recent piece in the New York Times, he described his concern about the rise of personalization — the kinds of filters that search engines such as Google and Microsoft’s Bing are adding to their services, as well as the recommendation tools publishers such as the New York Times and the Washington Post have added, which suggest stories readers might like based on their previous browsing history.
According to the author, the risk with such personalization and customization tools is that they will create echo chambers in which we only read what we want to read, and therefore only hear the arguments we want to hear. Says Pariser:
Democracy depends on the citizen’s ability to engage with multiple viewpoints; the Internet limits such engagement when it offers up only information that reflects your already established point of view. While it’s sometimes convenient to see only what you want to see, it’s critical at other times that you see things that you don’t.
Pariser argues that we need to be aware of when things are being filtered for us, whether by Google or some other agency, and that both the filters and the tools for changing them need to be more visible, so that we can broaden our view if we want to.
[I]f algorithms are taking over the editing function and determining what we see, we need to make sure they weigh variables beyond a narrow “relevance.” They need to show us Afghanistan and Libya as well as Apple and Kanye… [and] we citizens need to uphold our end, too — developing the “filter literacy” needed to use these tools well and demanding content that broadens our horizons even when it’s uncomfortable.
Are we really in danger of having too many filters and too much customization of what we see and hear? I can see Pariser’s point about filters potentially leaving us “sealed off in our own personalized online worlds,” but I still think we have too few filters rather than too many. And most of the social filtering that Google and Facebook are building into their services, which uses social cues to highlight content, strikes me as a fairly good balance between filtering the flood and cutting ourselves off from the world — although I remain a fan of human curation over filtering done purely by algorithms.
Will there be people who have such a uniform social graph that any form of social filtering will just allow them to live in an online echo chamber? Of course there will be — but then, those people already exist, and seem to have no trouble living in a cocoon with or without the Internet. Social filters aren’t going to make that phenomenon any worse (J.P. Rangaswami has a very thoughtful post about filtering, and business blogger Tim Kastelle also wrote a great post recently about the virtues of different kinds of filtering).
A related risk is what some call the “serendipity problem” — namely, the loss of those surprising or unusual links, stories and other content that we might not know in advance we want to read or listen to or watch, but that provide interesting information when we come across them. The traditional newspaper is seen by many as the quintessential serendipity engine, since it is produced by others with little knowledge of what we might want to read, and therefore at least some of what appears in it is a surprise.
The web and digital tools allow us to know exactly what people are reading or looking at, and make it easy to tailor content to those desires in a way that newspapers and traditional media sources never could. Services like Zite and News360 and Flipboard customize what we read and see based on our social networks and our reading habits, and therefore Pariser might see them as a danger as well. But even there, some services — such as the iPad app News.me, which Betaworks recently launched — inject some serendipity into things by allowing you to effectively browse someone else’s social stream. Twitter recently reintroduced a similar feature that lets you look at someone else’s timeline.
To me, the dangers of information overload outweigh the dangers of filter overload or hyper-customization. If the tide of data and content becomes too onerous for people to consume, they will simply tune out and stop paying attention altogether — and that is arguably even worse than the “personalized worlds” problem Pariser describes. At least with recommendations and social filtering, we have some hope of containing the flood and making sense of it in some way.
Post and thumbnail photos courtesy of Flickr users http://www.flickr.com/photos/27983776@N03/3398088456/ and Retinafunk