


With all the real-time information networks and publishing tools and multimedia devices we have at our disposal today, many of us have difficulty even managing to keep our heads above the rising tide of information we’re all subjected to. Media guru Clay Shirky once famously said the problem of the modern age isn’t information overload at all, it’s “filter failure” — and many companies and services have been built to help with that problem. But author Eli Pariser is afraid too many filters could be a cure worse than the disease.

Pariser, the board president of the advocacy group MoveOn.org, is also the author of a new book, The Filter Bubble: What the Internet Is Hiding From You. In a recent piece in the New York Times, he described his concern about the rise of personalization and the kinds of filters that search engines such as Google and Microsoft’s Bing are adding to their services, as well as the recommendation tools publishers such as the New York Times and the Washington Post have added, which suggest stories readers might like based on their previous browsing history.

According to the author, the risk with such personalization and customization tools is that they will create echo chambers in which we only read what we want to read, and therefore only hear the arguments we want to hear. Says Pariser:

Democracy depends on the citizen’s ability to engage with multiple viewpoints; the Internet limits such engagement when it offers up only information that reflects your already established point of view. While it’s sometimes convenient to see only what you want to see, it’s critical at other times that you see things that you don’t.

Pariser argues that we need to be aware of when things are being filtered for us, whether by Google or some other agency, and that those filters, along with the tools to change them, need to be more obvious, so that we can broaden our view if we want to.

[I]f algorithms are taking over the editing function and determining what we see, we need to make sure they weigh variables beyond a narrow “relevance.” They need to show us Afghanistan and Libya as well as Apple and Kanye… [and] we citizens need to uphold our end, too — developing the “filter literacy” needed to use these tools well and demanding content that broadens our horizons even when it’s uncomfortable.

Are we really in danger of having too many filters, too much customization of what we see and hear? I can see Pariser’s point about filters potentially leaving us “sealed off in our own personalized online worlds,” but I still think we have too few filters rather than too many. And most of the social filtering that Google and Facebook are building into their services, which uses social cues to highlight content, strikes me as a fairly good balance between filtering and cutting ourselves off from the world — although I am a fan of human curation and filtering rather than that done by algorithms.

Will there be people who have such a uniform social graph that any form of social filtering will just allow them to live in an online echo chamber? Of course there will be — but then, those people already exist, and seem to have no trouble living in a cocoon with or without the Internet. Social filters aren’t going to make that phenomenon any worse (J.P. Rangaswami has a very thoughtful post about filtering, and business blogger Tim Kastelle also wrote a great post recently about the virtues of different kinds of filtering).

A related risk is what some call the “serendipity problem” — namely, the loss of those surprising or unusual links, stories and other content that we might not know in advance we want to read or listen to or watch, but that provide interesting information when we come across them. The traditional newspaper is seen by many as the quintessential serendipity engine, since it is produced by others with little knowledge of what we might want to read, and therefore at least some of what appears in it is a surprise.

The web and digital tools allow us to know exactly what people are reading or looking at, and make it easy to tailor content to those desires in a way that newspapers and traditional media sources never could. Services like Zite and News360 and Flipboard customize what we read and see based on our social networks and our reading habits, and therefore Pariser might see them as a danger as well. But even there, some services — such as the iPad app News.me, which Betaworks recently launched — inject some serendipity into things by allowing you to effectively browse someone else’s social stream. Twitter recently reintroduced a similar feature that lets you look at someone else’s timeline.

To me, the dangers of information overload outweigh the dangers of filter overload or hyper-customization. If the tide of data and content becomes too onerous for people to consume, they will simply tune out and stop paying attention altogether — and that is arguably even worse than the “personalized worlds” problem Pariser describes. At least with recommendations and social filtering, we have some hope of containing the flood and making sense of it in some way.

Post and thumbnail photos courtesy of Flickr users http://www.flickr.com/photos/27983776@N03/3398088456/ and Retinafunk

  1. Mathew

    When I first saw the headline, I thought you were being rhetorical.

    Filters are the antidote to too much noise and chatter and unstructured, context-free info. Is there really another way?

    Seems like you get to the same spot I did in my post:

    Choosing context over friendship to filter the social web @ http://t.co/yNFhj01

    1. Thanks, Arnold — good post.

  2. As long as we are in control of the filters, it’s fine. In this Connected Age we need to develop personal ways of handling information overload — it’s a required skill. When we hand it off to others or automation, we’re letting others tell us what we should be paying attention to. While that does help reduce the tsunami of info washing over us at any given moment, it gives us a very narrow view, especially when there are times we need to see the big picture and greater context. Recommendations, sure, but letting a service curate everything for me? No thanks. I still enjoy the serendipity of something unexpected.

    1. Good points — thanks for the comment, Beth.

    2. Beth…couldn’t agree more. Auto filtering and surfacing info based on popularity is just part of the process.

      I dug into this a bit @ http://t.co/S9gCzWV but the idea that we need to have control of our filters is key. Check out http://www.eqentia.com as a filtering curation tool. That’s part of its premise.

  3. I agree with your conclusion. I acknowledge that a too-personalized web experience may create information silos; however, I believe the antidote to an over-filtered online experience is found through shared links. Unlike traditional print media, the internet gives content producers the opportunity to share their resources through embedded article links and to share the content they consume through social media. In this context, the link economy becomes even more relevant, as internet users are valued not only for the content they produce but also for their ability to curate and share valuable information online.

  4. gregorylent Tuesday, May 31, 2011

    mystics know the answer to either side of this (spurious) “problem” ….. pay attention to the attending self, and not to the stuff, and the right thing comes at the right time, giving exactly what is needed, when needed.

  5. “Will there be people who have such a uniform social graph that any form of social filtering will just allow them to live in an online echo chamber? Of course there will be — but then, those people already exist, and seem to have no trouble living in a cocoon with or without the Internet.”

    That’s the main point, I feel, in all of this.

    The same way people move into neighborhoods they feel most comfortable in, to live amongst people that look or live like them, there are people who will always navigate towards and click on the information that validates their thinking or point of view.

    And I agree with Noorin that “the antidote to an over-filtered online experience is found through shared links.”

    1. Thanks for the comment, Nancy — I agree.

  6. These are not really new concerns, are they? These arguments have been made before and it’s still irritating that they are couched in either/or terms. The key is choice. The consumer should have the option to choose between a personalized and an unfiltered information stream.

    1. I think you’re right, Paul — these kinds of concerns have been around for a while now, and choice by the user is key. Thanks for the comment.

  7. I think we first have to distinguish between personal and centralized filters (Google, Bing …). Centralized filters generate statistical models based on the past or expected behavior of large groups; they are more like a funnel. Personalized filters, at one extreme, are context: something one has learned and organized at different times than almost anybody else. Context is therefore a very diversified filtering system; even if a group of people share a traumatic event, the filter each of them applies will be different, yet there is a shared context for future reference between those people.
    To recognize a different viewpoint, one needs to understand the [shared] context of the other argument. Then there’s something like dopamine, but that gets us really into the woods.

    Anyhow, just saying “filter” without distinguishing between kinds of filters starts from the wrong assumptions. As soon as one accepts that current search is incredibly primitive compared with a self-organizing contextual system, the arguments change. Because there is this little thing … “self” in it.

    In other words, you are probably right in your assumption about human curation, but make no mistake: we can and will do this with machines.

  8. I don’t think it’s an issue, from the perspective that the most engaged political discussions I have had have been through Facebook. Unlike the people I spend significant “in person” time with, my social network is rather diverse, from anarchist to neocon. And on top of that, it isn’t personalization that filters my news media; it’s my own choices and personal biases about which sources I read (which have their own particular biases 100% of the time).

    But it’s only through my social network that I get to see articles from the diverse cast of characters I know. A perfect example was Mark Dubowitz (whom I know from high school, and who is Executive Director of the Foundation for Defense of Democracies): he and I had a lively debate about Obama’s recent positions on Israel. His network and mine got to see and read all that content they would normally never actively seek out on their own.

    1. Thanks for the comment, Leigh.

  9. Vicki Kunkel Tuesday, May 31, 2011

    As publishers, content creators and news organizations become more savvy about online consumption (relevancy is only part of the equation; people are also drawn to content that is engaging and entertaining), I think there will begin to be a “creativity filter”: people will gravitate toward content on a particular topic that “grabs” them emotionally, or entertains them, as well as informs them.

    But, to address the original question: I do not think we have too many filters. We have a wealth of information and a poverty of attention. There is so much information coming at us from myriad platforms (mobile, computers, tablets, TV, newspapers) that our minds simply are not equipped to handle it all. When our minds become overloaded with information, they shut down and engage with nothing.

  10. Vicki Kunkel Tuesday, May 31, 2011

    (My last post apparently didn’t post, so I am re-writing. My apologies if this somehow ends up getting posted twice.)

    As content creators, news organizations, and publishers become more savvy about online information consumption, I think there will be a natural “creativity filter.” Relevancy is only part of what draws someone in to view or read content; engagement, entertainment, and an emotional connection are others. The best storytellers, the ones who can engage the emotions of the audience while imparting important information, are the ones who will have their content viewed and shared by audiences. So there will be a sort of “natural selection” of content: the best will get found, get read, and get shared, while the less interesting (even if relevant) content will fall by the wayside fairly quickly.

    To answer your question directly: No, I do not think there are too many filters now. Because of the information we are bombarded with from myriad platforms such as mobile phones, computers, tablets, newspapers and TV screens, we have a wealth of information and a poverty of attention; we simply can’t absorb and process it all.

