Do We Have Too Many Filters, Or Not Enough?

With all the real-time information networks, publishing tools and multimedia devices at our disposal today, many of us have difficulty keeping our heads above the rising tide of information we’re subjected to. Media guru Clay Shirky once famously said the problem of the modern age isn’t information overload at all, it’s “filter failure,” and many companies and services have been built to help with that problem. But author Eli Pariser is afraid too many filters could be a cure worse than the disease.

Pariser, the board president of the advocacy group MoveOn.org, is also the author of a new book, The Filter Bubble: What the Internet Is Hiding From You. In a recent piece in the New York Times, he described his concern about the rise of personalization: the filters that search engines such as Google and Microsoft’s Bing are adding to their services, as well as the recommendation tools that publishers such as the New York Times and the Washington Post have added, which suggest stories readers might like based on their previous browsing history.

According to the author, the risk with such personalization and customization tools is that they will create echo chambers in which we only read what we want to read, and therefore only hear the arguments we want to hear. Says Pariser:

Democracy depends on the citizen’s ability to engage with multiple viewpoints; the Internet limits such engagement when it offers up only information that reflects your already established point of view. While it’s sometimes convenient to see only what you want to see, it’s critical at other times that you see things that you don’t.

Pariser argues that we need to be aware of when things are being filtered for us, whether by Google or some other agency, and that those filters, along with the tools to change them, need to be more visible so that we can broaden our view if we want to.

[I]f algorithms are taking over the editing function and determining what we see, we need to make sure they weigh variables beyond a narrow “relevance.” They need to show us Afghanistan and Libya as well as Apple and Kanye… [and] we citizens need to uphold our end, too — developing the “filter literacy” needed to use these tools well and demanding content that broadens our horizons even when it’s uncomfortable.

Are we really in danger of having too many filters, too much customization of what we see and hear? I can see Pariser’s point about filters potentially leaving us “sealed off in our own personalized online worlds,” but I still think we have too few filters rather than too many. And most of the social filtering that Google and Facebook are building into their services, which uses social cues to highlight content, strikes me as a fairly good compromise: it filters without cutting us off from the world entirely, although I remain more a fan of human curation and filtering than of the kind done by algorithms.

Will there be people who have such a uniform social graph that any form of social filtering will just allow them to live in an online echo chamber? Of course there will be — but then, those people already exist, and seem to have no trouble living in a cocoon with or without the Internet. Social filters aren’t going to make that phenomenon any worse (J.P. Rangaswami has a very thoughtful post about filtering, and business blogger Tim Kastelle also wrote a great post recently about the virtues of different kinds of filtering).

A related risk is what some call the “serendipity problem” — namely, the loss of those surprising or unusual links, stories and other content that we might not know in advance we want to read or listen to or watch, but that provide interesting information when we come across them. The traditional newspaper is seen by many as the quintessential serendipity engine, since it is produced by others with little knowledge of what we might want to read, and therefore at least some of what appears in it is a surprise.

The web and digital tools allow us to know exactly what people are reading or looking at, and make it easy to tailor content to those desires in a way that newspapers and traditional media sources never could. Services like Zite, News360 and Flipboard customize what we read and see based on our social networks and our reading habits, and therefore Pariser might see them as a danger as well. But even there, some services — such as the iPad app News.me, which Betaworks recently launched — inject some serendipity into things by allowing you to effectively browse someone else’s social stream. Twitter recently reintroduced a similar feature that lets you look at someone else’s timeline.
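To make the idea concrete, here is a minimal, purely hypothetical sketch of how a personalized feed could reserve a little room for serendipity. It is not how Zite, News360, Flipboard or News.me actually rank content; the function names, the topic-overlap scoring and the serendipity_rate parameter are all illustrative assumptions.

```python
import random

def rank_feed(items, user_interests, serendipity_rate=0.2):
    """Toy ranker: mostly relevance-ordered, but a fraction of slots
    (serendipity_rate) goes to items outside the user's usual interests."""
    def relevance(item):
        # Crude relevance proxy: topic overlap with the user's declared interests.
        return len(set(item["topics"]) & set(user_interests))

    familiar = sorted((i for i in items if relevance(i) > 0),
                      key=relevance, reverse=True)
    unfamiliar = [i for i in items if relevance(i) == 0]

    feed = []
    for _ in range(len(items)):
        # Occasionally surface something the filter would normally bury.
        if unfamiliar and (random.random() < serendipity_rate or not familiar):
            feed.append(unfamiliar.pop(random.randrange(len(unfamiliar))))
        elif familiar:
            feed.append(familiar.pop(0))
    return feed

if __name__ == "__main__":
    stories = [
        {"title": "Apple earnings preview", "topics": ["tech"]},
        {"title": "Kanye's new video", "topics": ["music"]},
        {"title": "Libya conflict update", "topics": ["world"]},
        {"title": "Afghanistan withdrawal debate", "topics": ["world"]},
    ]
    print([s["title"] for s in rank_feed(stories, user_interests=["tech", "music"])])
```

Browsing someone else’s stream, as News.me lets you do, amounts to much the same move: temporarily swapping in a different interest profile so that a different slice of the web washes over you.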

To me, the dangers of information overload outweigh the dangers of filter overload or hyper-customization. If the tide of data and content becomes too onerous for people to consume, they will simply tune out and stop paying attention altogether — and that is arguably even worse than the “personalized worlds” problem Pariser describes. At least with recommendations and social filtering, we have some hope of containing the flood and making sense of it in some way.

Post and thumbnail photos courtesy of Flickr users http://www.flickr.com/photos/27983776@N03/3398088456/ and Retinafunk

27 Comments

Tim Kastelle

Thanks for the link to my post Mathew – I appreciate it!

It is a significant issue. I agree with you: I also like to have some non-algorithmic filtering in the mix. Getting the right balance between signal and noise, though, is a definite challenge these days.

Brian Delaney

Great post. The problem right now is that people are thinking too broadly. They only imagine one huge set of rules that filters all of their content. In the future web, filtering will be based on a wide range of factors and is going to make information discovery an incredible experience. The key is determining the semantic relationship between the data, apps, and people related to each particular topic. We have to stop thinking “Facebook” and start thinking “web.”

Suppose for a moment that we aren’t able to determine semantics, and all info is sent through one huge filter: big issues like Libya and Egypt will still reach the masses. Think of it like a wave in the ocean. You are a user: sitting there, treading water, seeing the same old horizon day in and day out, and then every once in a while a big wave comes rushing by and you won’t be able to avoid it. The thought that you will be completely isolated and impervious to the big news of the day because of filters is naive and short-sighted.

I’m working on a not-for-profit crowd-sourced API to link all data online, allowing entrepreneurs and developers to create apps which will provide the type of granular discovery we need to make filters truly valuable: http://PalmettoAPI.com

Simon Gough

We need filters to find more of what we already like. Others need filters to tell us what we want to hear. Somewhere in there is an argument for embracing serendipity.

Filters are essential for making sense of the world’s content. It’s also essential that we don’t become victims of them.

mark evertz

Mathew,
Great post. And Arnold’s comment out of the chute hits the nail on the head as far as I’m concerned. Context. Who are we talking about? The wayward voter in need of a more holistic picture on issues? Yes…that person should not just live in a filter bubble. But I want a doctor treating my rare cancer (theoretically…I’m fine!) to live in a filter bubble 24/7. Heck, I want him to be encased in a “Filter Cocoon,” so undeterred by noise and distraction and so immersed in the latest advancements that one day he plucks the answer out of the ether. A post by the CEO of Attensa gives a nod to your post and teases out “Context” in more detail here:
http://www.attensa.com/2011/06/01/the-current-content-filter-debate-lacks-context/
Great work and keep it up.
Mark

Ann Holman

We have no choice: we need to filter, as the noise is getting louder and louder. The browsing and serendipity will come from shared links.

William Mougayar

Hi Mathew,
That was an interesting piece, and I commented on it on the NY Times site. Doing filtering right is not easy; it’s hard work (for the provider) for a number of reasons I won’t get into.
But I think there could be a nice balance between robust filtering and serendipity. They are not mutually exclusive. There is targeted information (when I’m seeking something specific), and there’s serendipitous discovery (show me something around my interests that is relevant). We have been addressing this problem at Eqentia, and I can confidently tell you that we’re probably the only platform today that offers a blend of targeted filtering mixed with serendipitous discovery, from both social and online media.
It’s lazy of us to say that filtering is not needed. Following people is making us lazy because it’s so easy on the surface. But following people brings the good with the bad, the interesting with the not-so-interesting, the popular with the relevant, and we end up spending our time going through it all, with no time left to get the right stuff flowing. Twitter and social media should be a complement to, not a replacement for, the daily flow of info, at least if you’re a professional with a specific goal in mind.

Lucien Burm

I agree with other comments that this problem has been around for a long time and is very well known within the ‘filter’ community. Also, stuff changes but people stay the same, so I certainly agree that some (information-)poor people already live in bubbles, both on- and offline ;-)

Another trait of people is that they want to play, and there is not a lot of play with filters right now. I predict (disclaimer: not without a certain stake in this) that filters will become a tool to play with intuitively: pushing against them to see instant changes and tuning as you go.

But at the core, I guess the Serendipity Problem will mainly be solved through some very smart social-network filtering (some more disclaiming here).

Let’s not forget that all this information exists between humans; it is what we actually connect through. Consequently, it is in those relations that some extra intelligence (‘filters’) will grow and be of real service.

It might be the end of engines as we know them, though.

juepucta

Hence the importance of “curatorial responsibility,” because I have no problem with POV and slant (and do have one with the recent fiction of impartiality), but good media outlets are able to tell the difference between opinion and FACTS. Right now, especially in the US, your info is incomplete at best and utterly WRONG at worst.

I am all for customization and filters, but I make an effort to, for example, try to understand the ‘opposing side’. But that feedback loop is what makes people think they can say any asinine thing because they are entitled to their own opinion.

-G.

Vicki Kunkel

(My last post apparently didn’t post, so I am re-writing. My apologies if this somehow ends up getting posted twice.)

As content creators, news organizations, and publishers become more savvy about online information consumption, I think there will be a natural “creativity filter.” Relevance is only part of what draws someone in to view or read content; engagement, entertainment, and an emotional connection are others. The best storytellers–the ones who can engage the emotions of the audience while at the same time imparting important information–are the ones who will have their content viewed and shared by audiences. So there will be a sort of “natural selection” of content: the best will get found, get read, and get shared, while the less interesting (even if relevant) content will fall by the wayside fairly quickly.

To answer your question directly: No, I do not think there are too many filters now. Because of the information we are bombarded with from myriad platforms such as mobile phones, computers, tablets, newspapers and TV screens, we have a wealth of information and a poverty of attention; we simply can’t absorb and process it all.

Leigh

I don’t think it’s an issue, from the perspective that the most engaged political discussions I have had have been through Facebook. Unlike the people I spend significant “in person” time with, my social network is rather diverse, from anarchist to neocon. And on top of that, it isn’t personalization that filters my news media, it’s my own choices and personal biases about what sources I read (which have their own particular biases 100% of the time).

But it’s only through my social network that I get to see articles from the diverse cast of characters I know. A perfect example was Mark Dubowitz (who I know from high school and who is Executive Director of the Foundation for Defense of Democracies): he and I had a lively debate about Obama’s recent positions on Israel. His network and mine got to see and read all that content that they would normally never actively seek out on their own.

ronald

I think we would first have to distinguish between personal and centralized filters (Google, Bing …). Centralized filters generate statistical models based on the past or expected behavior of large groups; they are more like a funnel. Personalized filters, at one extreme, are context: something one has learned and organized at different times than almost anybody else. Context is therefore a very diversified filtering system; even if a group of people share a traumatic event, the filter each applies will be different, though there is a shared context for future reference between those people.
To recognize a different viewpoint, one needs to understand the [shared] context of the other argument. Then there’s something like dopamine, but that gets us really into the woods.

Anyhow, just saying “filter” and not distinguishing between filters is starting from the wrong assumptions. As soon as one accepts that current search is incredibly primitive compared with a self-organizing contextual system, the arguments change, because there is this little thing … “self” in it.

In other words, you are probably right in your assumption about human curation, but make no mistake: we can and will do this with machines.

Paul Ciano

These are not really new concerns, are they? These arguments have been made before and it’s still irritating that they are couched in either/or terms. The key is choice. The consumer should have the option to choose between a personalized and an unfiltered information stream.

Mathew Ingram

I think you’re right, Paul — these kinds of concerns have been around for a while now, and choice by the user is key. Thanks for the comment.

Nancy Garcia

“Will there be people who have such a uniform social graph that any form of social filtering will just allow them to live in an online echo chamber? Of course there will be — but then, those people already exist, and seem to have no trouble living in a cocoon with or without the Internet.”

That’s the main point, I feel, in all of this.

The same way people move into neighborhoods they feel most comfortable in, to live amongst people that look or live like them, there are people who will always navigate towards and click on the information that validates their thinking or point of view.

And I agree with Noorin that “the antidote to an over-filtered online experience is found through shared links.”

gregorylent

mystics know the answer to either side of this (spurious) “problem” ….. pay attention to the attending self, and not to the stuff, and the right thing comes at the right time, giving exactly what is needed, when needed.

Noorin Ladhani

I agree with your conclusion. I acknowledge that a too-personalized web experience may create information silos; however, I believe the antidote to an over-filtered online experience is found through shared links. Unlike traditional print media, the internet gives content producers the opportunity to share their resources through embedded article links and to share the content they consume through social media. In this context the link economy becomes even more relevant, as internet users are valued not only for the content they produce but for their ability to curate and share valuable information online.

Beth Agnew

As long as we are in control of the filters, it’s fine. In this Connected Age we need to develop personal ways of handling information overload — it’s a required skill. When we hand it off to others or automation, we’re letting others tell us what we should be paying attention to. While that does help reduce the tsunami of info washing over us at any given moment, it gives us a very narrow view, especially when there are times we need to see the big picture and greater context. Recommendations, sure, but letting a service curate everything for me? No thanks. I still enjoy the serendipity of something unexpected.

Arnold Waldstein

Mathew

When I first saw the headline I thought you were being rhetorical.

Filters are the antidote to too much noise and chatter and unstructured uncontextual info. Is there really another way?

Seems like you get to the same spot I did in my post:

Choosing context over friendship to filter the social web @ http://t.co/yNFhj01
