Summary:

Are algorithms actually making society dumber? Yes, says at least one big data expert. We can’t throw computers at our problems until we better define those problems through human input.

Structure Data 2013: Eric Berlow, Vibrant Data Labs (photo: Albert Chau)


Session Name: Augmenting Algorithms With Human Input

Speakers:

S1: Announcer
S2: Eric Berlow

ANNOUNCER 00:04

Up next we have Mr. Eric Berlow, he's the founder of Vibrant Data Labs, and he's going to be talking about augmenting algorithms with human input. Please welcome Eric Berlow to the stage. [applause]

ERIC BERLOW 00:20

Can you guys hear me? All right. So, I am fascinated by our growing relationship with algorithms. I'm super excited about what could be, but I'm also deeply concerned about what happens if we don't pay attention. In theory, algorithms augment our intelligence and then machines learn from us. This goes round and round, and we all get smarter and superhuman, and everything is great. But the problem is, if we don't pay attention, it actually can make us dumber. Let me just give you a couple of really simplistic examples to illustrate that.

ERIC BERLOW 00:59

When I read the newspaper, there's a lot of content. I'm kind of overwhelmed, so it's natural to use some algorithmic filters to help simplify my task. I might look over on the right and see a most-emailed list; I click on a few of those. After a few weeks, I start wondering, where did all the news go? All I'm reading are stories of personal improvement, personal health, personal finance, other personal stories, and sex scandals. That's the news that I'm reading. If I go to the most-recommended list, it's more of the same, because it's tracking what I've been reading, because I've been herded like sheep to what's most popular. In the end, I do a lot better if I just read the print newspaper than I do if I'm reading it online and using the algorithmic help.

ERIC BERLOW 01:43

And it gets even worse when we use our algorithmic filters to focus on what's trending now. For example, last fall during the presidential debates, the algorithms were all over this, trying to take the pulse of the nation in real time. If you followed that, and that's all you followed, you'd think that the only important things were Big Bird and binders full of women, because those were the two top trending hashtags, right? This just frustrates me so much, because think about it. We've got all this data. We've got all this computational power and sophisticated algorithms to see the big picture, because that's really what algorithms are good for: seeing the big-picture view that you couldn't see otherwise and putting things in context.

ERIC BERLOW 02:28

Then we end up with this obsession, this focus on the minutiae of what's trending now. I think something is seriously broken, and it's making us dumber as a society. So this relationship that we have is not necessarily making us smarter. Instead of augmented intelligence, we've got augmented myopia, with dumb learning from amplifying crowd behavior, right? How can we take that and flip it to be more interesting for us, to augment our humanity as a society? I think part of that is to take advantage, a little bit, of what a couple hundred million years of evolution has made humans uniquely good at. How do we leverage things like our ability to be creative, our intuition, our capacity for causal inference, our sensitivity to nuance and context, things that computers aren't so good at?

ERIC BERLOW 03:21

How could we essentially harness and amplify our collective creativity as a society to solve really big problems? I think that's really our challenge, right? Sometimes we lump a lot of these crowdwork behaviors into the crowdsourcing bucket. Recently, Missy Udall proposed a nice framework for understanding the nuances of that. Let's say we want to get some work done by a bunch of people, and we're trying to get inputs. We can either ask people to do easy things or hard things. The aggregated collection of all that could either be just the simple sum of the parts, or it could be greater than the sum of the parts. For example, crowdfunding is something where people do simple, easy things by giving little bits of money. Then in the end, you get the sum of that: nothing more, nothing less.

ERIC BERLOW 04:10

PhotoCity is a game where people take snapshots of buildings and cities, and then that all gets gathered together and the algorithms assemble it into a 3D model of the building and a 3D model of the city. People are doing simple things, but the output is more than the sum of the parts. Wikipedia is a great example of crowdwork that's obviously more cognitively demanding than just taking snapshots. But the aggregated sum of that is really the sum of our existing knowledge. Nothing new and novel comes out of it. Actually, the explicit mission of Wikipedia is not to generate new knowledge, like new research, but to just sum up what we already know.

ERIC BERLOW 04:49

Finally, there are things like Foldit and CrowdForge. CrowdForge is where people crowdsource solutions to complex tasks by breaking them apart, divvying them out, and putting them back together. People aren't doing difficult work, but then it adds up to something that you couldn't get from the pieces alone. It's that corner that particularly interests me, which we call collective creativity: where we end up with something that's more than the sum of the parts and take advantage of what humans are really good at. All these other things are awesome examples, too, of really amazing work. I actually think that these quadrants should be even way more populated, especially the upper-right one.

ERIC BERLOW 05:30

If we're going to focus on creativity, what is that creative process? Researchers usually divide the creative process into a few key stages. There is: how do we define the right problem to work on? Once we have, how do we ideate around it to generate novel potential solutions? Then, how do we evaluate which ones to move forward with? I think for really large societal problems, the biggest challenge right now is just problem definition: taking something really big and amorphous and defining what the problem really is. Because once we do that, we have some pretty good large-scale societal mechanisms for generating lots of possible solutions and then evaluating which ones might be good.

ERIC BERLOW 06:07

I know that's pretty abstract, so let me just give you one example of a project we worked on. We were trying to blend human expertise with algorithms to find patterns we couldn't see otherwise, to help define a big problem that affects us all. And that's a problem around our personal data. Let me just step back and give you some context. This is my dad, Sheldon, and he just turned eighty. He just, last year, got his first smartphone and was learning, really quickly, how to map and find the fastest route to the best coffee shop in Buffalo, New York. It just blew me away, watching him do this. He was analyzing layers of spatial data in a way that, ten years ago, required a graduate degree in geographic information systems. I was like, "Dad, don't you realize what you're doing?" He has no idea that he's even doing it, which is really remarkable. At the same time, it's also kind of creepy, because now, for the first time in his life, he's aware that his every query, his every movement is being tracked like a radio-tagged buffalo. He's aware now that he's the customer, but his data are the product. We all know that, but now, people even like my dad are aware of that, too. It kind of creeps him out. I think the thing is, for most of us it just means we get subjected to more targeted ads or niche marketing and that kind of thing.

ERIC BERLOW 07:23

Not everybody's so lucky. When we were doing this project, I got an email from a citizen journalist in Mexico. Her name is Patricia. We'll call her Patricia. She was using social media to gather citizen reports in an area where drug violence was silencing the media. She said, "Look, we're gathering reports and we're flagging risk areas to save lives. But at the same time, the citizen journalists who are out there, the personal data trail that they leave behind is putting some of them in prison and leaving others, literally, dead in the street." She was asking for help.

ERIC BERLOW 07:56

I think this is really the grand data challenge of our time: how can we democratize data, how can we catalyze a new personal data economy that actually works for people and not against them? The World Economic Forum just came out with a report on personal data. It's like the new gold, the new asset class. The question is, can we do more than targeted ads and niche marketing and surveillance? There's so much potential, and yet people are also really anxious. So, when I look at problems like this, I look at them as an ecologist; that's my training. This is a green tide pool ecosystem that I studied for my PhD. Every color here is a species, and they're all interconnected in this really complex web of life, which is what attracted me to ecology, that complexity.

ERIC BERLOW 08:46

Problems like democratizing data and other big problems, I see them like ecosystems made up of a lot of little problems that are all interconnected. You solve one, you cause another, or you make another one better. How can we take a big problem like this and break it up into all its component species and see how they all interact, who influences whom? To do that, we launched a project called We the Data, about our personal data. This was a collaborative project: I collaborated with a group at Intel Labs, a group at BrainWise, and David Gurman, who is a TED Fellow, an artist, and a designer. Emily Aiken helped us a lot with storytelling, and Julia Powell was our expert interviewer; she was awesome. She is actually curating content now for the site that emerged from this research project.

ERIC BERLOW 09:32

The question was, how do we break this up? We went around and interviewed a bunch of experts, ranging from tech luminaries like Bill Joy and John Batali to young social entrepreneurs working on data for social good. We just asked them, "What do you think are the biggest problems to solve? What are the challenges?" From those interviews, we busted up this problem into about ninety smaller problems. Each of these dots is a problem, and they were all considered really important by some expert in the field. They spanned a whole lifecycle of data: from how data are created, to how we store and organize data, to how we access and share data, how we turn data into meaning, how we turn that meaning into action, how we derive value from data, and ultimately, how we nourish higher goals of human well-being.

ERIC BERLOW 10:17

The problem is that all those problems are linked. If you empower a citizen to document violence in the streets, you also put them at risk of surveillance. How do we map that complexity? So, we created an online tool to have a bunch of people get in and vote, not on what's the most popular problem, but on the consequences: if one problem is solved, does another problem get better or worse? Then, thousands of votes later, we ended up with what I think is the first crowd-mapped network structure of a complex problem. It's amazing, so think about it. Each of these nodes is a human brain, an expert, saying this is a critical challenge. The lines are human brains saying, if this problem is solved, I think it has a really strong influence on this other one. It's literally like a collective nervous system of experts. It wasn't a huge crowd; it was a focal group of experts. It's a map of their collective expert understanding of the problem.
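To make that concrete, here is a minimal sketch, not the We the Data pipeline itself, of how expert votes of the form "if problem A is solved, it strongly influences problem B" might be aggregated into a weighted, directed network. The problem names, and the choice of Python with networkx, are assumptions for illustration only.

```python
# Minimal sketch (assumed, for illustration): aggregate expert influence votes
# into a weighted, directed graph. Problem names are hypothetical.
from collections import Counter

import networkx as nx

# Each vote says: solving `source` strongly influences `target`.
votes = [
    ("protect personal identity", "document violence safely"),
    ("protect personal identity", "trust in the system"),
    ("digital access", "document violence safely"),
    ("protect personal identity", "document violence safely"),  # repeats add weight
]

G = nx.DiGraph()
for (source, target), weight in Counter(votes).items():
    # Edge weight = number of experts who drew that influence link.
    G.add_edge(source, target, weight=weight)

for source, target, data in G.edges(data=True):
    print(f"{source} -> {target} (votes: {data['weight']})")
```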

ERIC BERLOW 11:15

Now, I know this looks like a hairball, but this is where we use algorithms to help us see things that we couldn't see otherwise. So, we could sort all those problems by ones that, if solved, solve many but are influenced by few. Then we could sort them again by ones where the outgoing influences are stronger than the incoming ones. And then remember, each of these dots is a problem posed by an expert, so the ones in yellow in the upper right could be considered undernourished catalysts that have potentially broad reach, because they're weakly influenced by few other problems, but if solved, they strongly influence many. For example, if I'm a citizen witnessing violence in the streets, and I know I can protect my personal identity, I'm more likely to document and share that event, even its location, and then we might also be able to catalyze a bigger community to do that, because they trust in the system.
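Continuing the hypothetical graph G from the sketch above, the sorting described here could look something like the following: compare each problem's outgoing reach and strength against its incoming ones, and flag the candidates whose influence flows mostly outward. This is only an assumed reading of the method, not the project's actual code.

```python
# Assumed sketch: flag "undernourished catalysts" in the hypothetical graph G,
# i.e. problems that strongly influence many others but are influenced by few.
def influence_profile(G, node):
    out_strength = sum(d["weight"] for _, _, d in G.out_edges(node, data=True))
    in_strength = sum(d["weight"] for _, _, d in G.in_edges(node, data=True))
    return {
        "out_reach": G.out_degree(node),  # how many problems this one helps
        "in_reach": G.in_degree(node),    # how many problems feed into it
        "out_strength": out_strength,
        "in_strength": in_strength,
    }

catalysts = []
for node in G.nodes:
    p = influence_profile(G, node)
    if p["out_reach"] > p["in_reach"] and p["out_strength"] > p["in_strength"]:
        catalysts.append(node)

print("Candidate catalysts:", catalysts)
```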

ERIC BERLOW 12:08

Now we can focus on those potential catalysts with broad reach and zoom in on them. And again, let the algorithms go to work and recolor and sort these by problems that are more connected to one another. From the original ninety problems, four grand areas emerge. We've got digital access, which is just the basic technical infrastructure for access to the underserved. Clearly a really necessary foundation, but it's not sufficient: people won't participate if they can't trust. People like Patricia and her colleagues need to trust that they can control their personal data exhaust. Then, just like my dad is intuitively answering questions with data without even knowing it, Patricia needs to be able to map trends of violence and then predict hotspots of risk before they happen.
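The "recolor and sort by how connected problems are to one another" step reads like a clustering pass over the same network; the talk does not say which algorithm was used, so the following is purely a hypothetical sketch using modularity-based community detection on the toy graph G from above.

```python
# Hypothetical sketch: group tightly interconnected problems into "grand areas"
# via modularity-based community detection on the undirected vote graph.
from networkx.algorithms.community import greedy_modularity_communities

communities = greedy_modularity_communities(G.to_undirected(), weight="weight")
for i, problems in enumerate(communities, start=1):
    print(f"Grand area {i}: {sorted(problems)}")
```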

ERIC BERLOW 12:56

Then finally, the other thing that popped up was the importance of openness to copy and modify. Patricia was essentially asking for some kind of WordPress for citizen journalism that she could use, tailor to her own situation in her own language, and then maybe, in the process, improve. These emerged from the community, and they're obviously still broad. But the idea was to put some bounds on this huge problem, because the problem with big problems is that they're so big we don't know where to begin, or we break them down into little problems and it's so complex we don't know where to begin. I think this actually scares a lot of really creative people away.

ERIC BERLOW 13:33

Creativity needs constraints, and I think we can actually get that from the architecture of the problems themselves. It's not about creating a wiring diagram to model this dynamically and engineer solutions; it's really about mapping collective expert understanding of the problem, and then using the architecture that emerges to help bound the problem and focus on what might matter most. What's the interesting problem to solve? This may seem kind of obvious to you in retrospect, but it wasn't obvious at the beginning, and I always say gravity was obvious in retrospect, too. You've got to think about it: after we posted this online, Bill Hoffman from the World Economic Forum called me and he said, "Wow, you know, we've been working on this problem about the personal data economy for a couple of years now with a global group of experts. And with all these experts, the problem is so complex that people are talking at cross purposes, nobody can agree how to move forward, and you guys totally nailed it. You showed that yes, this is complex. Let's dive in. But we've defined it. We know where to begin. Let's do something." He found it so valuable that we created a short data video together to show at Davos in January, so we could help define this global agenda and focus the conversation on what mattered most.

ERIC BERLOW 14:55

The idea here, again, is that a problem well defined is a problem well solved. Then we know where to focus energy on creative innovation. This is an example of where we tried to take input from humans, expert input that required, really, some cognitive work, and then combine that with algorithms in a way to find patterns we couldn't have seen before. Hopefully, that makes us all a little bit smarter. For example, we've got this picture: if we were investing in this new personal data economy and we had a myopic view, we would think, let's just invest in something that's immediately profitable, invest in a business. But now, we can actually think about investing in a whole new economic ecosystem. For example, maybe investing in the ability for everyday people to anonymize their identity is not really immediately profitable, because all the money is to be made in owning people's personal data. But if you solved the problem of trust, now we've got this big-picture view. We can see that it may catalyze a whole new economic ecosystem of sharing that wasn't there before, and could be incredibly profitable.

ERIC BERLOW 16:00

Maybe it makes individual citizens smarter, too. There's some really interesting work in cognitive psychology by Steve Sloan and others showing that if you ask people to really unpack the complexity of a problem, to dive in and articulate it, people with really extreme views actually get much more moderate. We're looking to build out the tool now for broader citizen engagement, and imagine if, for example, people would vote on initiatives not by voting on what's the most popular, but by voting on consequences: if this thing passed, what would it influence? Then compile all that collective information. I think we can go bigger and really start to map out larger problems. Here are a couple of additional prototypes we did with smaller groups of people on Middle Eastern conflict and the future of energy.

ERIC BERLOW 16:48

Any time we do this, the patterns are not random, the complexity is not random; there's an architecture to it. Some of our work in ecological systems suggests that the bigger the network, the more predictable it is, or the more we can glean about the dynamics just from static attributes of the structure alone. This is a really big topic in network theory now: trying to learn what we can and can't glean from the architecture of a network without modeling all the dynamics and all the error-propagation issues that brings up. There's some really interesting work coming out of that; you see it in Nature and Science all the time. I think we are, like it or not, in this long-term, intimate relationship with algorithms. The thing is, like any good relationship, it requires work; it's not, left to its own devices, inherently going to make us smarter. I think it has the potential to if we really nourish it and actively, consciously think about what we're doing. In everything that we do, when we're developing new tools, if we step back and constantly ask ourselves, is this really making us better than the sum of our parts? Is this making us a more collectively creative society? If we do that, we can nudge this relationship in the right direction and, I think, accomplish some really big things that we've never done before. Thank you. [applause]
