
Summary:

New research suggests that we are less likely to remember information when we know it is stored somewhere else. Some feel this is going to make us less human in some way, but I for one am glad to outsource parts of my brain.

Science magazine has published some research into how our memories are influenced by the availability of computers as a source of information, and this has some in a tizzy about the implications of outsourcing our brains. Author Nick Carr, for example — who has written a whole book about how the web is changing the way we think and making us more shallow — says he worries this phenomenon is going to make us less human in some way. But is that really a risk? I don’t think so. I, for one, am glad to outsource the duty of remembering miscellaneous facts to the cloud, because it leaves me free to do more important things.

In a nutshell, the Columbia University psychologists who published the study performed a number of experiments designed to test whether subjects remembered certain things better or worse when they were told that the information — such as “An ostrich’s eye is bigger than its brain” — would be stored in a computer somewhere or would be available through a search engine. Not surprisingly perhaps, people’s memories were somewhat less reliable when they knew the answers they were seeking would be stored for later retrieval (there are more details at the Columbia website).

Implanting forgetfulness?

Carr says he’s worried that by losing these facts and details we store elsewhere, we will become less human in some way, or lose some core of ourselves. But is that really what’s happening? I don’t think so. It’s not like I’m suddenly going to forget my son’s first steps (oh, that’s right — I have daughters!) because I use Google to look up who starred in that movie we watched a couple of years ago, or to figure out who the head of the United Nations is. It’s worth remembering that the invention of writing triggered similar fears, as Plato reminds us in The Phaedrus, quoting the King of Thebes:

If men learn this [writing], it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder.

Carr also makes the argument in his book “The Shallows: What the Internet Is Doing to Our Brains” that we are becoming not just dumber as a result of the web, but also (supposedly) less interesting, because our brains are being trained to focus on the ephemeral and the trivial instead of the important things we should be spending time on. I took issue with this kind of fear at the time, as did some others, and I think Carr is being similarly alarmist in this case. Besides, if we use the cloud to remember the trivial and ephemeral for us, wouldn’t that be a good thing by Carr’s definition?

Do we still need to memorize things?

I know that in my parents’ time, memorization of huge lists of facts and figures and Shakespearean sonnets was standard, because that was the criterion by which knowledge was judged. But what difference does it really make if I can’t remember when the War of 1812 was? (That’s a joke, by the way.) Is my experience of the things that matter in life going to be impaired because I don’t know who signed the Magna Carta? I can see how this would be a problem if a trivia game suddenly comes up while I am camping in the woods, but other than that, I don’t see why I shouldn’t outsource that to the cloud — the same way lots of people used to outsource it to Encyclopedia Britannica.

As one commenter on Google+ mentioned when I shared the Science magazine article in my stream, the benefit of having something like the Internet available at all times is that it is the most comprehensive collection of knowledge ever invented (although obviously not all of it is correct). How can that not be a good thing? Said Justin Fogarty:

The plus side is that the whole of human knowledge is nearly at our fingertips. I will not miss card catalogs, the Dewey decimal system or heavy book bags.

Computers can’t really replicate memory anyway. All they can do (so far, at least) is store facts — but facts are not memories. Real memories are made up of smells and sounds and emotions, and no computer or cloud-based system can store those things. But what the cloud can do quite well is store my phone numbers and the photos I took on a particular day or the tweets I sent (something an app called Momento is extremely good at) and leave me free to relive the memories associated with those facts.

To me, that’s a fair trade — the cloud remembers all the boring and mundane details and facts of my life (yes, I use Facebook to remember when people’s birthdays are, as I expect a lot of people do) and I get to focus on the things that are really important.

Post and thumbnail photos courtesy of Flickr users Stefan and Tim O’Brien

  1. Whenever we inevitably start upgrading our brains with machine intelligence, first with neural chips and later with nanobots, and we have access to virtually all of human knowledge natively in our brains, that will be the true revolution. It won’t happen suddenly, though; in increments so small that few people will stop and freak out, we are going to turn ourselves into cyborgs.

  2. This is really an insightful study into how our memories have evolved in the face of what I am calling a “global memory” (I haven’t finished my blog piece on this yet). Where once straight memory recall was critical for survival, it isn’t any longer, especially when there is a “global memory” out there that we can all tap into with our computers and smartphones and tablets. There is some real evolution going on here as it relates to the human condition.

    Here is my blog post on this whole evolution of memory:

    http://blog.jasonthibeault.com/index.php/2011/07/15/is-google-bad-for-us/

    1. Thanks, Jason.

  3. Nice piece, Mathew. My central cultural thesis is that we’re moving into a new era, one that has been called by some “the Age of Participation.” My views on this particular issue go back to 1977, when my then-7-year-old daughter Jenny asked me a problematic question. This was the day of the Texas Instruments calculator, a device that so horrified educators that it was initially banned from all schools. That changed later, and my daughter had her first one at age 7. “Why,” she asked me, “do I have to learn math, if I have one of these?” It was and remains a powerful and provocative question.

    What I think is happening is this: we’re moving to an era where participation in math, for example, has greater value than simply learning it. Of course, some will argue that you can’t participate in math without learning it, but I’m of the mind that it’s likely just a matter of shifting what we’re teaching.

    And while the world seems bent on ranking math and science as the biggies for tomorrow, I’m not convinced. That may have been true for the modern era, but I think we’ve entered a right brain renaissance, and that’s certainly not about the rules and formulas of math and science.

    Math and science, for example, won’t solve our economic crisis; that’s a job for creativity.

    But I digress…

    1. I remember my daughter asking me the same thing, Terry — thanks for the comment.

  4. We all have talents. I don’t have many, but as a young kid, I didn’t know what I should remember, so I remembered everything – literally.

    An education scientist later told me I had rewired my brain in my formative years, sort of like a one-off ‘photographic memory’. I lost it after some minor drain bamage, as an adult. But in recent years, as I read that actively using our memory can stave off Alzheimer’s, etc., I worried again that I should try to remember a lot. The problem now, though, is that attempting to do so seems to slow and drain present-time cognition and focus.

    I was relieved when neuroscientist Bradley Voytek replied to me thus: https://twitter.com/#!/bradleyvoytek/status/92328565445636096

    1. Thanks for the comment, Ed.

  5. Just get used to it. Eventually, we might end up like Johnny Mnemonic with memory upgrades in our brains… that is, assuming the artificial intelligence does awaken first!

    http://mythoughtsontechnologyandjamaica.blogspot.com/2011/07/columbia-u-says-google-erodes-memory.html

  6. I have to somewhat disagree. On the surface, yes, of course it might be beneficial that we can look trivia up instead of having to remember it. But below this lies the question of whether we will still be able to define where trivia ends and where significant information starts. As a designer, I find that sometimes the most random connections are the ones that lead me towards breakthrough moments. It is because I do not go for the obvious, but use non-linear thinking. If we all streamlined the process of storing information in our brains, and we only stored the “most important” information (whatever the definition of that might be), these connections would be lost. There would be no more random connections. And this would have a major impact on the way our creativity works. It might not even work anymore (OK, I admit, I am exaggerating here). But the point is that it is the random bits of information that help us see things from different angles, in unexpected ways. And outsourcing this information to a cloud memory might leave us without these random bits.

