9 Comments

Summary:

Computing pioneer Doug Engelbart — who passed away earlier this week — had a vision of how the technologies he developed could help us to create a better world, not just a way to sell more smartphone apps or get people to click on advertising.


The word “visionary” gets thrown around a lot, but Douglas Engelbart — who passed away earlier this week — arguably deserved that title more than most. His research into how computers could augment our ability to think and collaborate led to the development of the computer mouse and many of the other innovations we associate with personal computing. At a time when innovation in Silicon Valley often seems to consist of inventing a new iPhone app or another walled garden for showing people ads, it’s worth remembering that we need those who are thinking farther ahead than just their next round of financing.

He was best known for inventing the mouse, but that was just one tiny part of what Engelbart contributed to the computing revolution. And his vision was not centered on what kinds of products he could develop and how they could be monetized — or how he could patent his innovations and charge others for using them — but on what they could allow people to do and how they could increase our ability to understand the world around us.

A vision of a better world

Engelbart — who was influenced by Vannevar Bush’s visionary 1945 essay “As We May Think,” and the idea of a personal workstation with all the world’s knowledge connected to it — believed that only by collaborating via computers would scientists be able to process the vast amounts of information needed to solve the problems of the modern world. As he put it in one of his manifestos, prepared for the Air Force Office of Scientific Research in 1962:

“By augmenting human intellect we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems. Increased capability in this respect is taken to mean a mixture of the following: more-rapid comprehension, better comprehension, the possibility of gaining a useful degree of comprehension in a situation that previously was too complex, speedier solutions, better solutions, and the possibility of finding solutions to problems that before seemed insoluble.”

With the incredibly powerful technologies we have in our hands — wirelessly-networked supercomputers many of us use to look at funny pictures or play Angry Birds — it’s almost impossible to fathom just how far ahead Engelbart was thinking when he started the Augmentation Research Center and began developing the technologies that he demonstrated in 1968, in a legendary presentation at Stanford University often called “The Mother Of All Demos” (video of which is embedded below). At a time when computers were still room-sized behemoths that used punch cards, Engelbart was already thinking of a network of individually-controlled desktop computers and what it could do.

Another “Facebook, but for X”

As former Apple designer and developer Bret Victor points out in a post about Engelbart’s legacy, despite almost half a century of innovation around computing and collaborative technologies, we have still only scratched the surface of what he originally had in mind. And while it’s difficult to say for sure what he would have thought of Facebook or the iPhone, it’s not hard to imagine that he might have been disappointed by how little they take advantage of what is possible with a vast network of interconnected users, just as Sir Tim Berners-Lee is concerned about what we have managed to do with the web.

It’s not that building commercial technologies that people can use to buy things or play games is bad, but when you look at the sheer size and scale of the resources devoted to those things versus the amount of money and time spent on collaborative intelligence or other tools that Engelbart dreamed of, it can get depressing. Do we really need another “Like Facebook, but for X?” or “Like Pinterest, but for Y?” Do we need to spend as much time as we do patenting things like rounded corners on smartphone icons?

It’s easy to blame the venture-capital industry for failing to put enough money behind truly innovative technologies or concepts — and on that point it’s worth noting, as Tom Foremski pointed out, that Engelbart struggled to get funding for much of his life — but VCs have their eyes on the bottom line just as many entrepreneurs do, and it often seems as though the easiest way to generate a blockbuster product or service is to imitate a previous blockbuster. So why not just do that, if that’s where the money and/or the glory is?

Where have all the visionaries gone?

Even Steve Jobs, who turned Apple into one of the most valuable companies in the world, seemed more like Engelbart in some ways than he did a modern CEO or entrepreneur: yes, he commercialized research from Xerox’s PARC research center (much of which was based on Engelbart’s work) but he always seemed to be motivated more by a vision of the future and his own personal design aesthetic than by dreams of how much revenue he could generate or how much advertising he could help Madison Avenue sell.

Clearly there are still those who qualify as technology visionaries — a group that would have to include Elon Musk of Tesla and probably Jeff Bezos of Amazon. And Om is right when he says that there is innovation happening on the fringes of Silicon Valley, whether it’s research into using big data for genomic analysis, or augmented reality, or new chip technologies.

But at the same time, we seem to have lost the kind of big-picture thinking that Engelbart and others specialized in: the kind that didn’t have as its focus a specific market where things could be sold, or a better way of targeting advertising, but a vision of a world that could be changed for the better in some dramatic way through the use of technology — and a way of connecting the dots to show us all how we might be able to get there. And we are poorer for that.

Post and thumbnail images courtesy of Shutterstock / alphaspirit

  1. Steve Ardire Friday, July 5, 2013

    Nice post Mathew

    As he put it in one of his 1962 manifestos “By augmenting human intellect we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems…..”

    Yes, finally, 50 years later we’re now beginning to address this with cognitive computing solutions, but not so much with the more hyped Big Data solutions.

    1. Mathew Ingram Friday, July 5, 2013

      Thanks, Steve.

  3. While it’s true that big names are afraid to take risks (almost everybody), some are greedy and cripple their products to better bleed their customers dry (Apple and even Google), and on the mobile side the gold rush is annoying (the market will mature; a bit of initial chaos is not surprising), the landscape is not all that bleak.
    The internet is growing, bringing more people online is fantastic, developing nations are rising and a lot of great things could come from them.
    Online education and crowdfunding could facilitate a great wave of innovation.
    There are also a few emerging things that can change the world in a big way like 3D printing, energy harvesting and space exploration.

    In the end the tech world is only as functional as the whole world (or country), and that’s a bit of a problem nowadays when the laws are not made by the people, for the people anymore, and safeguards that are supposed to ensure the survival of our society are ignored.

    PS: got to point out the irony here — paidContent, a sister site, focuses on a concept that just can’t be the future (that became painfully clear more than a decade ago), and there’s even the paywall here….

  3. I agree with the sentiment that: “The best minds of my generation are thinking about how to make people click ads. That sucks.” However, if there’s any silver lining, it’s that the byproduct of solving that problem is fabulous technology that can and will be used by others to solve more intriguing questions. I guess my gripe with the modern Silicon Valley attitude is that once someone makes enough money selling people ads, they don’t emulate someone like Elon Musk and use that money to attack more interesting problems; they create another start-up that’s like “Facebook for X” or “Pinterest for Y”.

  4. ‘The Mother of All Demos’, ‘The First Internet Marketing Conference’ with Marc Andreessen, plus ‘MacWorld 2007’ with the iPhone launch, are the greatest demos ever.

    Not only did each demo show us the future of technology, but each one changed the world.

  5. Visionaries are still here — consider that the global community of Innovation Games Certified Collaboration Architects is using in-person and online games to help cities from San Jose, CA to Aalbeke (Kortrijk), Belgium use serious, collaborative games to create a better future.

  6. ProbablyMySecondAccount Friday, July 5, 2013

    > And while it’s difficult to say for sure what he would have thought of Facebook or the iPhone

    Am I missing something? He did just die this week, right? He probably had thoughts on Facebook and the iPhone. Hopefully he’s been interviewed in recent years.

  7. I love love love Google: Glass, driverless cars, ubiquitous broadband, etc.

  8. Well, to start with, you media people should stop glorifying the 17-year-old webmasters turned millionaires. Don’t blame others.


Comments have been disabled for this post