The word “visionary” gets thrown around a lot, but Douglas Engelbart — who passed away earlier this week — arguably deserved that title more than most. His research into how computers could augment our ability to think and collaborate led to the development of the computer mouse and many of the other innovations we associate with personal computing. At a time when innovation in Silicon Valley often seems to consist of inventing a new iPhone app or another walled garden for showing people ads, it’s worth remembering that we need those who are thinking farther ahead than just their next round of financing.
He was best known for inventing the mouse, but that was just one tiny part of what Engelbart contributed to the computing revolution. And his vision was not centered on what kinds of products he could develop and how they could be monetized — or how he could patent his innovations and charge others for using them — but on what they could allow people to do and how they could increase our ability to understand the world around us.
A vision of a better world
Engelbart — who was influenced by Vannevar Bush’s visionary 1945 essay “As We May Think,” and the idea of a personal workstation with all the world’s knowledge connected to it — believed that only by collaborating via computers would scientists be able to process the vast amounts of information needed to solve the problems of the modern world. As he put it in one of his manifestos, prepared for the Air Force Office of Scientific Research in 1962:
“By augmenting human intellect we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems. Increased capability in this respect is taken to mean a mixture of the following: more-rapid comprehension, better comprehension, the possibility of gaining a useful degree of comprehension in a situation that previously was too complex, speedier solutions, better solutions, and the possibility of finding solutions to problems that before seemed insoluble.”
With the incredibly powerful technologies we have in our hands — wirelessly networked supercomputers many of us use to look at funny pictures or play Angry Birds — it’s almost impossible to fathom just how far ahead Engelbart was thinking when he started the Augmentation Research Center and began developing the technologies that he demonstrated in 1968, in a legendary San Francisco presentation often called “The Mother of All Demos.” At a time when computers were still room-sized behemoths that used punch cards, Engelbart was already thinking of a network of individually controlled desktop computers and what it could do.
Another “Facebook, but for X”
As former Apple designer and developer Bret Victor points out in a post about Engelbart’s legacy, despite almost half a century of innovation around computing and collaborative technologies, we have still only scratched the surface of what he originally had in mind. And while it’s difficult to say for sure what he would have thought of Facebook or the iPhone, it’s not hard to imagine that he might have been disappointed by how little they take advantage of what is possible with a vast network of interconnected users, just as Sir Tim Berners-Lee is concerned about what we have managed to do with the web.
It’s not that building commercial technologies that people can use to buy things or play games is bad, but when you look at the sheer size and scale of the resources devoted to those things versus the amount of money and time spent on collaborative intelligence or the other tools Engelbart dreamed of, it can get depressing. Do we really need another “Facebook, but for X” or “Pinterest, but for Y”? Do we need to spend as much time as we do patenting things like the rounded corners on smartphone icons?
It’s easy to blame the venture-capital industry for failing to put enough money behind truly innovative technologies or concepts — and on that point it’s worth noting, as Tom Foremski pointed out, that Engelbart struggled to get funding for much of his life — but VCs have their eyes on the bottom line just as many entrepreneurs do, and it often seems as though the easiest way to generate a blockbuster product or service is to imitate a previous blockbuster. So why not just do that, if that’s where the money and/or the glory is?
Where have all the visionaries gone?
Even Steve Jobs, who turned Apple into one of the most valuable companies in the world, seemed more like Engelbart in some ways than like a modern CEO or entrepreneur: yes, he commercialized research from Xerox PARC (much of which was based on Engelbart’s work), but he always seemed to be motivated more by a vision of the future and his own personal design aesthetic than by dreams of how much revenue he could generate or how much advertising he could help Madison Avenue sell.
Clearly there are still those who qualify as technology visionaries — a group that would have to include Elon Musk of Tesla and probably Jeff Bezos of Amazon. And Om is right when he says that there is innovation happening on the fringes of Silicon Valley, whether it’s research into using big data for genomic analysis, or augmented reality, or new chip technologies.
But at the same time, we seem to have lost the kind of big-picture thinking that Engelbart and others specialized in: the kind that didn’t have as its focus a specific market where things could be sold, or a better way of targeting advertising, but a vision of a world that could be changed for the better in some dramatic way through the use of technology — and a way of connecting the dots to show us all how we might be able to get there. And we are poorer for that.