
Summary:

As forward-thinking as Apple is, perhaps the company hasn’t forged a new path at all. Through a technique of observe, perfect and discard, Apple has been heading for some time in one direction — along the pre-defined path into the era of ubiquitous computing.


There is definitely some sort of Zen that makes Apple, well, Apple: taking something obvious and making it somehow better, somehow cooler, somehow new. How does the company do it? By observing how consumers interact with technology and experimenting ad nauseam internally until it gets things exactly the way it wants them. This includes abandoning the past when it no longer makes sense. So when Apple’s own Tim Cook declares that merging a refrigerator and a toaster is not good for the consumer, he may just have a point.

But as forward-thinking as the company is, perhaps Apple hasn’t created a new path at all. Through a technique of observe, perfect and discard, Apple has been heading for some time now in one direction — along the pre-defined path into the era of ubiquitous computing.

Ubiquitous computing defined

In some ways, this path is as logical as Moore’s Law. Look at the history of computing — from the mainframe era, when there was one computer for many consumers, to the personal computing era, when there was one computer for each consumer, to this new era, when there are many computers for each consumer — and compare the number of computer chips to the number of consumers using those chips. At its foundation, ubiquitous computing can be summed up by this simple principle of ratios.

The modern concept of ubiquitous computing came from Mark Weiser in 1988, at the Computer Science Lab at Xerox PARC (sound familiar?). The theory proposed a seamless, almost invisible connection between consumers and computers, one that would help drive the ratio from one computer shared by many people toward many computers serving one person.

[Image: Three Human to Computer Ratios]

Apple’s tabs, pads and boards

Even considering the most radical interpretations of ubiquitous computing, such as smart dust, the main point has remained the same: we will soon be overrun by computer chips. There are, however, three very distinct platforms in this well-defined post-PC era that we have all become accustomed to, not unlike the three device classes we see evolving within the iOS platform today:

Gesturing tabs: Mobile technology already had small chips, powerful batteries, geolocation services and wireless networking. But that was not enough to win over the masses and drive us all to purchase multiple computing devices. It was the way consumers interacted with these smaller devices that needed to change. For a long time, it was thought that voice recognition was going to propel us into the next era of computing, but that never happened.

Leveraging the fact that there were approximately 100 million iPod users, Apple used convergence to its advantage, introducing those users to a series of simple touch-based gestures on a nearly buttonless device. In the early years of the iPod, we were all trained on the scroll wheel; with touch-based gestures on a wide-open screen, that paradigm was taken one step further. Just as the mouse accompanied the transition from the terminal-based mainframe era to the PC era, the post-PC era was ushered in by a new way of interacting with computer chips: touch.

[Image: Multi-touch gestures]

Revolutionary pads: As soon as people became familiar with this new way of interacting with computers, it was time to challenge the personal computer paradigm directly. Netbooks attempted to continue the personal relationship with consumers by maintaining the 1:1 ratio. Tablets such as the iPad are more specialized and were never meant to be a total replacement for a traditional, general-purpose personal computer.

The rapid rise and immediate success of the iPad was proof positive that consumers were ready for a third major computing device in their lives. With “pads” being used by pilots, students and doctors, and in restaurants, kitchens and workplaces, the iPad was proving to be a specialized, place-based appliance rather than a general-purpose personal computer. As powerful as the third-generation iPad is, it will never replace the personal computer, just as the personal computer never really replaced the mainframe.

[Image: Patent Number 8018579]

Experimental boards: The current AppleTV may be a stretch to accept as a computing platform, as it has no keyboard, no mouse, no touch display, just a very simple IR remote. That is, unless you happen to be near one with a Mac or iOS device; then the AppleTV becomes an extension of that device on a much larger screen. Although it is marketed alongside the iPod, it is just as closely related to the AirPort Express. Perhaps Apple needs to look toward Nintendo’s Wii or Microsoft’s Kinect; otherwise the AppleTV will be doomed to remain a mere accessory to its tabs and pads.

Take a look at what HBO has done with the Xbox Kinect as an example. If Apple’s recently awarded gesture-based patents are any indicator, this may be where it is headed as well. The interaction between consumer and computer chip has not yet been ironed out enough for this final platform — the boards of ubiquitous computing — to take hold of our day-to-day lives.

One human relation-chip

Making each device “aware” of how consumers use all of the other devices they own is the key to accelerating the adoption of more than one computing device. While Apple may in fact be the only company in the world to have constructed a homogeneous synergy between its personal and ubiquitous computing platforms, it is certainly not the only company trying to forge the relationship between the user and the computer chip. For the relationship between consumers and computing devices to become truly invisible, these new smart devices will need to know more and more about the consumers who own them: everything consumers have done in the past, what they are doing now and even what they plan on doing later.
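As a toy illustration of the kind of shared-state plumbing such a relationship implies, consider a store that any of a user’s devices can write to and read from, so one device can pick up where another left off. Every name here (CloudStore, Device, publish, fetch, resume) is a hypothetical sketch, not Apple’s actual iCloud API:

```python
# Toy model of cross-device state sharing, in the spirit of iCloud-style sync.
# All class and method names are hypothetical illustrations, not a real API.

class CloudStore:
    """Central store holding the latest state for each user key."""
    def __init__(self):
        self._state = {}

    def publish(self, key, value):
        # A device pushes its latest state for this key.
        self._state[key] = value

    def fetch(self, key):
        # Any device retrieves the latest state; None if nothing stored yet.
        return self._state.get(key)


class Device:
    """A phone, tablet, or TV that reads and writes shared user state."""
    def __init__(self, name, store):
        self.name = name
        self.store = store

    def update(self, key, value):
        # e.g. the phone records that the user paused a video at 01:23
        self.store.publish(key, value)

    def resume(self, key):
        # Another device later picks up the same state.
        return self.store.fetch(key)


store = CloudStore()
phone = Device("phone", store)
tv = Device("tv", store)

phone.update("video.position", "01:23")
print(tv.resume("video.position"))  # prints 01:23: the TV resumes where the phone left off
```

The point of the sketch is only the shape of the relationship: the devices never talk to each other directly, yet each behaves as if it knows what the others have done, because all of them share one invisible store.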

[Image: iCloud and Siri]

Perhaps this is the reason Tim Cook stated that Apple’s “best years lie ahead of us.” With technologies like iCloud and Siri, Apple will likely play a larger and larger role in forging the relationship between consumers and the growing number of computing devices in our daily lives. It is not about selling more of these individual devices; it is about enabling the relationship between an individual and a collection of specialized devices. And Apple knows this.


  1. I’m all but certain that tablets won’t supplant PCs. They’re simply too good as a personal device that complements a full PC system, if not as implemented today. It would basically function as a second PC screen and personal cloud, storing all of a user’s stateful information like browser tabs, app settings, documents, etc. It wouldn’t matter what system you’re attached to, if any.

  2. Reblogged this on Benjomin and commented:
    Very interesting and thought provoking. Have to admit I am a huge Apple fan and always look forward to new products that they put our way.

  3. Nice, well-thought-out article. It seems that personal computer technology has come full circle, considering the historical path from mainframes and dumb terminals to mainframes marketed as “clouds,” fortunately nowadays ergonomic with slick interfaces compared to their predecessors. We are well into the post-PC era. PCs as we knew them are no longer necessary to most. Many, including many of my friends and associates, have very little use for a PC, with the exception of very specialized turnkey applications/solutions, where one can argue the computer is only a part of the platform and not necessarily the heart of it.

    Already most do not refer to their vehicle, heating system or smartphone as a computer, if ever, although many are well aware that those technologies are computer based. Soon your “computer” might no longer suffer from the limitations of the connotations “computer” inherits because simply how it does what it does is not as important as to what it actually accomplishes.

  4. Tablets will become more powerful, just as cell phones have done and continue to do. Also, everything is getting faster.
    Perhaps the next ratio will be back to 1:1 as people adopt Google Glass, and then 0:1 if in the distant future we move to chips implanted in our brains. Sounds very 1984 now, but then, 1984 sounded very 1984 when it came out in 1949, and is far less so now. Don’t think we’re headed for implants? Read Ray Kurzweil.

  5. Per Clay Richardson and Horace Dediu: it’s not relationships, it’s not demographics, it’s not markets – it’s technology that allows people to do jobs they could not do in the past or did not realize they wanted to do. This was a big part of Apple’s magic.

  6. I just spent over $1800 on a PC. I still own 2 Macs but I am getting a divorce from Apple. For me the whole thing started when Apple F.. me over with Final Cut Pro. I will be using Adobe for editing videos and Pro Tools for audio. Bye bye, Apple. Look at my finger, yeah, the middle one.

  7. This is an excellent article, thank you. Especially interesting to this reader was the part about ubiquitous computing having its meme roots in PARC. This makes a lot of sense – simply open a terminal shell in OS X Lion and check out the “Ubiquity” folder in each user’s home directory:

    ls ~/Library/Application\ Support/Ubiquity/

    Ubiquity (iCloud) sockets anyone?

    It will be interesting to see if ubiquitous computing, having roots at PARC, will combine with aspect-oriented programming (also from PARC in the 1990s, per Dr. Chris Maeda et al.) in the future of ubiquitous application development!
