No one disputes the transformative effect that desktop and mobile computing platforms have had on human communication in recent years. While the hierarchy of these devices continues to evolve, phones are increasingly replacing desktops and laptops as the primary computing device in our digital lives. Google found that in five key markets (the U.S., UK, Germany, France and Japan), more people own a mobile internet-capable device than a PC or laptop; in the U.S., the margin is eight percent.
We’re also seeing growth in a complementary concept — let’s call it Nomadic Computing 2.0. In this model, the phone itself plays a central role in powering computing beyond its own small form factor. As the vice president of product design for the mobile media company Zumobi, I’ve noticed an interesting trend in the next generation of mobile user experiences. Increasingly, the functions and user interface scenarios that were once the purview of our other devices (PCs, laptops, TVs) are being powered directly by the computing hardware and software within our phones.
Devices like the television, once at the center of our media lifestyle, will soon be relegated to serving as temporary output displays for the phone. We can already see the first signs of this in the novel applications of Apple’s AirPlay technology. Playing Real Racing 2 on your phone? Have an Apple TV? With a few taps, the driver’s-eye view is sent wirelessly to the TV, while the phone itself provides the input hardware (an accelerometer-powered steering wheel) and displays useful secondary content (like the racetrack route). Here’s the surprising twist — in this scenario, both of these screens are being computed and rendered by the phone. The Apple TV is just passing through a stream of pixels for presentation on the TV’s larger display. This scenario illustrates how the iPhone and the iPad allow developers to design and render completely different user interfaces when the device is connected to a large display.
The Nomadic Computing 2.0 model leverages the user’s superphone as a central computing core that powers a variety of user interface hardware. In other scenarios, the user’s superphone plugs into purpose-built forms appropriate to the task at hand. Recent Android-based products from Motorola (the Lapdock) and Asus (the Padfone) enable the phone to assume the form factor of a laptop or a tablet, respectively. Again, in the case of these products, the companion phone doesn’t merely dock with an existing laptop or tablet… it becomes each of these forms, with the help of a lightweight, modular screen, battery and keyboard.
The appeal of Nomadic Computing has long been explored by research labs and startups through such projects as IBM’s early MetaPad and OQO’s early line of ultraportable PCs. These ambitious efforts were ultimately hindered by their size (too big), price (too spendy), battery life (too short), cellular broadband (too slow) and software architectures (too complex). Superphone-based approaches such as those from Motorola and Asus address each of these limitations through advances in modern phone hardware.
In the realm of iOS, one can imagine a future iPad as nothing more than a slate-sized display “dock” into which you insert your iPhone. Without computing or networking of its own, such a device would be inexpensive and lightweight. Notably, even today, when a user installs a dual-binary iPhone/iPad app (indicated by a “+” in the App Store), the software bundle includes all of the code and graphical assets required to render both the iPhone and iPad versions of that app. Is Apple planning ahead for future products that leverage Nomadic Computing technologies? You decide.
The device-centric model of Nomadic Computing 2.0 will not be for everyone. Many will instead choose a cloud-based continuous client model as envisioned by Joshua Topolsky. OnLive and similar services anticipate an even more radical future, in which a thin-client device simply receives pre-rendered pixels streamed from a server in the sky. These emerging computing models have clear merits, and, together with Nomadic Computing 2.0, they will enable new innovations in user experience.
However many models of computing emerge throughout 2012 and beyond, it’s evident that the phone is well on its way to usurping the throne from its PC overlords and becoming the center of our digital lives. As with previously disruptive technologies, these new models of computing will have dramatic implications for the collective businesses of software, hardware, services, media, advertising and entertainment.
John SanGiovanni is the vice president of product design for the mobile media company Zumobi. Prior to Zumobi, John was a program manager and technical evangelist at Microsoft Research, where he managed external academic research funding in the areas of mobile technologies and user interfaces.