The phone is poised to usurp the PC’s throne

No one disputes the transformative effect that desktop and mobile computing platforms have had on human communication in recent years. While the hierarchy of these devices is continuously evolving, phones are increasingly replacing desktops and laptops as the primary computing device in our digital lifestyle. Google found that in five key markets (U.S., UK, Germany, France and Japan), more people have a mobile internet-capable device than a PC or laptop. In the U.S., mobile ownership leads by eight percent.

We’re also seeing growth in a complementary concept — let’s call it Nomadic Computing 2.0. In this model, the phone itself plays a central role in powering computing beyond its own small form factor. As the vice president of product design for the mobile media company Zumobi, I’ve noticed an interesting trend in the next generation of mobile user experiences. Increasingly, the functions and user interface scenarios that were once the purview of our other devices (PCs, laptops, TVs) are being powered directly by the computing hardware and software within our phones.

Devices like the television, once at the center of our media lifestyle, will soon be relegated to the role of temporary output displays for the phone. We can already see the first signs of this in the novel applications of Apple’s AirPlay technology. Playing Real Racing 2 on your phone? Have an Apple TV? With a few taps, the driver’s-eye view is sent wirelessly to the TV, while the phone itself provides the input hardware (an accelerometer-powered steering wheel) and displays useful secondary content (like the racetrack route). Here’s the surprising twist: in this scenario, both of these screens are computed and rendered by the phone. The Apple TV is just passing through a stream of pixels for presentation on the TV’s larger display. This scenario illustrates how the iPhone and the iPad allow developers to design and render completely different user interfaces when connected to a large display.

The Nomadic Computing 2.0 model leverages the user’s superphone as a central computing core that powers a variety of user interface hardware. In other scenarios, the user’s superphone plugs into purpose-built forms appropriate to the task at hand. Recent Android-based products from Motorola (the Lapdock) and Asus (the Padfone) enable the phone to assume the form factor of a laptop or a tablet, respectively. Again, in the case of these products, the companion phone doesn’t merely dock with an existing laptop or tablet… it becomes each of these forms, with the help of a lightweight, modular screen, battery and keyboard.

The appeal of Nomadic Computing has long been explored by research labs and startups through such projects as IBM’s early MetaPad and OQO’s early line of ultraportable PCs. These ambitious efforts were ultimately hindered by their size (too big), price (too spendy), battery life (too short), cellular broadband (too slow) and software architectures (too complex). Modern superphone-based approaches such as those from Motorola and Asus address each of these limitations through advances in modern phone hardware.

In the realm of iOS, one can imagine a future iPad as nothing more than a slate-sized display “dock” into which you insert your iPhone. Without computing or networking of its own, such a device would be inexpensive and lightweight. Notably, even today, when a user installs a dual-binary iPhone/iPad app (indicated by a “+” in the app store), the software bundle includes all of the code and graphical assets required to render both the iPhone and iPad versions of that app. Is Apple planning ahead for future products that leverage Nomadic Computing technologies? You decide.

The device-centric model of Nomadic Computing 2.0 will not be for everyone. Many will instead choose a cloud-based continuous client model as envisioned by Joshua Topolsky. OnLive and similar services anticipate an even more radical future, in which a thin client device receives pre-rendered pixels streamed from a server in the sky. These emerging computing models have clear merits, and, together with Nomadic Computing 2.0, they will enable new innovations in user experience.

However many models of computing emerge throughout 2012 and beyond, it’s evident that the phone is well on its way to usurping the throne from its PC overlords and becoming the center of our digital lives. As with previously disruptive technologies, these new models of computing will have dramatic implications for the collective businesses of software, hardware, services, media, advertising and entertainment.

John SanGiovanni is the vice president of product design for the mobile media company Zumobi. Prior to Zumobi, John was a program manager and technical evangelist for Microsoft Research, where he managed external academic research funding in the areas of mobile technologies and user interfaces.

Image courtesy of Flickr user ajmexico.

9 Responses to “The phone is poised to usurp the PC’s throne”

  1. Peter James Herz

    This ‘replacing’ rhetoric is getting played out. Let’s admit that phones are a part of a plural technological destiny that is about emerging and sharing, not ‘replacing.’ Workstations will always have their place in the home, office, etc. Phones will have their place everywhere else. The advantages and disadvantages of both platforms are well-known and are not likely to be reconciled suddenly by phones ever catching up with desktop evolution. Tablets might come close, but even they’ll be behind desktops in performance for everything other than consuming content.

    • Succinctly said. Not everyone compiles the Linux kernel (or performs such computing tasks) on a daily basis. For them, a mobile computing device with “the best display device in the vicinity” is a cool option. But there will always be people who will need a “propah PC”. A car cannot replace a truck or a tractor, but most of us use cars most of the time.

  2. Good article, John. I’ve written similar thoughts a few times on my blog, Dave Enjoys. I look forward to these nomadic devices, as you call them, or as I termed them, ‘unified smart devices.’ Whatever they may be called, I love the idea of prices going down as we no longer need a full tablet – just a shell that our phone slides into, etc.

  3. The article starts out on the right track, but then loses its way. Why, oh why, have just a single instance of computational power, whether it is a tablet being a dock for the phone, or all devices being dumb terminals for the cloud? In an era of rapidly falling hardware costs, why take standalone computational capacity out of ANYTHING? Why not cluster adjacent devices (and the cloud) instead? A phone next to a tablet has the computational power of both combined, and can send even heavier (or less immediate) work to the cloud and/or the user’s home server. The TV box not only streams pixels but also boosts the quality of the graphics. Etc., etc. It’s also much more flexible that way if you only have one device, or if a device (or the Internet connection) dies on you.

    • Power users will always appreciate & leverage the improved stability, performance, and flexibility of more complex & distributed architectures. However, history teaches us that simplicity has tremendous appeal for an enormous segment of the user base. For these users, the NC2 model eliminates the need for multiple systems, duplication of hardware, and redundant wired/wireless networks.

      • Steve S.

        Thanks, bro, great article, and thanks for including me in your comments. I was able to view my paycheck info & print it out a lot faster on my smartphone than on my computer. Thanks for the great suggestions & support.

    • I agree with aepxc’s comments — the article starts off on the right foot, then the author gets swept up in a single-device vision. I particularly stopped reading when the author wrote, “…imagine a future iPad as nothing more than a slate-sized display “dock” into which you insert your iPhone…” — that’s just plain stupid. As Sun used to say, “the network IS the computer” (hence, per aepxc, leverage multi-device and cloud computing together, as in distributed computing and ubiquitous computing). Duh. Sigh.