Perhaps the strongest criticism of the iPhone has been that it doesn’t support multitasking, aside from a few of Apple’s own system-level applications that are included on the device and can’t be deleted. Yet the iPhone sells like hotcakes, and Apple has a technical solution, push notifications, that accomplishes essentially the same thing. If multitasking is so important, as the critics, pundits and technology bloggers will tell you, why have the iPhone and its sibling the iPod touch become two of the most successful electronic devices of all time?
Because the technology press and hardcore technology users have an unprecedented platform from which to speak and be heard. Period. End of story.
Last week’s iPad announcement made this abundantly clear. The technosphere has labeled the iPad an unqualified failure, in large part due to its lack of multitasking. News flash: multitasking is overrated. It’s not nearly as important to average, everyday users as it is to the people who cover technology for a living. Despite the fact that Palm’s WebOS and Google’s Android both support multitasking, neither has come anywhere close to the success of the iPhone.
With the iPhone and now the iPad, Apple is clearly targeting a mass consumer audience. Many of these users aren’t comfortable with computers. They use them almost because they have to, for email and a few other core tasks. Obviously this is changing, as the number of computer and Internet users continues to grow. But it’s not because computers and the Internet are incredibly easy to use, because they aren’t. In fact, the difficulty of using computers has probably slowed the adoption of computing and Internet services into consumers’ daily lives, and part of that complexity comes from multitasking.
Here are three observations that also lead me to believe that multitasking just isn’t that important to most people.
- I have facilitated or observed literally thousands of web usability test sessions over the last several years. In watching people use computers and the web, I’ve noticed three very specific behaviors: 1) most people instantly maximize windows to fill their screens and minimize distractions; 2) only the most tech savvy users use alt-tab (Windows) or command-tab (Mac) to switch between apps; and 3) people are far more likely to be confused when multiple windows and apps are open.
- There has been a surge in interest in the last few years for desktop applications that take over the screen. This is true of Firefox, for example, which has a full-screen “kiosk” mode, and several word processors designed to let users write without distraction.
- Despite pretty regular usage, my wife still struggles with some basic Mac operations related to multitasking, such as closing windows in an attempt to quit an app, switching between apps, and not realizing which window is active. While she still uses the Mac, she has moved more and more of her computing activity to her iPhone because she doesn’t run into these same issues.
Sure, many of us heavy users like multitasking on our computers and might not feel nearly as productive without it (I say feel because there is evidence to suggest that we aren’t really multitasking but fast switching, and performance suffers when we do). But the majority of people in the world aren’t like us. They want something that is really easy to use and understand, and that provides some level of enjoyment or helps make their lives easier. Apple’s iP products (iPhone, iPod Touch, iPad) are designed for these people.
What Apple is really doing is making technology disappear, surfacing content in a very human way. Even if processing power and battery life are currently capable of delivering multitasking, I’m not sure Apple will implement it in the way we think of multitasking today. Perhaps it will allow background processing and easier switching among apps, which get at core user needs, but I expect it will maintain a solotasking approach well into the future of its product designs.