Why Apps Need Some Sense and Sensibility


When it comes to my iPhone (or any smartphone, for that matter), my biggest frustration is when the phone switches between Wi-Fi and 3G networks and just hangs, the data connection stuck in limbo. The same catatonic network state returns when switching between two Wi-Fi networks: instead of finding the strongest network, you are stuck on one that is weak at best.

One would think we would have figured out this hand-off problem by now, right? Wrong. As more and more Wi-Fi networks come into our lives, the hand-off problems are only getting worse. A group of researchers at the Massachusetts Institute of Technology got so frustrated that they started working on solving the problem.

In doing so, they’ve come up with a set of new communication protocols that use information about a smartphone’s movement to improve handoffs. In experiments with these protocols, they reduced how often portable devices needed to switch networks by 40 percent and improved throughput by 30 percent. The protocols bring about many other network improvements, but that’s not the story.
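The researchers’ actual protocols aren’t detailed here, but a minimal sketch can illustrate the core idea: use the phone’s motion state to decide whether a network switch is worth making at all. Everything in the snippet below, from the function name to the hysteresis thresholds, is an assumption for illustration, not the MIT design.

```python
# Illustrative sketch (not the MIT protocols): a handoff policy that uses
# the device's motion state to avoid needless network switches.
# All names and thresholds here are assumptions for demonstration.

MOVING_HYSTERESIS_DB = 10.0  # demand a big signal gain before switching while moving
STATIC_HYSTERESIS_DB = 3.0   # a stationary device can switch more eagerly

def should_handoff(current_rssi_db: float, candidate_rssi_db: float,
                   is_moving: bool) -> bool:
    """Switch networks only when the candidate is clearly better.

    While the device is moving, signal readings fluctuate, so we require
    a larger margin; this damps the ping-ponging between access points
    that produces the "hang" described above.
    """
    margin = MOVING_HYSTERESIS_DB if is_moving else STATIC_HYSTERESIS_DB
    return candidate_rssi_db - current_rssi_db > margin

# A moving phone sees a candidate only 5 dB stronger: stay put.
print(should_handoff(-75.0, -70.0, is_moving=True))   # False
# The same readings on a stationary phone: worth the switch.
print(should_handoff(-75.0, -70.0, is_moving=False))  # True
```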

The Sensory Overload

The real story is how these MIT researchers (graduate student Lenin Ravindranath, Professor Hari Balakrishnan, Associate Professor Sam Madden and postdoctoral associate Calvin Newport, all of the Computer Science and Artificial Intelligence Laboratory) took data from mobile phone sensors such as GPS, accelerometers and gyroscopes and used it to solve a problem.

Balakrishnan jokes that the protocols came as a result of their own annoyances with the network problems, but he’s hopeful these protocols are going to be widely adopted by others.

To me, this use of sensors to build an application that solves a common problem offers a futuristic view of what mobile apps could do, and in the process, it could bring about a higher level of engagement. Sure, there are some apps, like certain gaming apps on the iPad, that leverage the sensors on the device, but most apps today are still nowhere close to capitalizing on the capabilities of these devices.

So far, apps that use single-sensor inputs, such as GPS, the microphone or the camera, have generated tons of excitement. Now imagine many of these (and other) sensors working in tandem, and the experiences that could be created on top of this sensor mash-up.

In his research role, Balakrishnan had been involved in the Pothole Project, which used sensor data from mobile phones to detect potholes across the Boston area and plot them on a map. That’s a clever use of mobile sensor data for building a web-based application. Now imagine taking that entire sensor input and making it part of an app experience.
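A rough sketch of how such a system might work (the threshold, data format and function names below are assumptions, not the project’s actual code): flag a pothole whenever the vertical accelerometer reading spikes, and tag the event with the current GPS fix.

```python
# Toy pothole detector in the spirit of the project described above.
# The threshold and data format are illustrative assumptions.

from typing import NamedTuple

class Reading(NamedTuple):
    z_accel_g: float  # vertical acceleration, in g
    lat: float
    lon: float

SPIKE_THRESHOLD_G = 2.5  # assumed: a jolt well above normal road vibration

def detect_potholes(readings: list[Reading]) -> list[tuple[float, float]]:
    """Return the GPS coordinates of readings whose vertical jolt
    exceeded the spike threshold."""
    return [(r.lat, r.lon) for r in readings
            if abs(r.z_accel_g) > SPIKE_THRESHOLD_G]

samples = [
    Reading(1.0, 42.3601, -71.0589),  # smooth road
    Reading(3.2, 42.3612, -71.0570),  # hard jolt: likely pothole
    Reading(0.9, 42.3620, -71.0555),
]
print(detect_potholes(samples))  # [(42.3612, -71.057)]
```

Plotted on a map, those coordinates become the web application; fed back into the app itself, they could, say, warn a driver approaching a known pothole.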

Philippe Kahn, a veteran entrepreneur and co-founder of MotionX, described this sensor-enriched environment: “The motion-aware mobile platform is the new media.” His company uses variants of the principles articulated by Balakrishnan in apps such as MotionX-GPS and MotionX-GPS Drive.

Motion Magic

Balakrishnan sees no reason other applications couldn’t be built to decipher our common motions (walking, sitting, commuting) by combining data from various sensors. This activity layer, built on top of the sensors, can provide much-needed context and, in the process, make apps more engaging and give them a touch of serendipity.
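A toy version of that activity layer might classify a window of accelerometer samples with simple statistics, as in the sketch below. Real systems would fuse several sensors and use trained classifiers; the thresholds and labels here are illustrative assumptions only.

```python
# Toy activity classifier: label a window of accelerometer magnitudes
# (in g) as sitting, walking or commuting. The thresholds are assumed
# for illustration; a real system would use a trained model.

from statistics import mean, stdev

def classify_activity(magnitudes: list[float]) -> str:
    """Guess the user's activity from one window of accelerometer samples."""
    avg, spread = mean(magnitudes), stdev(magnitudes)
    if spread < 0.05:
        return "sitting"    # nearly motionless
    if spread < 0.4 and 0.9 < avg < 1.3:
        return "walking"    # rhythmic, moderate motion around 1 g
    return "commuting"      # sustained, irregular motion, e.g., in a vehicle

print(classify_activity([1.00, 1.01, 1.00, 0.99]))        # sitting
print(classify_activity([0.80, 1.30, 0.90, 1.25, 1.00]))  # walking
print(classify_activity([0.50, 1.80, 1.20, 0.30, 2.00]))  # commuting
```

An app could then condition its behavior on the returned label, deferring notifications while the user is commuting, for instance.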

Jeff Jonas, an IBM researcher and one of the keynote speakers at our Structure Big Data conference, often says that machines inside corporations need to understand the who-where-what-when-and-why in order to get a better grip on the explosion of data and benefit from it. iPhones and iPads are no different.

The mobile phone is not made for textual interactions; instead, it is a device with visual and contextual capabilities much like our own. To realize those capabilities, app developers need to think differently and use sensor data as a core building block of the overall user experience, just as they do with data from the social graph.

When we think of mobile phones, we need to stop thinking of them as computer-like devices and instead think of them as extensions of ourselves. The machine in our hands needs to understand what’s happening in our lives and factor those inputs into the experiences it creates. In a post last year, I asked: can mobile phones think?

If they don’t, they will soon become tools of interruption, and thus annoyance, much like the irritation the MIT researchers felt when bad network handoffs kept them from getting their email.
