For the past week or so, our little corner of the Internet has been abuzz with news of mobile apps uploading iPhone address books without asking us — the iPhone (and address book) owners. It all started with Path, a much-talked-about iPhone app that offers a very limited private social network. A programmer/blogger in Singapore discovered that the San Francisco startup, co-founded by Dave Morin, an early Facebook employee, was uploading address book data to its servers and storing it there.
In the wake of the ensuing outrage, the company changed its policies and is now asking for explicit permission. It also issued an apology and deleted the data uploaded to its servers. As hours (and then days) went by, it became obvious that Path wasn't the only company or mobile app indulging in this behavior. Other apps, such as the photo-sharing service Instagram, quickly changed their policies and issued software updates.
The whole issue seems to have boiled over with Nick Bilton's piece in the New York Times — So many apologies, so much data mining. Today, we have seen a lot of discussion, including some, like angel investor Chris Dixon, wondering why Apple is allowing this behavior instead of forcing apps to ask for permission. These are all legitimate arguments and counterarguments. However, the most important question is: what do we learn from all this, and where do we go from here? What is the question we should be asking ourselves?
Do The Right Thing
Apologies, outrage, anger and blaming Apple for not being stricter all overlook one of the most important aspects — the moral imperative. As I sit here watching the white-hot arguments, I wonder why we aren't talking about doing the right thing. Why do I bring this up? Today's apps are inherently more social and thus, by extension, more human. The relationships on this social web are becoming less virtual and more real. In a sense, these apps have started to reflect our daily lives. As many have said before, we are the social web and the social web is us.
Dave Winer, in his post How industries react to crisis, writes:
It’s time to make this change in tech, once and for all. Your products are not toys, they are used seriously by real people. You need to show respect for your product, and that means respect for your users.
Our daily lives have many layers of trust built into them. There is an implicit social contract that underpins that trust. Doing business with your bank, dry cleaner, greengrocer and coffee shop is built on that trust. We are friends with people we trust. We work with people we trust. And that trust is what drives us to do the right thing. As my colleague Mathew Ingram wrote in his piece, Lessons from Path and Pinterest:
The lesson here is that for social apps, the trust of users is paramount, and the best way to maintain that trust is to be as open as possible about everything that is occurring, particularly if it involves a user’s personal data. Whatever you are doing with it may not seem like a big deal to you, but better to be open about it than have it revealed by someone else, at which point you look sneaky. As Craigslist founder Craig Newmark has put it, “Trust is the new black,” and it never goes out of style.
Social apps of today need to understand this concept of trust and doing the right thing. Just because Apple doesn't impose a "permission" requirement on apps doesn't mean apps should skip asking. Just look at Marco Arment's Instapaper. It uses address book access on the iPhone in a manner that respects his customers and their privacy, while still balancing the needs of his growing business. Here is what he wrote:
When implementing these features, I felt like iOS had given me far too much access to Address Book without forcing a user prompt. It felt a bit dirty. Even though I was only accessing the data when a customer explicitly asked me to, I wanted to look at only what I needed to and get out of there as quickly as possible. I never even considered storing the data server-side or looking at more than I needed to. This, apparently, is not a common implementation courtesy.
Could he have grown faster had he followed the herd? Perhaps. What I am saying is that as the web becomes more social, we need to think about "people" first when designing software. Instead of asking for forgiveness, app makers have to start by being explicit about what they do or don't do.
As we look into the future, the web of today is not that of early adopters. It includes moms, grandmoms and others who may not be savvy about permissions and privacy. And that is why the imperative is on app developers to do the right thing. Is it hard? Perhaps. Will it slow down the growth of some of these companies? Maybe, maybe not — we won't know until someone tries it. However, building trust and being upfront about how apps work and how our data is consumed, how it relates to us and how it impacts us, can't be a bad thing.