Summary:

For a very long time, the technology industry’s future has been determined by the capabilities of tech’s building blocks – chips, memory, storage and networks. With the emergence of social and mobile, it seems technology’s future will be defined by we the people.

Turns out it was OK for me to be unwell last week. It gave me enough time to ponder some of the major stories of the moment without being compelled to write about them. Whether it was Amazon’s outage, Sony’s network breach or the drama around Apple’s location data collection policies (or lack thereof) — the hue and cry was quite astonishing. I mean, even South Park and Stephen Colbert had to weigh in on Apple’s location problems!

Not that it is the first time privacy, security and reliability have been the subject of hot debate. Skype outages, Gmail failures, Facebook going on the blink, and the brouhaha around Facebook’s dreaded Beacon project are some of the media explosions that crossed over from the realm of inward-looking tech media to mainstream media outlets. And I am pretty sure we will keep encountering more of these issues.

Stepping back, when you look at all these instances, you see that the common thread was “we the people.” Our fears, our desires and our needs were behind the huge outcry around these problems that seemed to impact millions of us. This mainstreaming of technology has opened up new opportunities, but it has posed, and will continue to pose, a brand-new kind of challenge to companies in Silicon Valley.

If the hue and cry over Apple’s location data collection methodologies is any indication, then are we the people becoming the limiting factor in the evolution of technology and its adoption? Will the idea of what computing can do and what it will be in the future be limited by our collective ability to grok these changes? I mean, things aren’t exactly getting less complicated. Apple CEO Steve Jobs put it well when he said:

…as new technology comes into the society there is a period of adjustment and education. We haven’t as an industry done a very good job educating people, I think, as to some of the more subtle things going on here. As such, (people) jumped to a lot of wrong conclusions in the last week. I think the right time to educate people is when there is no problem. I think we will probably ask ourselves how we can do some of that, as an industry.

Jobs’ comments, in fact, encapsulate the bigger issues at play.

The Tech Side Story

For the longest time, the future of technology was determined by the building blocks that go into devices. Processors, memory and storage defined how software was written during the PC era. Of course, since the big buyers of PCs were large companies, the average consumer didn’t have much say in the matter, though ironically the productivity-driven PC revolution was labeled the personal computer revolution.

The early days of the commercial Internet carried this grand tradition — ignoring the actual user of a product in favor of the commercial desires of a service provider. You, mom and I had little or no say in how the technologies were used to build a service. However, as the networks spread, the early years of the new century saw a massive consumerization of technology.

Corporations stopped buying, but consumers didn’t. Broadband connections grew. Sales of iPods and computers shot up and we all wanted digital cameras, video cameras and TiVos. As a result, the balance of power shifted away from companies and technology started to become more personal. However, the first five years of the 21st century were all about objects and how to make them simpler.

With the arrival of the social web, companies like Flickr, Plaxo and LinkedIn showed that it was time to think differently about how technologies (web applications) were built, used and, more importantly, adopted. The emergence of Facebook and Twitter has only amplified that effect.

The social web is about connecting people. On some networks those are real people (Facebook), and on some the web connects assumed identities (Twitter). If CPU, memory and storage defined the capabilities of the PC era, then in the Internet era we saw software being defined by processors, memory, storage and bandwidth.

The Internet era eased the way to the social web, where software – and with it the frontiers of technology – is being defined by the marriage of network connectivity, PC-era staples and social identity.

There isn’t really social software, social media or even social networks. What social represents is a new way of thinking about something as old as us humans: relationships. Think of it this way – the social web mimics the way we are in the real world.

Friends, families, tribes and teams that communicate, collaborate, consume and create together. There is no client, no server, just us. The CPU, memory, storage and the network are mere enablers. In this new kind of social web, the defining characteristic is us.

So when Facebook goes down, we cry bloody murder. If Facebook launches Beacon, or tweaks how stuff shows up in our news streams, or if we suddenly become shills to our friends – we all stomp our collective feet on the ground till it starts to shake. In other words, it is not about what Facebook can build or how it can use its technology resources. Instead, the limiting factor for Facebook is how its 600 million (and growing) people adapt to the changes.

My Mobile, My Way

Over the past four years, in parallel to this social web arranged around people and their networks, we have seen the emergence of a new kind of mobile Internet. Smartphones mean that technology that was once the domain of the office is now a constant accompaniment. The mobile phones of today might have the innards of a PC, but they are not really computers. They are able to sense things; they react to touch, sound and location. Mobile phones are not computers, but they are an extension of us.

With this revolution, it has become easier to share our moments and other details of our lives that have so far been less exposed. The sharing of location data becomes a cause for concern because it is the unknown. The situation is only going to get more complicated – we are, after all, entering a brave new world of sensor-driven mobile experiences, as I wrote in an earlier newsletter. No, this is not science fiction stuff.

Today, I read about State Farm, the insurance company, launching a new app that uses the iPhone’s built-in accelerometer in tandem with GPS-based location data to measure your ability to perform three major tasks when driving: accelerating, braking and cornering. You get a score at the end of the journey.
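
To make the mechanics concrete, here is a minimal sketch of how such a trip-scoring app might work under the hood. Everything in it is an assumption for illustration – the type names, the g-force thresholds and the scoring formula are hypothetical, not State Farm’s actual method – but it shows the basic idea: sample the accelerometer while GPS says the car is moving, count “harsh” events, and reduce them to a score.

```swift
import Foundation

// One sensor reading: accelerations in g (hypothetically derived from the
// phone's accelerometer) plus GPS speed in meters per second.
struct DrivingSample {
    let longitudinalG: Double  // positive = accelerating, negative = braking
    let lateralG: Double       // side-to-side force while cornering
    let speed: Double          // from GPS
}

// Tally of "harsh" events over one trip, reduced to a 0-100 score.
struct TripScore {
    var harshAccelerations = 0
    var harshBrakes = 0
    var harshCorners = 0

    // Toy formula: start at 100 and dock points per event.
    var score: Int {
        max(0, 100 - 5 * harshAccelerations - 5 * harshBrakes - 3 * harshCorners)
    }
}

// Hypothetical thresholds in g; a real insurer would tune these carefully.
let accelThreshold = 0.30
let brakeThreshold = -0.35
let cornerThreshold = 0.40

func scoreTrip(_ samples: [DrivingSample]) -> TripScore {
    var result = TripScore()
    // Ignore near-stationary samples so sensor noise at a stoplight
    // doesn't register as harsh driving.
    for s in samples where s.speed > 2 {
        if s.longitudinalG > accelThreshold { result.harshAccelerations += 1 }
        if s.longitudinalG < brakeThreshold { result.harshBrakes += 1 }
        if abs(s.lateralG) > cornerThreshold { result.harshCorners += 1 }
    }
    return result
}

// A short synthetic trip: one hard brake and one sharp corner.
let trip = [
    DrivingSample(longitudinalG: 0.10, lateralG: 0.05, speed: 12),
    DrivingSample(longitudinalG: -0.45, lateralG: 0.02, speed: 14),
    DrivingSample(longitudinalG: 0.05, lateralG: 0.50, speed: 10),
]
print("Score: \(scoreTrip(trip).score)")  // prints "Score: 92"
```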

Of course, for now the results are private to the driver, but what if an insurance company started to keep records of your driving and decided your insurance rates based on your performance? That is great news if you are a good driver, not so good if you are a horrible one. In other words, the perceived scariness is going to define how we adopt and adapt to this and more such technologies.

While Jobs’ idea of offering better explanations of complex technologies is a good step forward, companies also have to start thinking about the human aspect of their core products. In addition to the core building blocks, a product of tomorrow needs to know its human limits or its human capabilities.

Photo courtesy of Apple via its new iPad 2 clip. 

Comments

  1. Apple isn’t the best company to offer better explanations of complex technologies, since it’s a very secretive and closed company.

    1. Tim,

      I don’t think this is just Apple’s issue. It is an issue for the tech industry at large. Not to defend Apple here, but they do tend to hide the complex and make the complex very digestible. And as the current brouhaha shows, even they get it wrong.

  2. We need education and choice. Education so we can make a choice; choice so we can decide for ourselves what the benefit is and what cost is associated with a decision.
    And no, turning the GPS off if one declines to send data is not choice. Getting delayed data and a worse approximation if I decline is. It’s not up to a programmer to decide what’s best for me; that’s PC thinking which comes out of the corporate world.
    Education is also necessary to decide if I want the advanced capabilities of new software. Context is all about the sensors and data integration; it should be able to provide me help and new experiences. It is up to me to decide when and how much – again, it’s not some boolean choice made by some know-it-all programmer.

    I personally took up Android programming since all the programs out there do not provide the integration I want. Then one finds Context as a content object. Well, that’s better than Wave’s enum. But…

    1. Well said, Ronald. And herein lies the challenge: education, choice and simplicity are quite hard concepts to grok for an industry that thinks in terms of components.

  3. At the end of the day, it actually boils down to the company proactively taking the most ethical stand in terms of protecting their consumers.

    It would be crazy to assume that the consumer should always protect themselves; after all, they pay us for our products.

    Hence, more than human limits and human capabilities, it is about the product understanding its expectations as a human.

    This becomes ever more relevant as we see more and more robots entering the product arena.

  4. Stopped reading as soon as I saw the term ‘unwell’ being used in an apparently serious manner.

    1. What is wrong with the word “unwell”? It has been around since the 1450s, and it is perfectly fine. Unless you are objecting that a male writer is using a word originally defined for “the period of the month when all childbearing-age women suffer major discomfort.” (Mind you, that amused me the most when Om used that term… I just couldn’t imagine it being “OK.” *chuckle*)

      Anyways, in terms of the article, I pretty much agree. We jump so quickly to new features and new technologies that rarely do enough people actively stop and ask, “What other impact could this have on me?” Sadly, it doesn’t take a PhD to see most of the other impacts to privacy, etc. You just have two groups: the masses that blindly don’t care or don’t want to care, and the extremists that jump on every new thing as being “evil.”

      When the whole “your iPhone is tracking you” articles started appearing on Mac Observer, I pulled up the data, evaluated it, and pointed out that the data in question had to be cell phone towers, as it was too regular and consistent to be GPS log paths. And I was ignored, as people wanted to jump on the “***k Apple” bandwagon. When I also pointed out Apple had been doing this for Wi-Fi for years, people continued to ignore me.

      *shrug* I think the major pink elephant in the room is that people don’t care to be educated. They’d rather the facts fit their own view of the universe, and any facts that don’t are discarded or marginalized until they are sitting on top of their chest threatening to suffocate them.

  5. The problem is not education, and it is not too much or too little choice. The problem is a much more basic failing of human cognition.

    A study was done on how we assess risk, in which people were presented with a “revolutionary new source of energy.” The risks presented were actually the risk assessment for natural gas. Over 70% of respondents said that the new energy source was too dangerous to be widely deployed.

    Risks that are close to home are exaggerated, while distant ones are discounted. Risks associated with known quantities (natural gas) are easier to tolerate than unknowns (a new energy source).

    Location risks are discounted until “OMG, you mean MY phone?” You cannot educate an indifferent audience, and you cannot ever expect a corporation to behave ethically in these matters. The best you can possibly do is to ensure that the resources needed to understand the technology are available for those who are not indifferent, and that the companies operate transparently enough to identify malfeasance. We do a lousy job of both.

    It’s not that they are inherently evil, but again, corporate structures play into the gaps in human cognition, encouraging otherwise good people to make unethical decisions because “it’s business,” or because “my boss told me to,” and a host of other responses that distance leaders and rank and file from the real ethical and personal consequences of their choices. We have all been deeply indoctrinated with the idea that Facebook is evil, or Blackwater is evil, but that the people in them are not.

    This disconnection between the people that comprise a company and the company itself is both dangerous and inappropriate. Evil, or to be less inflammatory, unethical companies are the products of unethical leaders – full stop. Leaders that institute unethical policies are acting in an unethical manner, and are thus unethical themselves. Companies that behave in a criminal manner are – by definition – led by criminals. The problem being, there are few levers that can move a corporation, and the corporate veil assures that the leaders – in most cases, the men who ordered those policies – cannot be called to account for their actions.

    Facebook/Google/Whomever can never be relied upon to behave in an ethical or forthright manner, simply because the interests of the company, its leadership (responsible to their employees) and the employees (who need this job) run counter to maintaining the highest possible standards of privacy. Strong ethical stands by leadership or individuals are possible, and can keep a company from going too far astray, but again, cannot be relied upon. Even stand-up guys like Brin and Page can be led astray. There are now over 30,000 people marching behind their banners – with whom does their ethical responsibility lie? Their customers, or their employees?

  6. The real problem is that it keeps surprising the technology industry that there is such a thing as the human factor. That after a period of “hard innovation” there is a phase of “soft innovation” in which people/society adopts and morphs the technology to their needs. Or rejects it.

    In scientific terms it’s the schism between technological determinism and social constructivism. In philosophical terms it’s the positivist mindset of technologists that hasn’t caught up to post-modernity.

    We need more Jan Chipchases (Nokia), more Genevieve Bells (Intel), more Paul Dourishes (University of California). Anthropologists who understand this, and who help bust the naive myths and ideologies that pervade this industry.
