Much of last week’s buzz surrounding the launch of Color was justifiably skeptical. The startup, after all, raised $41 million to enter a crowded space without a business model or customers, and many wonder whether the world really needs another mobile photo-sharing app. But two components of Color’s vision — implicit networks (connections created without user effort) and place/time tagging — extend far beyond photo-sharing, and make the company worth watching as a potential indicator of social media and data-mining trends.
The Color app for iPhone and Android lets users share photos in real time with other nearby photo-snappers. The sharing network is determined by proximity rather than by users explicitly specifying who their friends are. Users are anonymous and all content is public.
Early reviews are pretty negative. Om writes that Color is attracting more attention from pundits than users because the app may not deliver obvious fun or utility. Mathew Ingram wonders if the big funding bet is on Color’s all-star team — which includes Bill Nguyen (Lala), Peter Pham (BillShrink) and former LinkedIn chief scientist DJ Patil — rather than its product or ideas.
But some of those ideas matter.
Angel investor and Hunch co-founder Chris Dixon says he’s intrigued by Color because it is pushing the envelope on implicit social graphs. Color’s implicit networks aren’t specified by users, but rather are based on underlying contexts like geography or shared interests. I’ve written before about context-based social networks, and how Facebook Groups is struggling to deliver them. Peter Yared, a VP at WebTrends, writes that Facebook is also experimenting with implicit networks of friends.
If Color builds on its implicit network concept, it could deliver instant groups of friends for different occasions or interests, and expose recommendations based on common tastes. Marketers could target advertising or offers within a Color network to real-time groups around an event or location, or to groups defined by shared interests.
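To make the idea concrete, here is a toy sketch of a proximity-based implicit network — my own illustration, not Color’s actual algorithm, which the company has not published. Users whose photos were taken within a chain of short distances of one another fall into the same ad-hoc group, with no friend lists involved:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def implicit_groups(users, radius_km=0.15):
    """Cluster (name, lat, lon) tuples into ad-hoc groups: two users share
    a group if a chain of users, each within radius_km of the next,
    connects them. No explicit friending required."""
    groups = []
    for user in users:
        name, lat, lon = user
        # Find every existing group this user is close to...
        near = [g for g in groups
                if any(haversine_km(lat, lon, glat, glon) <= radius_km
                       for _, glat, glon in g)]
        # ...and merge them all, with the new user, into one group.
        merged = [user]
        for g in near:
            merged.extend(g)
            groups.remove(g)
        groups.append(merged)
    return groups

users = [
    ("alice", 37.7750, -122.4183),  # same street corner as bob
    ("bob",   37.7751, -122.4184),
    ("carol", 40.7128, -74.0060),   # a different city entirely
]
groups = implicit_groups(users)  # alice+bob together, carol alone
```

A real system would of course weight recency, shared interests and repeated co-location, not just raw distance, but the core mechanic — membership inferred from context rather than declared — is the same.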
Place and Time Data
Search pundit John Battelle goes a little overboard on how Color could push augmented reality. But he’s right about the importance of geo-tagged data. In a presentation last week at GigaOM’s Structure Big Data 2011 conference, IBM Distinguished Engineer Jeff Jonas showed how adding place and time to data objects can power big data analysis, predicting a person’s likelihood of being at a given location with astounding accuracy, and assisting in identity management. Again, if Color is a leader in gathering this data, it could build out a powerful — yet still privacy-protected — targeted advertising network.
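Why place-and-time tagging is so predictive can be shown with a toy frequency model — purely an illustration of the idea, not Jonas’s method: tag every observation of a person with a place and an hour, then predict the place they most often occupy at that hour.

```python
from collections import Counter, defaultdict

class PlaceTimeModel:
    """Toy predictor: given a history of (place, hour) observations for
    one person, estimate where they are most likely to be at a given
    hour of the day."""

    def __init__(self):
        # hour of day -> Counter of places seen at that hour
        self.by_hour = defaultdict(Counter)

    def observe(self, place, hour):
        """Record one sighting of the person at `place` during `hour`."""
        self.by_hour[hour][place] += 1

    def predict(self, hour):
        """Return (most likely place, empirical probability) for `hour`,
        or (None, 0.0) if there is no history for that hour."""
        counts = self.by_hour[hour]
        if not counts:
            return None, 0.0
        place, n = counts.most_common(1)[0]
        return place, n / sum(counts.values())

model = PlaceTimeModel()
for _ in range(3):
    model.observe("office", 9)   # seen at the office most mornings
model.observe("cafe", 9)         # occasionally at the cafe instead
place, prob = model.predict(9)   # -> ("office", 0.75)
```

Even this crude counting scheme hints at the point: once data objects carry place and time, simple statistics start to yield strong behavioral predictions — which is exactly why the privacy question matters.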
Business Model to Come?
Color chief Nguyen says the company is really about data-mining rather than photo-sharing. He says combining place and time data with implicit networks can help services or marketers parse the difference between entertainment and work activities. That information will affect the elasticity of Color’s networks — how broadly it expands or contracts its sharing range — and power its algorithms for ranking photos and, presumably, other content or advertising elements.
Nguyen also talks about a future news API that could spawn a curated news app for journalists. He describes a pretty dumb restaurant service that would help waitstaff know customers’ first names and interests. Before he sold Lala to Apple, reportedly for $85 million, Nguyen took the service through at least three different business models. Lala started as a CD trading service, morphed to a digital music locker, and then offered Web songs with perpetual streaming rights for ten cents each. With its talent and cash hoard, there’s no doubt Color will evolve as well.