Does Crowdsourcing Commoditize Freelance Expertise?

Instead of hitting up friends and family, ask a crowd.

As crowdsourcing — the act of taking a job traditionally performed by a designated agent and outsourcing it to a large group of people — goes mainstream, there’s a heated debate going on about whether the practice commoditizes expertise. The most recent crowdsourcing trend is sites that curate crowds of freelance experts to attack complex tasks that require specialized knowledge. For example, uTest brings together software-testing experts, Local Motors works with car designers, InnoCentive matches scientists to research efforts and a number of companies organize graphic designers.

As a freelance expert, should you jump in, or are you ringing your own industry’s death knell?

Having spent the past year building an expert crowdsourcing site, I’m convinced these sites don’t have to commoditize expertise. Quite the opposite: I think they can reward expertise directly.

Freelance experts’ hourly rate typically includes three basic transaction costs: customer acquisition, customer retention and work performed. Take freelance graphic designers, for example. First they need to find customers, which requires sales and marketing investment and time. Once they have customers, they have to maintain them, whether it’s answering the phone, reporting on progress or dealing with billing and collections. Sometimes very little of a freelancers’ time is devoted to doing real design work. Typically, these people became freelancers because they love design, not because they love managing overhead.

The new expert crowdsourcing sites eliminate most of this overhead by bringing customers and experts together and automating service, support and billing. The freelancers perform work and get paid directly for their expertise. While the hourly earnings may appear lower than traditional freelance work, once the transaction costs of freelancing are removed, the effective hourly rate can be about the same. The upside is that the time freelancers once spent finding and managing clients can now be applied to actually doing the work they love.
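The effective-rate argument is easy to check with back-of-the-envelope arithmetic. A minimal sketch, using purely hypothetical numbers (none of the rates or hours below come from the article):

```python
# Hypothetical comparison of effective hourly rates for a freelancer.
# All dollar figures and hours are illustrative assumptions.

def effective_rate(hourly_fee, billable_hours, overhead_hours):
    """Earnings spread across ALL hours worked, billable or not."""
    total_hours = billable_hours + overhead_hours
    return hourly_fee * billable_hours / total_hours

# Solo freelancer: $60/hr billed, but 15 of 40 weekly hours go to
# sales, marketing and billing rather than paid design work.
solo = effective_rate(60, billable_hours=25, overhead_hours=15)

# Crowdsourced work: a lower $40/hr posted rate, but nearly every
# hour is billable because the platform handles clients and billing.
crowd = effective_rate(40, billable_hours=38, overhead_hours=2)

print(f"solo:  ${solo:.2f}/hr effective")   # $37.50/hr
print(f"crowd: ${crowd:.2f}/hr effective")  # $38.00/hr
```

Under these assumed numbers, a noticeably lower posted rate nets out to roughly the same effective pay once the overhead hours disappear.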

The critical issue for freelancers is to find sites that allow them to earn more for their work than they could on their own. Some crowdsourcing sites are approaching this point, especially those that offer a collaborative compensation model. One such example is uTest, where top software testers are earning as much as $5,000 per month — many while working primarily during evenings and weekends. A number of uTest testers from the U.S., the UK, Russia and India report that they currently earn more from uTest than from their full-time testing jobs.

The collaborative model is an evolution of crowdsourcing incentive systems. In the early days of crowdsourcing, companies issued requests for submissions. Potentially, hundreds of participants could respond, but only the one or two “winners” collected any prize money. Considering all the work the “losers” did, this model heavily favored the project sponsor over the people doing the work. More recently, crowdsourcing companies have adjusted to models that pick multiple winners (with payment distributed among them) or have moved to collaborative models in which each contributor to a final result is compensated.

When uTest’s testing experts work on a new software release, they each get paid for every bug they find. Some find many bugs, some only a few. Chances are high that most earn something, but those who do excellent work earn more. uTest is built upon a meritocratic reputation system: Testers who are rated highly by customers get invited to more projects and get paid more for their work. Conversely, testers who don’t satisfy customers earn a poor reputation and don’t get invited to future projects. True expertise is rewarded.
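The incentive structure can be sketched as a toy payout model. This is an illustrative assumption about how pay-per-bug and reputation could interact, not uTest’s actual algorithm; the base rate, rating scale and multiplier range are all invented for the example:

```python
# Toy model: pay-per-bug scaled by a customer-rating multiplier.
# NOT uTest's real payout formula -- every number here is assumed.

BASE_PAYOUT = 20.0  # assumed dollars per valid bug

def payout(bugs_found, avg_customer_rating):
    """Each valid bug earns the base rate, scaled by reputation.
    avg_customer_rating is assumed to be on a 0-5 scale, mapping
    to a multiplier between 0.5x and 1.5x of the base payout."""
    multiplier = 0.5 + avg_customer_rating / 5
    return bugs_found * BASE_PAYOUT * multiplier

# Two testers who find the same 10 bugs earn differently:
top_rated = payout(bugs_found=10, avg_customer_rating=4.8)   # 292.0
low_rated = payout(bugs_found=10, avg_customer_rating=2.0)   # 180.0
```

The point of the sketch is that everyone who contributes earns something, while reputation widens the gap in favor of the strongest performers — the meritocratic property described above.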

The per-bug payout rates uTest experts earn can rival the effective rate they would make freelancing on their own. In this model, project sponsors win as well because they get exactly what they want: a comprehensive and collaborative testing result from many participants, and a payment system based solely on performance. With software testing, “more is better” applies, and uTest’s ability to provide multiple test engineers yields better coverage than any one individual could.

While it’s moving fast, the crowdsourcing industry is still young and the underlying models are evolving quickly. The industry is stratifying into two distinct types of crowds: curated crowds of experts and general crowds that enjoy participating. In the participation model, the crowd may accept limited or no monetary compensation; participating is reward enough. For expert crowdsourcing sites, the long-term sustainability test is simple: “Can a freelance expert make a better living as part of my crowd than on their own?”

I believe that expert crowdsourcing does not by definition commoditize expertise. In fact, if done right, it focuses everyone on what matters most: the results of expertise.

Niel Robertson is a three-time entrepreneur and CEO of Trada Paid Search, a crowdsourced paid search marketplace. You can find Niel on Twitter at @nielr1. He will be talking about “The Human Cloud: The Elastic Workforce in the Enterprise,” at our Net:Work conference in San Francisco on Dec. 9.
