On-demand expertise is a large part of the future of work, but acquiring that expertise can be a complicated process. Organizations can overcome these historical challenges by using services such as freelance marketplaces, crowdsourcing platforms, expert communities, and professional social networks. These services offer different approaches to accessing expertise in domains ranging from data science and design to CAD and copywriting.
All of these services face the same challenge: Clients need to quickly and efficiently come to trust people they have never met. Reputation systems have emerged to enable this. They serve as lower-cost alternatives to slower, trusted evaluation processes like interviews, reference checks, and trial projects, presenting clients with signals familiar from other service transactions: ratings, reviews, and scores.
Developing reputation systems is challenging. Although LinkedIn leads in finding and connecting with professionals, the task of evaluation still falls to clients. But an expanding universe of smaller marketplaces, crowds, and communities provides not only connections but also evaluation in the form of reputation systems. Marketplaces like oDesk and Work Market offer access to freelancers, and they report expert ratings and reviews. Expert communities like Stack Overflow and Dribbble offer access to internal rankings of community members. Crowdsourcing platforms like uTest and Trada enable clients to delegate tasks to crowds, removing the need to evaluate individuals at all.
All of these organizations approach reputation in different ways, primarily because they have access to different data about the experts they report on. Some can closely observe work that they are helping to manage. Others must interpret signals from social networks to determine how well those signals correlate with expertise. As a result, some reputation systems offer reliable predictors of performance, while others cannot be relied on as the sole basis for decisions.
Key points in this report are:
- Quantified work yields trusted reputation systems. The best reputation systems win the trust of experts and clients. And the trusted systems have one thing in common: They are able to measure work deliverables or responses to work deliverables, as well as intermediate work-related activities.
- Social signals matter more for teams. In many cases experts are sought to complete tasks with little or no interaction with clients. But when experts are sought to work closely with internal teams, especially on open-ended or less-structured tasks, social signals like the number of peers or types of interactions offer important clues to how an outside expert might fit with a team.
- Marketplaces and crowds lead in quantifying work. Marketplaces and expert crowds closely observe how experts work, and they also evaluate the results of their work. Experts interact with complex platforms that do everything from scoring submitted work to measuring levels of activity among clients and experts.
- Expert communities are quantifying aspects of work too. Expert communities give detailed signals from other experts as they respond to product- or work-related activities. As they begin to connect clients to experts, they will also be able to access feedback from clients just like crowds and marketplaces. They also observe social interactions, which can have value in specific on-demand situations.
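To make the idea of quantified reputation concrete, here is a minimal sketch of one common scoring technique: a Bayesian average that blends an expert's observed client ratings with a platform-wide prior, so sparse review histories do not produce inflated scores. The function name, prior values, and example ratings are illustrative assumptions, not the method of any specific platform named in this report.

```python
def bayesian_rating(ratings, prior_mean=3.5, prior_weight=10):
    """Blend an expert's observed ratings with a platform-wide prior.

    ratings: list of numeric client ratings (e.g., 1-5 stars)
    prior_mean: assumed platform-wide average rating (illustrative)
    prior_weight: how many "virtual reviews" the prior counts for
    """
    n = len(ratings)
    return (prior_weight * prior_mean + sum(ratings)) / (prior_weight + n)

# A newcomer with a single 5-star review stays close to the prior,
# while a veteran's score converges on the observed average.
newcomer = bayesian_rating([5.0])        # pulled toward 3.5
veteran = bayesian_rating([4.8] * 200)   # close to 4.8
```

The design choice here mirrors the report's point: a score becomes trustworthy only as the volume of measured work grows, so a system that can observe many deliverables can let observed data dominate the prior.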
The image above maps how different services generate and source signals to develop expert reputations. Most of the services evaluate data generated on their own platforms, with the notable exception of scoring tools Klout and Gild, which harvest any available public signals.
In this report, we explore the different approaches to acquiring and analyzing the data that makes up these scores, particularly in software development and digital marketing.