
Summary:

Crowdsourcing is a great idea in theory, but it hasn’t really lived up to its potential, says Alegion CEO Nathaniel Gates. He thinks that, with some fundamental tweaks, Alegion can change all that for the better.

There’s man and there’s machine. And then there’s Alegion. The Austin, Texas-based startup launched at Amazon Web Services’ re:Invent conference in late November with a business model that aims to turn crowdsourced labor into the perfect blend of human-machine interaction.

Crowdsourcing hasn’t really taken off, Alegion Co-founder and CEO Nathaniel Gates told me, because it’s too complex to do correctly. This is especially true on Amazon’s Mechanical Turk platform, atop which Alegion has built its business. Conceptually, he thinks it’s amazing, but the answer to the question of “Why hasn’t it taken off, why hasn’t it revolutionized labor?” is that compared with the rest of the AWS platform, Mechanical Turk is a bear to use.
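To get a sense of that complexity, consider what posting even a single task looks like against the raw Mechanical Turk API. The sketch below uses boto3’s MTurk client (a present-day equivalent of the tooling requesters had to wire up themselves); the image-labeling task and all of its details are hypothetical.

```python
# Hypothetical example of posting one HIT through the raw Mechanical Turk
# API via boto3. Note everything a requester must hand-assemble up front:
# an XML question document, pricing, redundancy and timing windows.
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

# MTurk questions are submitted as XML. This HTMLQuestion wraps a
# bare-bones yes/no image-labeling form.
QUESTION_XML = """\
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <html><body>
      <form action="https://www.mturk.com/mturk/externalSubmit" method="post">
        <input type="hidden" name="assignmentId" value="">
        <p>Is there a dog in this image?</p>
        <img src="https://example.com/img-007.jpg">
        <p><input type="radio" name="answer" value="yes"> Yes
           <input type="radio" name="answer" value="no"> No</p>
        <p><input type="submit"></p>
      </form>
    </body></html>
  ]]></HTMLContent>
  <FrameHeight>450</FrameHeight>
</HTMLQuestion>
"""

hit = mturk.create_hit(
    Title="Label an image",
    Description="Answer one yes/no question about a single image",
    Keywords="image, labeling, quick",
    Reward="0.05",                    # US dollars, passed as a string
    MaxAssignments=3,                 # redundant judgments per item
    LifetimeInSeconds=24 * 3600,      # how long the HIT stays listed
    AssignmentDurationInSeconds=300,  # time a worker gets per assignment
    Question=QUESTION_XML,
)
print("Created HIT:", hit["HIT"]["HITId"])
```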

There’s a perception that the results will be inaccurate, and without knowing how to ensure accurate work, many companies just stay away. However, Gates said, the problem isn’t with the platform or the workers, but with the process. When companies don’t like the results of their crowdsourcing efforts on Mechanical Turk, he said, it’s because “they’re not asking the question correctly and they’re not using best practices to ensure accuracy.”

This is where Alegion thinks it can make a difference. The company works quality-control processes (such as “gold answers”) into the workflow to ensure that work is accurate, and provides a method for scoring workers so users can build a workforce that’s ideal for a given task. Alegion also lets users see all the data related to their jobs, Gates said, which means they can run their own analyses to figure out who worked best, what went wrong or anything else they care to discover.
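Gates didn’t describe the implementation, but the “gold answers” idea itself is easy to sketch: mix items with known answers into a job, score each worker against them, then weight or filter everyone’s remaining judgments by that score. The Python below is a minimal illustration under those assumptions, not Alegion’s actual code; all names, data and thresholds are made up.

```python
# Minimal sketch of the "gold answers" pattern: known-answer items are
# secretly mixed into a job, workers are scored against them, and only
# sufficiently accurate workers count toward each item's final answer.
from collections import Counter, defaultdict

GOLD = {"img-001": "yes", "img-042": "no"}  # items with known answers

def score_workers(judgments):
    """judgments: iterable of (worker_id, item_id, answer) tuples.
    Returns each worker's accuracy on the gold items they answered."""
    correct, seen = defaultdict(int), defaultdict(int)
    for worker, item, answer in judgments:
        if item in GOLD:
            seen[worker] += 1
            correct[worker] += (answer == GOLD[item])
    return {w: correct[w] / seen[w] for w in seen}

def weighted_answer(judgments, item_id, scores, min_score=0.7):
    """Majority vote on one item, counting only trusted workers.
    Workers with no gold-item history are excluded by the filter."""
    votes = Counter(
        answer
        for worker, item, answer in judgments
        if item == item_id and scores.get(worker, 0.0) >= min_score
    )
    return votes.most_common(1)[0][0] if votes else None

judgments = [
    ("w1", "img-001", "yes"), ("w1", "img-042", "no"), ("w1", "img-007", "yes"),
    ("w2", "img-001", "no"),  ("w2", "img-042", "no"), ("w2", "img-007", "no"),
]
scores = score_workers(judgments)                     # {'w1': 1.0, 'w2': 0.5}
print(weighted_answer(judgments, "img-007", scores))  # 'yes' (w2 filtered out)
```

Because every judgment and every gold-item score is retained, the kind of after-the-fact analysis Gates describes, figuring out who worked best or what went wrong, falls out of the same data.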

However, anyone paying attention to crowdsourcing has probably heard much of this before. In 2011, we covered a company called CrowdControl (now called CrowdComputing Systems) that provides a worker-management platform and uses artificial-intelligence methods to ensure workers are performing their tasks accurately. And then there are companies such as CrowdFlower and uTest, which have built their own user-friendly crowdsourcing platforms and appear to be doing a fine business.

What really struck me about Gates’s description of Alegion, though, is its focus on treating the people behind Mechanical Turk as people rather than as fleshy computing resources (something I might have suggested in the past). Although there’s no corporate relationship via the HR department, there’s still an interpersonal relationship that can affect morale and, therefore, quality and efficiency.

That’s why Alegion’s platform lets job creators interact with workers, answer their questions, provide bonuses and even segment workers into those who excel and those who might need a little help. People want “satisfaction and appreciation in what they do,” Gates said.

But, he added, “If you treat it as a system — data in, data out — you won’t be happy with your results.”
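On Mechanical Turk itself, those human touches map onto concrete API operations. As a hedged sketch: send_bonus and notify_workers are real boto3 MTurk calls, while the worker scores, thresholds, IDs and message text below are invented for illustration.

```python
# Hypothetical segmentation pass: bonus the workers who excel, send a
# friendly nudge to those who may need a little help. The send_bonus and
# notify_workers operations are real; the data and thresholds are not.
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

scores = {"A1EXAMPLE": 0.96, "A2EXAMPLE": 0.55}  # worker_id -> gold accuracy
last_assignment = {"A1EXAMPLE": "3EXAMPLEASGN"}  # needed to attach a bonus

for worker_id, score in scores.items():
    if score >= 0.9:
        mturk.send_bonus(
            WorkerId=worker_id,
            AssignmentId=last_assignment[worker_id],
            BonusAmount="0.50",  # US dollars, passed as a string
            Reason="Consistently accurate work on this job. Thank you!",
        )
    elif score < 0.7:
        mturk.notify_workers(
            Subject="Tips for this labeling job",
            MessageText=(
                "A few of your recent answers missed the guidelines. "
                "Please re-read the instructions; happy to answer questions."
            ),
            WorkerIds=[worker_id],
        )
```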


Comments

  1. > This is where Alegion thinks it can make a difference. The company works quality-control processes (such as “gold answers”) into the workflow to ensure that work is accurate, and provides a method for scoring workers so users can build a workforce that’s ideal for a given task.

    OK, I reviewed the website, including http://alegion.com/index.php/our-platform, but I am not seeing that difference.

  2. Shell At Cloudmebaby Tuesday, December 11, 2012

    Using “gold answers” and custom qualifications is what long-term “turkers” have been saying for years. This little bit of added work does make a difference in the results received. Also, a little bit of communication with “turkers” goes quite a long way. Those who take the time to speak with “turkers” (through emails and/or on turker forums) are able to explain more in depth what they are looking for. In addition, the communication gives that little bit of a human touch that makes the workers want to complete more HITs for those requesters.

  3. Curious if this improves the creation of training data for human intelligence tasks such as the categorization of text or the labeling of images. Previous research has shown that crowdsourced tasks which seemed simple on the surface turned out to be challenging to design and execute, and that the results were mediocre.

  4. Terri Griffith Monday, January 21, 2013

    Great to see the discussion going well beyond the technology to the human & organizational practices.

  5. Chaoliang Colin Gu Sunday, January 27, 2013

    I don’t think all crowdsourced tasks will have “gold answers”.
