Over the last week or so, I've written about several pain points for requesters using Amazon's Mechanical Turk. Most of them come down to issues of trust, learning, and communication -- very human problems.
I speculate that the problem is one of philosophy. The design and advertising of the system both suggest that the designers think of turkers as interchangeable computation nodes -- they certainly treat them that way. The market was not designed to take into account differences in skills among workers, strategizing by workers, or relationships between workers and requesters. It was built -- intentionally -- to be faceless*.
Essentially, Amazon intended mturk to be like the EC2 spot instance market for human piece work: cheap computation on demand. An API to human intelligence. It's a clever, groundbreaking idea, but the analogy only holds to a point. Unlike instances of cloud machines, turkers are human beings. They have different skills and interests. They learn as they go along. They sometimes cheat if you let them. Treating them as faceless automata only works for the simplest of tasks.
A better, more human approach to crowdsourcing would acknowledge that people are different. It would take seriously the problems of motivation, ability, and trust that arise in work relationships. Providing tools for training, evaluation, and communication---plus fixing a few spots where the market is broken---would be a good start.
Let me finish by waving the entrepreneurial banner. I'm convinced there's a huge opportunity here, and I'm not alone. Mturk is version 1.0 of online crowdsourcing, but it would be silly to argue that crowdsourcing will never get better**. What's next? Do you want to work on it together?
* There's a whole ecosystem of companies and tools around mturk (e.g. Crowdflower, Houdini). I haven't explored this space thoroughly, but my read is that they're pretty happy with the way mturk is run. They like the facelessness of it. Even Panos Ipeirotis, whose work I like, seems to have missed a lot of these points -- he focuses on things like scale, API simplicity, and accuracy. Maybe I'm missing out on the bright new stars, though. Do you know of teams that have done more of the humanizing that I'm arguing for?
** Circa 530,000,000 BC: "Behold, the lungfish! The last word in land-going animal life!"
A recent article in the Economist discusses human computation. It gives a few examples of companies that try to improve on Amazon, like CastingWords, which calls its workers "Turkers 2.0", or CrowdForge, software to facilitate crowdsourcing.