Embracing Error to Enable Rapid Crowdsourcing

02/14/2016
by Ranjay Krishna, et al.

Microtask crowdsourcing has enabled dataset advances in social science and machine learning, but existing crowdsourcing schemes are too expensive to scale up with the expanding volume of data. To scale and widen the applicability of crowdsourcing, we present a technique that produces extremely rapid judgments for binary and categorical labels. Rather than punishing all errors, which causes workers to proceed slowly and deliberately, our technique speeds up workers' judgments to the point where errors are acceptable and even expected. We demonstrate that it is possible to rectify these errors by randomizing task order and modeling response latency. We evaluate our technique on a breadth of common labeling tasks such as image verification, word similarity, sentiment analysis and topic classification. Where prior work typically achieves a 0.25x to 1x speedup over fixed majority vote, our approach often achieves an order of magnitude (10x) speedup.
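The core idea of rectifying rapid, error-prone judgments can be illustrated with a minimal sketch (not the authors' implementation): items are flashed in a randomized order, a worker's keypress is attributed to the item shown roughly one reaction delay earlier, and votes are then aggregated across workers. The display interval, the 0.4 s reaction delay, and the majority threshold below are all hypothetical parameters chosen for illustration.

```python
def attribute_presses(display_times, press_times, delay=0.4):
    """Attribute each keypress to the item shown ~`delay` seconds earlier.

    display_times: list of (item_id, time_shown) pairs.
    press_times:   list of keypress timestamps from one worker.
    Returns the set of item_ids this worker judged positive.
    """
    positives = set()
    for t_press in press_times:
        target = t_press - delay  # when the triggering item was likely on screen
        item_id, _ = min(display_times, key=lambda it: abs(it[1] - target))
        positives.add(item_id)
    return positives


def aggregate(worker_runs, n_items, threshold=0.5):
    """Majority vote over workers, each of whom saw a randomized item order."""
    votes = {i: 0 for i in range(n_items)}
    for positives in worker_runs:
        for i in positives:
            votes[i] += 1
    return {i for i, v in votes.items() if v / len(worker_runs) >= threshold}
```

For example, with items shown every 0.5 s, a keypress at t = 1.4 s is attributed (after subtracting the 0.4 s delay) to the item displayed at t = 1.0 s, even though the worker has already moved past it; randomized orders then decorrelate each worker's attribution errors before the vote.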


Related research

- 02/05/2023: Crowdsourcing Utilizing Subgroup Structure of Latent Factor Modeling ("Crowdsourcing has emerged as an alternative solution for collecting larg...")
- 01/11/2019: BUOCA: Budget-Optimized Crowd Worker Allocation ("Due to concerns about human error in crowdsourcing, it is standard pract...")
- 08/07/2017: T-Crowd: Effective Crowdsourcing for Tabular Data ("Crowdsourcing employs human workers to solve computer-hard problems, suc...")
- 08/31/2016: Dynamic Allocation of Crowd Contributions for Sentiment Analysis during the 2016 U.S. Presidential Election ("Opinions about the 2016 U.S. Presidential Candidates have been expressed...")
- 05/25/2016: Exact Exponent in Optimal Rates for Crowdsourcing ("In many machine learning applications, crowdsourcing has become the prim...")
- 10/23/2018: Working in Pairs: Understanding the Effects of Worker Interactions in Crowdwork ("Crowdsourcing has gained popularity as a tool to harness human brain pow...")
- 08/10/2011: Reputation-based Incentive Protocols in Crowdsourcing Applications ("Crowdsourcing websites (e.g. Yahoo! Answers, Amazon Mechanical Turk, and...")
