Evaluating the Crowd with Confidence

Manas Joglekar, Hector Garcia-Molina, Aditya Parameswaran

Worker quality control is a crucial aspect of crowdsourcing systems, typically occupying a large fraction of the time and money invested in crowdsourcing. In this work, we devise techniques to generate confidence intervals for worker error rate estimates, thereby enabling a better evaluation of worker quality. We show that our techniques generate correct confidence intervals on a range of real-world datasets, and we demonstrate their wide applicability by using them to evict poorly performing workers and to provide confidence intervals on the accuracy of answers.
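
The abstract does not spell out the paper's interval construction, but as a point of reference, the following is a minimal sketch of one standard way to put a confidence interval on a single worker's error rate: a Wilson score interval computed from the worker's performance on tasks with known gold answers. The function name and parameters are illustrative assumptions, not the authors' method.

    import math

    def wilson_interval(errors: int, total: int, z: float = 1.96) -> tuple[float, float]:
        """Approximate 95% confidence interval for a worker's error rate,
        given `errors` wrong answers out of `total` gold-standard tasks."""
        if total == 0:
            return (0.0, 1.0)  # no evidence: the rate could be anything
        p = errors / total                 # observed error rate
        denom = 1 + z * z / total
        center = (p + z * z / (2 * total)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / total + z * z / (4 * total * total))
        return (max(0.0, center - half), min(1.0, center + half))

    # Example: a worker who answered 12 of 50 gold questions incorrectly.
    low, high = wilson_interval(errors=12, total=50)
    print(f"error rate in [{low:.3f}, {high:.3f}] with ~95% confidence")

An eviction policy in the spirit the abstract describes might remove a worker only once the interval's lower bound exceeds a tolerated error rate, so that workers are evicted when the evidence of poor performance is strong rather than on a noisy point estimate.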
