Uncertainty in Artificial Intelligence
Learning to Predict from Crowdsourced Data
Wei Bi, Liwei Wang, James Kwok, Zhuowen Tu
Abstract:
Crowdsourcing services like Amazon's Mechanical Turk have facilitated and greatly expedited manual labeling by a large number of human workers. However, spammers are often unavoidable and the crowdsourced labels can be very noisy. In this paper, we explicitly account for four sources of noise in a crowdsourced label: the worker's dedication to the task, his/her expertise, his/her default labeling judgement, and sample difficulty. A novel mixture model is employed for worker annotations, which learns a prediction model directly from samples to labels for efficient out-of-sample testing. Experiments on both simulated and real-world crowdsourced data sets show that the proposed method achieves significant improvements over the state-of-the-art.
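The four noise sources named in the abstract can be illustrated with a small generative simulation. This sketch is purely illustrative — the functional forms, parameter names (`dedication`, `expertise`, `default_label`, `difficulty`), and probabilities below are assumptions, not the paper's actual mixture model:

```python
import random

def crowd_label(true_label, dedication, expertise, default_label, difficulty):
    """Simulate one worker's binary label under the four noise sources
    listed in the abstract (illustrative form, not the paper's model)."""
    # An undedicated (spamming) worker ignores the sample and simply
    # emits his/her default labeling judgement.
    if random.random() > dedication:
        return default_label
    # A dedicated worker answers correctly with probability that grows
    # with expertise and shrinks with sample difficulty.
    p_correct = expertise * (1.0 - difficulty)
    if random.random() < p_correct:
        return true_label
    return 1 - true_label  # binary task: flip the label

random.seed(0)
# A diligent expert on an easy sample is usually right...
expert = [crowd_label(1, 0.95, 0.9, 0, 0.1) for _ in range(1000)]
# ...while a spammer mostly repeats the default label regardless of truth.
spammer = [crowd_label(1, 0.05, 0.5, 0, 0.1) for _ in range(1000)]
print(sum(expert) / 1000, sum(spammer) / 1000)
```

In this toy setup an inference method would have to disentangle low dedication from low expertise and high difficulty, which is the modeling problem the paper addresses.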
Pages: 82-91
PDF Link: /papers/14/p82-bi.pdf
BibTex:
@INPROCEEDINGS{Bi14,
AUTHOR = "Wei Bi and Liwei Wang and James Kwok and Zhuowen Tu",
TITLE = "Learning to Predict from Crowdsourced Data",
BOOKTITLE = "Proceedings of the Thirtieth Annual Conference on Uncertainty in Artificial Intelligence (UAI-14)",
PUBLISHER = "AUAI Press",
ADDRESS = "Corvallis, Oregon",
YEAR = "2014",
PAGES = "82--91"
}

