Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.11851/4030
Title: Annotator rationales for labeling tasks in crowdsourcing
Authors: Kutlu, Mücahid
McDonnell, T.
Elsayed, T.
Lease, M.
Keywords: Crowdsourcing; Turks; Task Assignment
Publisher: AI Access Foundation
Source: Kutlu, M., McDonnell, T., Lease, M., and Elsayed, T. (2020). Annotator Rationales for Labeling Tasks in Crowdsourcing. Journal of Artificial Intelligence Research, 69, 143-189.
Abstract: When collecting item ratings from human judges, it can be difficult to measure and enforce data quality due to task subjectivity and lack of transparency into how judges make each rating decision. To address this, we investigate asking judges to provide a specific form of rationale supporting each rating decision. We evaluate this approach on an information retrieval task in which human judges rate the relevance of Web pages for different search topics. Cost-benefit analysis over 10,000 judgments collected on Amazon’s Mechanical Turk suggests a win-win. Firstly, rationales yield a multitude of benefits: more reliable judgments, greater transparency for evaluating both human raters and their judgments, reduced need for expert gold, the opportunity for dual-supervision from ratings and rationales, and added value from the rationales themselves. Secondly, once experienced in the task, crowd workers provide rationales with almost no increase in task completion time. Consequently, we can realize the above benefits with minimal additional cost.
URI: https://hdl.handle.net/20.500.11851/4030
https://www.jair.org/index.php/jair/article/view/12012
ISSN: 1076-9757
Appears in Collections:Bilgisayar Mühendisliği Bölümü / Department of Computer Engineering
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
