Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.11851/4030
Full metadata record
dc.contributor.author: Kutlu, Mücahid
dc.contributor.author: McDonnell, T.
dc.contributor.author: Elsayed, T. C.
dc.contributor.author: Lease, M.
dc.date.accessioned: 2021-01-25T11:28:54Z
dc.date.available: 2021-01-25T11:28:54Z
dc.date.issued: 2020
dc.identifier.citation: Kutlu, M., McDonnell, T., Lease, M., and Elsayed, T. (2020). Annotator Rationales for Labeling Tasks in Crowdsourcing. Journal of Artificial Intelligence Research, 69, 143-189. [en_US]
dc.identifier.issn: 1076-9757
dc.identifier.uri: https://hdl.handle.net/20.500.11851/4030
dc.identifier.uri: https://www.jair.org/index.php/jair/article/view/12012
dc.description.abstract: When collecting item ratings from human judges, it can be difficult to measure and enforce data quality due to task subjectivity and lack of transparency into how judges make each rating decision. To address this, we investigate asking judges to provide a specific form of rationale supporting each rating decision. We evaluate this approach on an information retrieval task in which human judges rate the relevance of Web pages for different search topics. Cost-benefit analysis over 10,000 judgments collected on Amazon’s Mechanical Turk suggests a win-win. Firstly, rationales yield a multitude of benefits: more reliable judgments, greater transparency for evaluating both human raters and their judgments, reduced need for expert gold, the opportunity for dual-supervision from ratings and rationales, and added value from the rationales themselves. Secondly, once experienced in the task, crowd workers provide rationales with almost no increase in task completion time. Consequently, we can realize the above benefits with minimal additional cost. [en_US]
dc.language.iso: en [en_US]
dc.publisher: AI Access Foundation [en_US]
dc.relation.ispartof: Journal of Artificial Intelligence Research [en_US]
dc.rights: info:eu-repo/semantics/openAccess [en_US]
dc.subject: Crowdsourcing [en_US]
dc.subject: Turks [en_US]
dc.subject: Task Assignment [en_US]
dc.title: Annotator rationales for labeling tasks in crowdsourcing [en_US]
dc.type: Article [en_US]
dc.department: Faculties, Faculty of Engineering, Department of Computer Engineering [en_US]
dc.department: Fakülteler, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü [tr_TR]
dc.identifier.volume: 69
dc.identifier.startpage: 143
dc.identifier.endpage: 189
dc.authorid: 0000-0002-5660-4992
dc.identifier.wos: WOS:000606811900004 [en_US]
dc.identifier.scopus: 2-s2.0-85091935817 [en_US]
dc.institutionauthor: Kutlu, Mücahid
dc.identifier.doi: 10.1613/jair.1.12012
dc.relation.publicationcategory: Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı (Article - International Peer-Reviewed Journal - Institutional Faculty Member) [en_US]
dc.relation.other: We thank the many talented crowd workers whose annotations enabled our research. We also thank our reviewers, at both HCOMP and JAIR, for their helpful feedback and suggestions. This work was made possible by generous support from NPRP grant # NPRP 7-1313-1-245 from the Qatar National Research Fund (a member of Qatar Foundation), the National Science Foundation (grant No. 1253413), the Micron Foundation, and UT Austin’s Good Systems Grand Challenge Initiative to design a future of responsible AI (http://goodsystems.utexas.edu). Any opinions, findings, and conclusions or recommendations expressed by the authors are entirely their own and do not represent those of the sponsoring agencies. [en_US]
dc.identifier.scopusquality: Q2
item.cerifentitytype: Publications
item.languageiso639-1: en
item.openairecristype: http://purl.org/coar/resource_type/c_18cf
item.openairetype: Article
item.fulltext: No Fulltext
item.grantfulltext: none
crisitem.author.dept: 02.3. Department of Computer Engineering
Appears in Collections:
Bilgisayar Mühendisliği Bölümü / Department of Computer Engineering
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Scopus™ citations: 1 (checked on Apr 13, 2024)
Web of Science™ citations: 10 (checked on Jan 20, 2024)
Page views: 124 (checked on Apr 15, 2024)

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.