Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.11851/4038
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Rahman, M. M.
dc.contributor.author: Kutlu, Mücahid
dc.contributor.author: Elsayed, T.
dc.contributor.author: Lease, M.
dc.date.accessioned: 2021-01-25T11:28:55Z
dc.date.available: 2021-01-25T11:28:55Z
dc.date.issued: 2020-09
dc.identifier.citation: Rahman, M. M., Kutlu, M., Elsayed, T., and Lease, M. (2020, September). Efficient test collection construction via active learning. In Proceedings of the 2020 ACM SIGIR on International Conference on Theory of Information Retrieval (pp. 177-184). [en_US]
dc.identifier.isbn: 978-145038067-6
dc.identifier.uri: https://hdl.handle.net/20.500.11851/4038
dc.identifier.uri: https://dl.acm.org/doi/10.1145/3409256.3409837
dc.description.abstract: To create a new IR test collection at low cost, it is valuable to carefully select which documents merit human relevance judgments. Shared task campaigns such as NIST TREC pool document rankings from many participating systems (and often interactive runs as well) in order to identify the most likely relevant documents for human judging. However, if one's primary goal is merely to build a test collection, it would be useful to be able to do so without needing to run an entire shared task. Toward this end, we investigate multiple active learning strategies which, without reliance on system rankings: 1) select which documents human assessors should judge; and 2) automatically classify the relevance of additional unjudged documents. To assess our approach, we report experiments on five TREC collections with varying scarcity of relevant documents. We report labeling accuracy achieved, as well as rank correlation when evaluating participant systems based upon these labels vs. full pool judgments. Results show the effectiveness of our approach, and we further analyze how varying relevance scarcity across collections impacts our findings. To support reproducibility and follow-on work, we have shared our code online: https://github.com/mdmustafizurrahman/ICTIR_AL_TestCollection_2020/. © 2020 ACM. [en_US] (An illustrative code sketch of this active-learning loop follows the collection listing below.)
dc.language.iso: en [en_US]
dc.publisher: Association for Computing Machinery [en_US]
dc.relation.ispartof: ICTIR 2020 - Proceedings of the 2020 ACM SIGIR International Conference on Theory of Information Retrieval [en_US]
dc.rights: info:eu-repo/semantics/openAccess [en_US]
dc.subject: Active learning [en_US]
dc.subject: evaluation [en_US]
dc.subject: information retrieval [en_US]
dc.subject: test collections [en_US]
dc.title: Efficient Test Collection Construction Via Active Learning [en_US]
dc.type: Conference Object [en_US]
dc.department: Faculties, Faculty of Engineering, Department of Computer Engineering [en_US]
dc.department: Fakülteler, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü [tr_TR]
dc.identifier.startpage: 177
dc.identifier.endpage: 184
dc.authorid: 0000-0002-5660-4992
dc.identifier.scopus: 2-s2.0-85093118866 [en_US]
dc.institutionauthor: Kutlu, Mücahid
dc.identifier.doi: 10.1145/3409256.3409837
dc.relation.publicationcategory: Conference Item - International - Institutional Faculty Member [en_US]
dc.relation.other: Acknowledgements. We thank the reviewers for their valuable feedback. This work is supported in part by the Qatar National Research Fund (grant # NPRP 7-1313-1-245), the Micron Foundation, Wipro, and by Good Systems, a UT Austin Grand Challenge to develop responsible AI technologies. The statements made herein are solely the responsibility of the authors. [en_US]
item.openairetype: Conference Object
item.languageiso639-1: en
item.grantfulltext: none
item.fulltext: No Fulltext
item.openairecristype: http://purl.org/coar/resource_type/c_18cf
item.cerifentitytype: Publications
crisitem.author.dept: 02.3. Department of Computer Engineering
Appears in Collections:Bilgisayar Mühendisliği Bölümü / Department of Computer Engineering
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
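
Illustrative sketch (not part of the record): the abstract above describes an active-learning loop that 1) picks which pooled documents human assessors should judge and 2) automatically classifies the relevance of the remaining unjudged documents. The code below is not the authors' implementation (their code is at the GitHub link in the record); it is a minimal, hypothetical illustration of that idea using uncertainty sampling with a TF-IDF plus logistic-regression relevance classifier as an assumed stand-in model. The function name build_judgments, the ask_human oracle, and the budget/batch_size parameters are invented for illustration.

# Minimal, hypothetical sketch of active-learning-based judgment collection.
# NOT the authors' implementation; the model and helper names are assumptions.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression


def build_judgments(documents, ask_human, seed_labels, budget, batch_size=10):
    """Select documents for human judging via uncertainty sampling, then
    automatically label whatever remains unjudged.

    documents   : list of document texts for one topic (the judgment pool)
    ask_human   : callable(doc_index) -> 0/1, stands in for a human assessor
    seed_labels : dict {doc_index: 0/1}; must contain both classes
    budget      : maximum number of human judgments to spend
    """
    X = TfidfVectorizer().fit_transform(documents)
    labels = dict(seed_labels)                 # doc index -> human judgment
    clf = LogisticRegression(max_iter=1000)

    while len(labels) < budget:
        judged = sorted(labels)
        clf.fit(X[judged], [labels[i] for i in judged])

        unjudged = [i for i in range(len(documents)) if i not in labels]
        if not unjudged:
            break

        # Uncertainty sampling: judge the documents whose predicted
        # probability of relevance is closest to 0.5.
        probs = clf.predict_proba(X[unjudged])[:, 1]
        order = np.argsort(np.abs(probs - 0.5))  # most uncertain first
        for idx in order[:batch_size]:
            labels[unjudged[idx]] = ask_human(unjudged[idx])
            if len(labels) >= budget:
                break

    # Automatically classify the documents no human ever judged.
    judged = sorted(labels)
    clf.fit(X[judged], [labels[i] for i in judged])
    unjudged = [i for i in range(len(documents)) if i not in labels]
    auto_labels = dict(zip(unjudged, clf.predict(X[unjudged]))) if unjudged else {}
    return labels, auto_labels

In the paper's setting, the human judgments plus the automatic labels would then serve as qrels for scoring participant systems; the rank correlation mentioned in the abstract compares that system ranking against one obtained from full pool judgments.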
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.