New Metrics to Encourage Innovation and Diversity in Information Retrieval Approaches
Information retrieval approach
State of the art
Springer Science and Business Media Deutschland GmbH
In evaluation campaigns, participants often explore variations of popular, state-of-the-art baselines as a low-risk strategy to achieve competitive results. While effective, this can lead to local “hill climbing” rather than a more radical and innovative departure from standard methods. Moreover, if many participants build on similar baselines, the overall diversity of approaches considered may be limited. In this work, we propose a new class of IR evaluation metrics intended to promote greater diversity of approaches in evaluation campaigns. Whereas traditional IR metrics focus on user experience, our two “innovation” metrics instead reward exploration of more divergent, higher-risk strategies that find relevant documents missed by other systems. Experiments on four TREC collections show that our metrics do change system rankings by rewarding systems that find such rare, relevant documents. This result is further supported by a controlled, synthetic-data experiment and a qualitative analysis. In addition, we show that our metrics achieve higher evaluation stability and discriminative power than the standard metrics we modify. To support reproducibility, we share our source code. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
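The core idea described in the abstract — crediting a system more for relevant documents that other systems miss — can be sketched as a toy rarity-weighted precision. This is a hypothetical illustration under assumed definitions, not the paper's actual metric; the names `rarity_weights` and `innovation_score`, and the particular weighting scheme, are invented here.

```python
from collections import Counter

def rarity_weights(runs, qrels):
    """Weight each relevant document by how rarely it is retrieved
    across all submitted runs (illustrative scheme, not the paper's).

    runs  -- list of ranked document-ID lists, one per system
    qrels -- set of document IDs judged relevant
    """
    n_runs = len(runs)
    counts = Counter(d for run in runs for d in run if d in qrels)
    # weight = fraction of systems that MISSED the document;
    # a doc every system finds gets weight 0, a unique find near 1.
    return {d: 1.0 - counts[d] / n_runs for d in counts}

def innovation_score(run, weights, qrels, k=10):
    """Toy rarity-weighted precision@k: sum the rarity weights of
    relevant documents in the top k, normalized by k."""
    return sum(weights.get(d, 1.0) for d in run[:k] if d in qrels) / k
```

For example, with two runs over relevant set `{"d1", "d3"}`, a run that retrieves the commonly found `d1` plus the uniquely found `d3` outscores one that retrieves only `d1`, even though both have the same precision. A real metric would likely smooth the weights so universally found documents still earn some credit.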
45th European Conference on Information Retrieval, ECIR 2023, 2 April 2023 through 6 April 2023
Appears in Collections:
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.