Abstract
Replicability and reproducibility of experimental results are primary concerns in all areas of science, and IR is no exception. Besides the problem of moving the field towards more reproducible experimental practices and protocols, we also face a severe methodological issue: we lack any means to assess when an experiment has actually been replicated or reproduced. Moreover, we lack any reproducibility-oriented dataset, which would allow us to develop such methods. To address these issues, we compare several measures that objectively quantify to what extent a system-oriented IR experiment has been replicated or reproduced. These measures operate at different levels of granularity, from the fine-grained comparison of ranked lists to the more general comparison of the obtained effects and significant differences. Moreover, we also develop a reproducibility-oriented dataset, which allows us to validate our measures and which can also be used to develop future measures.
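To make the idea of "levels of granularity" concrete, below is a minimal, illustrative sketch in Python of comparing a reproduced run against an original run at two such levels: a fine-grained rank correlation between ranked lists, and a coarse-grained check of whether a significant difference between two systems is preserved. This is not the paper's actual set of measures; all function names, the depth cutoff, and the 0.05 threshold are assumptions made for illustration.

```python
# Illustrative sketch only (not the paper's exact measures) of comparing a
# reproduced IR run against an original run at two levels of granularity.
# Assumes ranked lists are Python lists of doc ids and per-topic scores are
# aligned lists of floats; all names here are hypothetical.

from scipy.stats import kendalltau, ttest_rel


def rank_correlation(orig_ranking, repro_ranking, depth=100):
    """Fine-grained level: Kendall's tau over the documents that both
    ranked lists retrieve within the given depth cutoff."""
    orig = orig_ranking[:depth]
    repro = repro_ranking[:depth]
    common = [d for d in orig if d in repro]
    if len(common) < 2:
        return float("nan")  # too little overlap to correlate
    tau, _ = kendalltau([orig.index(d) for d in common],
                        [repro.index(d) for d in common])
    return tau


def effect_preserved(orig_a, orig_b, repro_a, repro_b, alpha=0.05):
    """Coarse-grained level: does the reproduced experiment show the same
    (non-)significant difference between systems A and B as the original?
    Uses a paired t-test over per-topic scores."""
    _, p_orig = ttest_rel(orig_a, orig_b)
    _, p_repro = ttest_rel(repro_a, repro_b)
    return (p_orig < alpha) == (p_repro < alpha)
```

The two functions deliberately answer different questions: the first asks whether the reproduced system returns the same documents in the same order, while the second ignores the rankings entirely and only asks whether the original experimental conclusion survives.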
Original language | English |
---|---|
Title of host publication | SIGIR '20: Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval |
Number of pages | 10 |
Publisher | Association for Computing Machinery |
Publication date | 2020 |
Pages | 349-358 |
ISBN (Electronic) | 978-1-4503-8016-4 |
DOIs | |
Publication status | Published - 2020 |
Event | 43rd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2020 - Virtual, Online, China. Duration: 25 Jul 2020 → 30 Jul 2020 |
Conference
Conference | 43rd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2020 |
---|---|
Country/Territory | China |
City | Virtual, Online |
Period | 25/07/2020 → 30/07/2020 |
Sponsor | ACM Special Interest Group on Information Retrieval (SIGIR) |
Bibliographical note
Publisher Copyright: © 2020 ACM.
Keywords
- measure
- replicability
- reproducibility