repro_eval: A Python Interface to Reproducibility Measures of System-Oriented IR Experiments

Timo Breuer*, Nicola Ferro, Maria Maistro, Philipp Schaer

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

13 Citations (Scopus)
15 Downloads (Pure)

Abstract

In this work we introduce repro_eval - a tool for reactive reproducibility studies of system-oriented Information Retrieval (IR) experiments. The corresponding Python package provides IR researchers with measures for different levels of reproduction when evaluating their systems’ outputs. By offering an easily extensible interface, we hope to stimulate common practices when conducting a reproducibility study of system-oriented IR experiments.
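The package's measures compare an original run with its reproduced counterpart at different levels, e.g. at the level of document orderings. As a hedged illustration of this idea (not the actual repro_eval API), the sketch below computes Kendall's tau between the rankings of a hypothetical original and reproduced run, placing documents that appear in only one run at the bottom:

```python
from itertools import combinations

def kendall_tau(ranks_a, ranks_b):
    """Kendall's tau-a between two equal-length rank lists
    (tied pairs count as neither concordant nor discordant)."""
    n = len(ranks_a)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        s = (ranks_a[i] - ranks_a[j]) * (ranks_b[i] - ranks_b[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

def ktau_union(orig_run, repro_run):
    """Ordering-level comparison over the union of ranked documents.
    Documents missing from a run share the bottom rank."""
    docs = sorted(set(orig_run) | set(repro_run))
    def ranks(run):
        pos = {doc: i for i, doc in enumerate(run)}
        return [pos.get(doc, len(run)) for doc in docs]
    return kendall_tau(ranks(orig_run), ranks(repro_run))

# Identical orderings yield 1.0, fully reversed orderings yield -1.0.
print(ktau_union(["d1", "d2", "d3"], ["d1", "d2", "d3"]))  # → 1.0
print(ktau_union(["d1", "d2", "d3"], ["d3", "d2", "d1"]))  # → -1.0
```

The function names and run representation here are assumptions for illustration; the released package wraps such measures behind its own evaluator interface.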

Original language: English
Title of host publication: Advances in Information Retrieval - 43rd European Conference on IR Research, ECIR 2021, Proceedings
Editors: Djoerd Hiemstra, Marie-Francine Moens, Josiane Mothe, Raffaele Perego, Martin Potthast, Fabrizio Sebastiani
Number of pages: 6
Publisher: Springer
Publication date: 2021
Pages: 481-486
ISBN (Print): 9783030722395
DOIs
Publication status: Published - 2021
Event: 43rd European Conference on Information Retrieval, ECIR 2021 - Virtual, Online
Duration: 28 Mar 2021 - 1 Apr 2021

Conference

Conference: 43rd European Conference on Information Retrieval, ECIR 2021
City: Virtual, Online
Period: 28/03/2021 - 01/04/2021
Series: Lecture Notes in Computer Science
Volume: 12657 LNCS
ISSN: 0302-9743

Bibliographical note

Publisher Copyright:
© 2021, Springer Nature Switzerland AG.

Keywords

  • Evaluation
  • Replicability
  • Reproducibility
