Call for Reproducibility Track Papers

The Reproducibility Papers Track solicits papers that repeat, reproduce, generalize, or analyze prior work that has had a strong impact on information retrieval. The focus of this track is on generating new findings about established approaches, akin to a test of time. Submitted papers should analyze to what extent the assumptions of the original work hold up, and elaborate on error modes and unexpected conclusions.

We are particularly interested in reproducibility papers (different team, different experimental setup) rather than replicability papers (different team, same experimental setup). The emphasis is not on reproducibility badging, but on generating new research insights with existing approaches.

Reproducibility papers are considered equal to regular SIGIR papers. Accepted reproducibility papers will appear in the SIGIR 2022 main conference proceedings.

Important dates

Time zone: Anywhere on Earth (AoE)

Reproducibility track papers due: Monday, February 14, 2022

Reproducibility track papers notifications: Thursday, March 31, 2022

Submission Guidelines

Reproducibility track papers are expected to help establish whether prior research in IR is generalizable beyond the theoretical or experimental settings that the paper(s) being reproduced assume(s). Submissions are welcome on reproducibility in any area in IR. Papers submitted to the Reproducibility paper track must explain:

  1. Their motivation for selecting the methods that are replicated or reproduced and the impact of these methods on the IR community;
  2. The directions in which they aim to generalize, the angles they choose that differ from the original work they reproduce, and the experimental setup(s) they select to support their research in these new directions;
  3. The assumptions of the original work that they found to hold up, and those that could not be confirmed. For papers in the reproducibility track, the key is to share knowledge about which lessons from prior work held up.

We expect authors to provide a comprehensive online appendix, with code, data, and clear instructions for reproduction. The review process is single-blind so that personal or institutional repositories can be used for the submission.

Reviewing Criteria

A dedicated reviewing committee will review submitted reproducibility track papers. The reviewing will be as rigorous as for regular SIGIR papers. However, the reviewing criteria will be different from the full paper track, with a strong emphasis on lessons learned, limitations, and generalizability.

Submissions will be evaluated using the following criteria:

  1. Contribution: Does this work provide a novel angle on existing approaches and thereby lead to new insights for the IR community?
    1. To a small extent
    2. To a moderate extent
    3. To a good extent
    4. To a great extent
  2. Motivation: How relevant is the replicated or reproduced work for the IR community, and how impactful are the achieved conclusions?
    1. To a small extent
    2. To a moderate extent
    3. To a good extent
    4. To a great extent
  3. Soundness: Is the replicated or reproduced paper sufficiently solid in terms of methodology and evaluation?
    1. To a small extent
    2. To a moderate extent
    3. To a good extent
    4. To a great extent
  4. Quality of reproduction artifacts: Do the supplementary materials for this submission support ease of reproducibility?
    1. To a small extent
    2. To a moderate extent
    3. To a good extent
    4. To a great extent
  5. Paper length: Is the contribution appropriate for the given length?
    1. The paper is too long for the contribution that it makes
    2. The length is just about right
    3. The paper is too dense
  6. Overall evaluation (Summary of the final decision)
    1. Main strengths
    2. Main weaknesses
  7. Summary: Does it hold up? In which way does this paper confirm the reviewer’s prior beliefs or provide new insights into what works and what doesn’t? What new insight did you gain?
  8. Overall score:
    1. Reject [-2]
    2. Weak reject [-1]
    3. Borderline [0]
    4. Weak accept [1]
    5. Accept [2]
  9. Reviewer confidence:
    1. None
    2. Low
    3. Medium
    4. High
    5. Expert

Submissions of reproducibility track papers must be written in English. The page limit is flexible, allowing up to 9 pages (including figures, tables, and proofs) plus unrestricted space for references. Paper length should be commensurate with the size of the contribution: authors should submit a paper whose length reflects what is needed for the content of the work. We also invite reproducibility studies that were conducted as part of the research for a full paper submission; in that case, please respect the anonymity requirements of the regular submission, and note that the reproducibility paper must be a complete paper with its own distinct message and contribution.

The submission should be in PDF, using the current ACM two-column conference format. Suitable LaTeX, Word, and Overleaf templates are available from the ACM website (use the acmart.cls style file and set “sigconf=true,review=true” as options).

Papers should be submitted electronically via EasyChair. Note that the reproducibility papers track must be selected in the submission form. The submission site is:

At least one author of an accepted paper is required to register for and present the work at the conference.

Anonymity and authorship

The review process for reproducibility track papers is single blind, so the reviewers know the identity of the author(s). In this manner, personal or institutional repositories can be used for the submission. Authors should carefully go through ACM’s authorship policy before submitting a paper. To identify reviewers with conflicts of interest (COIs), we require that the complete author list be specified at submission time. No changes to authorship will be permitted after the submission deadline under any circumstance.

Reproducibility Track Chairs

  • Laura Dietz, University of New Hampshire
  • Maarten de Rijke, University of Amsterdam


For further information, please contact the SIGIR 2022 Reproducibility Track Co-chairs by email to sigir22-reproducibility AT easychair DOT org