The annual SIGIR conference is the major international forum for the presentation of new research results and the demonstration of new systems and techniques in the broad field of information retrieval (IR). The 46th ACM SIGIR conference will be run as a hybrid conference – in person in Taiwan, with support for remote participation – from July 23rd to 27th, 2023. We cordially invite all those working in areas related to IR to submit reproducibility papers.

The Reproducibility Papers Track solicits papers that repeat, reproduce, generalize, and analyze prior work with a strong impact on information retrieval. The focus of this track is on generating new findings about established approaches, akin to a test of time. Submitted papers should analyze to what extent the assumptions of the original work hold up, and elaborate on error modes and unexpected conclusions.

We are particularly interested in reproducibility papers (different team, different experimental setup) rather than replicability papers (different team, same experimental setup). The emphasis is not on reproducibility badging, but on generating new research insights with existing approaches.

Reproducibility papers are considered equal to regular SIGIR papers. Accepted reproducibility papers will appear in the SIGIR 2023 main conference proceedings.

Important Dates

Time zone: Anywhere on Earth (AoE)
Reproducibility track papers due: February 21, 2023
Reproducibility track papers notifications: March 31, 2023
Camera Ready for accepted reproducibility papers: April 26, 2023

Submission Guidelines

Reproducibility track papers are expected to help establish whether prior research in IR is generalizable beyond the theoretical or experimental settings that the paper(s) being reproduced assume(s). Submissions are welcome on reproducibility in any area in IR. Papers submitted to the Reproducibility paper track must explain:

    • Their motivation for selecting the methods that are replicated or reproduced and the impact of these methods on the IR community;
    • The directions in which they try to generalize or choose angles different from the original work they reproduce, and the experimental setup(s) they select to support their research in these new directions;
    • The assumptions of the original work that they found to hold up, and the ones that could not be confirmed. For papers in the reproducibility track the key is to share knowledge about what lessons from prior work held up.

We expect authors to provide a comprehensive online appendix, with the code they developed, the data, and clear instructions for reproduction. The review process is double-blind, so personal or institutional repositories should not be used for the submission.

Reviewing Criteria

A dedicated reviewing committee will review submitted reproducibility track papers. The reviewing will be as rigorous as for regular SIGIR papers. However, the reviewing criteria will be different from the full paper track, with a strong emphasis on lessons learned, limitations, and generalizability.

Submissions will be evaluated using the following criteria:

    • Contribution: Does this work provide a novel angle on existing approaches and thereby lead to novel insights for the IR community?
      1. To a small extent
      2. To a moderate extent
      3. To a good extent
      4. To a great extent
    • Motivation: How relevant is the replicated or reproduced work for the IR community, and how impactful are the achieved conclusions?
      1. To a small extent
      2. To a moderate extent
      3. To a good extent
      4. To a great extent
    • Soundness: Is the replicated or reproduced paper sufficiently solid in terms of methodology and evaluation?
      1. To a small extent
      2. To a moderate extent
      3. To a good extent
      4. To a great extent
    • Quality of reproduction artifacts: Do the supplementary materials for this submission support ease of reproducibility?
      1. To a small extent
      2. To a moderate extent
      3. To a good extent
      4. To a great extent
    • Paper length: Is the contribution appropriate for the given length?
      1. The paper is too long for the contribution that it makes
      2. The length is just about right
      3. The paper is too dense
    • Overall evaluation (Summary of the final decision)
      1. Main strengths
      2. Main weaknesses
    • Summary: Does it hold up? In which way does this paper confirm the reviewer’s prior beliefs or provide new insights into what works and what doesn’t? What new insight did you gain?
    • Overall score:
      1. Reject [-2]
      2. Weak reject [-1]
      3. Borderline [0]
      4. Weak accept [1]
      5. Accept [2]
    • Reviewer confidence:
      1. None
      2. Low
      3. Medium
      4. High
      5. Expert


Submissions of reproducibility track papers must be written in English. The page length limit is flexible, up to 9 pages (including figures, tables, and proofs) plus unrestricted space for references. Paper length should be commensurate with the size of the contribution; authors should submit a paper whose length reflects what is needed for the content of the work. We invite reproducibility studies that were conducted as part of the research for a full paper submission; in that case, please respect the anonymity requirement of the regular submission, and note that the reproducibility submission must be a complete paper with its own distinct message and contribution.

The submission should be in PDF, using the current ACM two-column conference format. Suitable LaTeX, Word, and Overleaf templates are available from the ACM website (use the acmart.cls style file and set “sigconf=true,review=true” as options).
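For illustration, a minimal LaTeX preamble satisfying these options might look like the following sketch; the title and body text are placeholders, and author information is intentionally omitted for double-blind review:

```latex
% Minimal SIGIR-style submission skeleton (sketch, not an official template)
\documentclass[sigconf,review=true]{acmart}

\begin{document}

\title{Your Reproducibility Paper Title}
% No \author commands: the review process is double-blind

\begin{abstract}
Abstract text goes here.
\end{abstract}

\maketitle

Body of the paper.

\end{document}
```

The `review=true` option adds line numbers to aid reviewers; consult the acmart documentation for the full list of class options.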

Papers should be submitted electronically via EasyChair. Note that the reproducibility papers track must be selected in the submission form. The submission site is:

At least one author of an accepted paper is required to register for and present the work at the conference.

Anonymity and Authorship

The reproducibility paper review process is double-blind. Authors are required to take all reasonable steps to preserve the anonymity of their submission. The submission must not include author information and must not include citations or discussion of related work that would make the authorship apparent. However, it is acceptable to refer to companies or organizations that provided datasets, hosted experiments, or deployed solutions, as long as there is no implication that the authors are currently affiliated with these organizations. While authors may upload to institutional or other preprint repositories such as arXiv before reviewing is complete, we generally discourage this since it places anonymity at risk. If the paper is already on arXiv, please change the title and abstract so that it is not immediately obvious that the two versions are the same. Do not upload the paper to a preprint site after submission to SIGIR – wait until the review decision, to avoid reviewers seeing the paper in daily digests or elsewhere. Breaking anonymity puts the submission at risk of being desk rejected.

Linking to code is encouraged in reproducibility papers. However, simply sharing a GitHub link risks violating anonymity, as a repository can contain committer information. Authors are advised to take one of the following actions when sharing code in their paper:

– Use an anonymization tool for GitHub.

– Upload the code, with author information removed, to a cloud storage service and link to it.

Authors should carefully review ACM’s authorship policy before submitting a paper. Please ensure that all authors are clearly identified in EasyChair before the submission deadline. To support the identification of reviewers with conflicts of interest, the full author list must be specified at submission time. No changes to authorship will be permitted after submissions close or for the camera-ready version, under any circumstances, so please make sure the author list is correct when submissions close.

Reproducibility Paper Chairs

    • Gabriella Pasi, University of Milano-Bicocca
    • Jiafeng Guo, Institute of Computing Technology, CAS


For further information, please contact the SIGIR 2023 Reproducibility Co-chairs by email to