ESOP, FASE and FoSSaCS Artifact Evaluation

Information on submission and evaluation of artifacts for the ESOP, FASE and FoSSaCS conferences.

Call for Artifacts

ESOP, FASE and FoSSaCS 2026 will hold a joint, voluntary artifact evaluation after paper acceptance. The outcome will not affect the paper acceptance decision.

Important Dates

  • Artifact submission deadline: January 8, 2026
  • Author response period: January 16–21, 2026
  • Notification to authors: February 12, 2026

Packaging

We encourage authors to read the HOWTO for AEC Submitters and the guidelines for submitting and reviewing proof artifacts.

To avoid installation problems (e.g., due to software dependencies) during artifact review, we require that authors provide their artifacts as Docker or VM (OVF/OVA) images, or as a package for the TACAS’23 Artifact Evaluation Virtual Machine. If you think your artifact needs to be submitted in a different format, please contact the AEC chairs.

The AEC members should be able to evaluate your artifact quickly. For longer runs, please try to show progress messages (e.g., completion percentage). If your artifact takes several days to run or requires special hardware, please contact the AEC chairs.
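The kind of progress reporting suggested above can be sketched as follows (a hypothetical Python example; `run_experiments` and the task list are illustrative, not part of these guidelines):

```python
import sys

def run_experiments(tasks):
    """Run each task in order, printing a completion percentage so
    reviewers can see that a long-running artifact is making progress."""
    results = []
    for i, task in enumerate(tasks, start=1):
        results.append(task())
        # Carriage return keeps the indicator on a single terminal line.
        print(f"\r{100 * i // len(tasks)}% complete ({i}/{len(tasks)})",
              end="", file=sys.stderr)
    print(file=sys.stderr)
    return results
```

Printing to standard error keeps the indicator separate from any experiment output that reviewers may want to capture or compare.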

To keep the artifact self-contained, please avoid downloading content over the internet during experiments or tests, and avoid closed-source software libraries, frameworks, operating systems, and container formats unless they are necessary for your submitted work. If possible, use widely supported file formats for the artifact (e.g., .zip or .tar.gz for archives, .odt or .pdf for documents, and CSV or JSON for data).

Every artifact submission must include:

  • a main README file consisting of two parts:

    • A Getting Started guide
    • Step-by-step instructions

    The Getting Started guide should contain an artifact description, installation instructions (if any), and a method to test the installation (a “smoke test”). This could be, for instance, a command to confirm that the code is installed and working, and its expected output. Reviewers should be able to complete the Getting Started guide within 30 minutes.

    The Step-by-step instructions should contain detailed reproduction steps for any experiments or activities supporting the paper conclusions. You should state all paper claims supported by the artifact (and how), as well as all paper claims not supported by the artifact (and why). Depending on your artifact’s nature, the instructions may differ:

    • Data artifacts should cover aspects relevant to understanding the context, data provenance, ethical and legal statements (if relevant), and storage requirements.
    • Software artifacts should contain instructions on how to reproduce the paper claims (e.g., tables/figures), and documentation on how to use the tool.
    • Proof artifacts should cover how the individual parts of the proof relate to formalisms presented in the paper.

    Beyond the scenarios above, which reproduce the paper’s results, we encourage you to include instructions for running the artifact on other experiments, as well as documentation of the artifact’s code and layout.

  • a REQUIREMENTS file covering the architecture for which your artifact was packaged (e.g., x86, ARM) and hardware/software requirements (e.g., storage or non-commodity peripherals, Docker, VM, and OS). We encourage you to also include machine-readable files describing dependencies (e.g., Dockerfile, Pipfile, dune-project), if relevant.

  • a STATUS file stating the badge(s) you are applying for, as well as a short justification of why the artifact deserves the respective badge(s).

  • a LICENSE file describing the terms of use and distribution rights.

All files must be submitted as plain text (e.g., txt, md) or PDF within the artifact.
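As a sanity check before submitting, the presence of the required top-level files can be verified with a short script (a hypothetical sketch; the file names and extension handling are assumptions to adapt to your artifact):

```python
from pathlib import Path

# Required top-level files, matched by base name regardless of extension
# (e.g., README.md, README.txt, or a bare LICENSE all count).
REQUIRED = ["README", "REQUIREMENTS", "STATUS", "LICENSE"]

def missing_files(artifact_dir):
    """Return the required base names with no matching file in artifact_dir."""
    present = {p.stem.upper() for p in Path(artifact_dir).iterdir() if p.is_file()}
    return [name for name in REQUIRED if name not in present]
```

Running `missing_files` on the unpacked artifact directory and confirming it returns an empty list is a quick way to catch a forgotten STATUS or LICENSE file before upload.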

Submission

Submit your artifact through the ETAPS-hosted HotCRP system.

Please host your artifact on a platform that does not track IP addresses, so as not to undermine reviewer anonymity. We recommend Zenodo, FigShare, or Dryad.

Evaluation Phases

  • Kick-the-tires: Reviewers check the artifact’s packaging and integrity, and identify any setup problems that could prevent the artifact from being properly evaluated. After this initial phase, there will be an author-response period during which authors and reviewers can communicate freely and directly via HotCRP, to resolve any technical issues that would prevent an in-depth evaluation of the artifact.
  • Artifact assessment: Reviewers check whether the artifacts live up to the claims made in the accompanying documentation. Based on their reports and the ensuing AEC discussion, the artifact may be awarded one or two badges (see below).
  • Publication: Once the artifact has been evaluated, the authors are notified about the outcome, and the publication chairs are notified of any badges to be added to the paper.

Evaluation Criteria and Outcome

An artifact may be awarded one of the following badges, in accordance with the ETAPS artifact badging guidelines.

Functional: The artifact is found to be documented, consistent, complete, exercisable, and to include appropriate evidence of verification and validation.

Reusable: The artifact is of a quality that significantly exceeds minimal functionality. That is, it has all the qualities of the Functional level, but in addition it is very carefully documented and well-structured to the extent that reuse and repurposing are facilitated.

Irrespective of the artifact evaluation outcome, artifacts may be awarded the “Available” badge.

Available: Author-created artifacts relevant to this paper have been placed on a publicly accessible archival repository, and a DOI for the object is provided. Repositories used to archive data should have a declared plan for permanent accessibility (e.g., Zenodo, FigShare, or Dryad). Note that in order to award the Available badge, the same DOI must appear both in the artifact evaluation and in the camera-ready version of your paper. We recommend adding the DOI link in a dedicated data availability statement at the end of your paper.

Artifacts that go beyond expectations of quality will receive a Distinguished Artifact award. The selection procedure will be based on review scores and feedback from the AEC.

Frequently Asked Questions

  • Why do an artifact evaluation?
    We want to encourage authors to provide more substantial evidence for their papers, and to reward those who create research artifacts. At the same time, we want to simplify the reproduction of the results presented in the paper and ease future comparison with existing approaches. Hence, ESOP, FASE, and FoSSaCS offer an optional artifact evaluation for accepted papers. To ensure that submitted artifacts provide the best possible value to the community, our goal is to be constructive and to help improve them.

  • What qualifies as an artifact?
    Artifacts include (but are not limited to) software, tools, frameworks, datasets, test suites, machine-checkable proofs, or any combination of the above. We will assess the artifacts themselves, and not the quality of the research that produced them (which has been assessed by the conference PC).

Artifact Evaluation Committee

AEC Chairs

AEC Members

to be announced