ESOP, FASE and FoSSaCS Artifact Evaluation

Information on submission and evaluation of artifacts for the ESOP, FASE and FoSSaCS conferences.

Call for Artifacts

ESOP, FASE and FoSSaCS 2025 will hold a joint, voluntary artifact evaluation after paper acceptance. The outcome will not alter the paper acceptance decision.

Important Dates

  • Artifact submission deadline: January 9th, 2025
  • Author response period: January 16th – 19th, 2025
  • Notification to authors: February 13th, 2025

Packaging

We encourage authors to read the HOWTO for AEC Submitters and the guidelines for submitting and reviewing proof artifacts.

To avoid installation problems (e.g., due to software dependencies) during artifact review, we require that authors provide their artifacts as Docker images, as VM (OVF/OVA) images, or as packages for the TACAS’23 Artifact Evaluation Virtual Machine. If you think that your artifact needs to be submitted in a different format, please contact the AEC chairs.
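
For illustration, a reviewer with a stock Docker installation might load and inspect a Docker-packaged artifact roughly as follows; the archive and image names here are hypothetical placeholders, not a required layout:

    # Load the submitted image into the local Docker daemon
    docker load -i artifact-image.tar.gz
    # Start an interactive shell inside the artifact environment
    docker run -it --rm artifact:latest /bin/bash

If your artifact deviates from such a standard workflow, document the exact commands in the README.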

The AEC members should be able to evaluate your artifact quickly. For longer runs, please try to show progress messages (e.g., completion percentage). If your artifact takes several days to run or requires special hardware, please contact the AEC chairs.

Please avoid downloading content over the internet during experiments or tests (to keep the artifact self-contained), and avoid closed-source software libraries, frameworks, operating systems, and container formats unless they are necessary for your submitted work. Where possible, use widely supported file formats for the artifact (e.g., .zip or .tar.gz for archives, .odt or .pdf for documents, and CSV or JSON for data).

Every artifact submission must include:

  • a main README file consisting of two parts:

    • A Getting Started guide
    • Step-by-step instructions

    The Getting Started guide should contain an artifact description, installation instructions (if any), and a method to test the installation (a “smoke test”). This could be, for instance, a command to confirm that the code is installed and working, together with its expected output; a minimal sketch appears after this list. Reviewers should be able to complete the Getting Started guide within 30 minutes.

    The Step-by-step instructions should contain detailed reproduction steps for any experiments or activities supporting the paper conclusions. You should state all paper claims supported by the artifact (and how), as well as all paper claims not supported by the artifact (and why). Depending on your artifact’s nature, the instructions may differ:

    • Data artifacts should cover aspects relevant to understanding the context, data provenance, ethical and legal statements (if relevant), and storage requirements.
    • Software artifacts should contain instructions on how to reproduce the paper claims (e.g., tables/figures), and documentation on how to use the tool.
    • Proof artifacts should cover how the individual parts of the proof relate to formalisms presented in the paper.

    In addition to the scenarios above that reproduce the paper results, we encourage you to include further instructions on how the AEC can run the artifact on different experiments, as well as documentation on the artifact code and layout. For ESOP artifacts only, these extra instructions can be submitted as part of an accompanying Experience Report.

  • a REQUIREMENTS file covering the architecture for which your artifact was packaged (e.g., x86, ARM) and its hardware/software requirements (e.g., storage, non-commodity peripherals, Docker or VM software, and the OS). We encourage you to also include machine-readable files describing dependencies (e.g., Dockerfile, Pipfile, dune-project), if relevant; a minimal example appears after this list.

  • a STATUS file stating the badge(s) you are applying for, as well as a short justification of why you think the artifact deserves the respective badge(s).

  • a LICENSE file describing the terms of use and distribution rights.

All files must be submitted as plain text (e.g., txt, md) or PDF within the artifact.
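
To make the smoke test concrete, here is a minimal sketch of a Getting Started section; the install.sh script, the make smoke-test target, and the output shown are hypothetical, and any equivalent check works just as well:

    Getting Started
    1. Install the tool:     ./install.sh
    2. Run the smoke test:   make smoke-test
       Expected output:      "3/3 checks passed"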
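
Similarly, a machine-readable dependency description accompanying the REQUIREMENTS file could be a Dockerfile along these lines; the base image and package list are hypothetical and should match what your artifact actually needs:

    FROM ubuntu:22.04
    # System dependencies the artifact relies on (pinned to one base image)
    RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential python3 \
        && rm -rf /var/lib/apt/lists/*
    # Copy the artifact sources and set the working directory
    COPY . /artifact
    WORKDIR /artifact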

Submission

Submit your artifact through the ETAPS-hosted HotCRP system.

Please host your artifact on a platform that does not track IP addresses, so as not to undermine reviewer anonymity.

Evaluation Phases

  • Kick-the-tires: Reviewers check the artifact’s packaging and integrity, and identify any possible setup problems that may prevent the artifact from being properly evaluated. Authors are given a 4-day period to address the identified issues by submitting an updated artifact link in their response text. The author response period will resemble paper rebuttals, i.e., there will be no interactive discussions. AEC members will phrase any issues encountered as concisely as possible, and authors are expected to address these issues in a single response.
  • Artifact assessment: Reviewers check whether the artifacts live up to the claims made in the accompanying documentation. Based on their reports and the ensuing AEC discussion, the artifact may be awarded one or two badges (see below).
  • Publication: Once the artifact has been evaluated, the authors are notified about the outcome, and the publication chairs are notified of any badges to be added to the paper.

Evaluation Criteria and Outcome

An artifact may be awarded one of the following badges, in accordance with the ETAPS artifact badging guidelines.

Functional: The artifact is found to be documented, consistent, complete, exercisable, and to include appropriate evidence of verification and validation.

Reusable: The artifact is of a quality that significantly exceeds minimal functionality. That is, it has all the qualities of the Functional level, but, in addition, it is very carefully documented and well-structured to the extent that reuse and repurposing is facilitated.

Irrespective of the artifact evaluation outcome, artifacts may be awarded the “Available” badge.

Available: Author-created artifacts relevant to this paper have been placed on a publicly accessible archival repository. A DOI for the object is provided. Repositories used to archive data should have a declared plan to enable permanent accessibility (e.g., Zenodo, FigShare, or Dryad). Note that in order for the Available badge to be awarded, the same DOI needs to be presented both in the artifact evaluation and in the camera-ready (CR) version of your paper. We recommend adding the DOI link in a dedicated data availability statement at the end of your paper.
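
As an illustrative sketch only (the wording is a suggestion and the DOI is a placeholder to be replaced with your own), such a statement could read:

    Data Availability Statement. The artifact supporting the results
    reported in this paper is archived on Zenodo at
    https://doi.org/10.5281/zenodo.XXXXXXX.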

Artifacts that go beyond expectations of quality will receive a Distinguished Artifact award. The selection procedure will be based on review scores and feedback from the AEC.

Frequently Asked Questions

  • Why do an artifact evaluation?
    We want to encourage authors to provide more substantial evidence for their papers and to reward authors who create research artifacts. At the same time, we want to simplify the reproduction of the results presented in the paper and ease future comparison with existing approaches. Hence, ESOP, FASE, and FoSSaCS offer an optional artifact evaluation for accepted papers. To ensure that submitted artifacts provide the best possible value to the community, our goal is to be constructive and to help improve them.

  • What qualifies as an artifact?
    Artifacts include (but are not limited to) software, tools, frameworks, datasets, test suites, machine-checkable proofs, or any combination of the above. We will assess the artifacts themselves, and not the quality of the research that produced them (which has been assessed by the conference PC).

Artifact Evaluation Committee

AEC Chairs

AEC Members

  • Alexandre Moine (NYU)
  • András Kovács (University of Gothenburg)
  • Andrea Colledan (University of Bologna)
  • Arnd Hartmanns (University of Twente)
  • Bernardo Almeida (LASIGE, University of Lisbon)
  • David Chocholatý (Brno University of Technology)
  • Gennaro Zanfardino (University of L’Aquila)
  • Giordano d’Aloisio (University of L’Aquila)
  • Gustavo Carvalho (Universidade Federal de Pernambuco)
  • Hongjian Jiang (RPTU)
  • Julia Sapiña (Universitat Politècnica de València)
  • Laura Bussi (ISTI – National Research Council)
  • Loïc Pujet (Stockholm University)
  • Loïc Germerie Guizouarn (Univ Rennes, CNRS, Inria, IRISA)
  • Lucas Sakizloglou (Brandenburg University of Technology)
  • Manolis Pitsikalis (NCSR Demokritos)
  • Michal Hečko (Brno University of Technology)
  • Michalis Kokologiannakis (ETH Zurich)
  • Noa Izsak (Ben-Gurion University of the Negev)
  • Ondrej Lengal (Brno University of Technology)
  • Pablo Gómez-Abajo (Universidad Autónoma de Madrid)
  • Raúl Gutiérrez (Universitat Politècnica de València)
  • Raúl López-Rueda (Universitat Politècnica de València)
  • Sougata Bose (University of Liverpool)
  • Soumodev Mal (Chennai Mathematical Institute)
  • Srinidhi Nagendra (IRIF, CNRS, Université Paris Cité, Chennai Mathematical Institute)
  • Stefan Winter (University of Ulm and LMU Munich)
  • Szumi Xie (Eötvös Loránd University)
  • Thomas Holger (FI CODE)
  • Vincent Cheval (University of Oxford)
  • Wei-Lun Tsai (Academia Sinica)
  • Zainab Fatmi (University of Oxford)
  • Zsófia Ádám (Budapest University of Technology and Economics)