ESOP, FASE and FoSSaCS Artifact Evaluation

Information on submission and evaluation of artifacts for the ESOP, FASE and FoSSaCS conferences.

Background

ESOP, FASE and FoSSaCS 2024 will hold a joint, voluntary artifact evaluation after paper acceptance. Authors are welcome to submit artifacts for evaluation after paper notification; the outcome will not alter the paper acceptance decision.

We want to encourage authors to provide more substantial evidence for their papers and to reward authors who create research artifacts. At the same time, we want to simplify the reproduction of the results presented in the paper and ease future comparison with existing approaches. This is why ESOP, FASE, and FoSSaCS will offer an optional artifact evaluation for accepted papers.

Artifacts of interest include (but are not limited to) software, tools, frameworks, datasets, test suites, machine-checkable proofs, or any combination of these. We will assess the artifacts themselves and not the quality of the research that produced them, which has already been assessed by the program committee of the conference. In particular, the result of the artifact evaluation will not alter the paper acceptance decision, which has already been made.

The goal of the review process is to be constructive and to improve the submitted artifacts. An artifact should be rejected only if it cannot be improved to achieve sufficient quality in the given time frame or if it is inconsistent with the paper. Papers whose artifacts are successfully evaluated will be awarded one or two artifact badges in accordance with the EAPLS artifact badging guidelines.

Important dates

  • Artifact submission deadline: January 4th, 2024
  • Author response period: January 10th - 14th, 2024
  • Notification to authors: February 8th, 2024

Preparation

We encourage you to read the HOWTO for AEC Submitters and these guidelines for submitting and reviewing proof artifacts. In a nutshell, committee members want artifacts that:

  • contain all dependencies (e.g., a Docker image or a virtual machine)
  • have “Getting Started” guides where all instructions are tested
  • include documentation on the code and layout of the artifact
  • have a short run reviewers can try first (several minutes max)
  • show progress messages (percentage complete) during longer runs
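A progress indicator for longer runs can be as simple as a periodic percentage print. A minimal sketch in Python (the experiment loop, `run_one`, and the item list are hypothetical placeholders for your artifact's actual workload):

```python
import sys


def run_one(item):
    """Placeholder for one experiment step in the artifact."""
    pass


def run_experiments(items):
    """Run every experiment, reporting percentage complete as we go."""
    total = len(items)
    for i, item in enumerate(items, start=1):
        run_one(item)
        # Report progress on stderr so long runs stay informative
        # without polluting result output on stdout.
        print(f"[{i}/{total}] {100 * i // total}% complete", file=sys.stderr)


run_experiments(list(range(4)))
```

Printing to stderr keeps progress messages separate from any results the artifact writes to stdout, which also makes logs easier for reviewers to inspect.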

Authors should avoid:

  • requiring a long or complicated setup process to run the artifact
  • downloading content over the internet during experiments or tests
  • closed-source software libraries, frameworks, operating systems, and container formats

If the artifact takes several days to run or requires special hardware, please contact the AEC chairs, explain the issue, and provide us with (preferably SSH) access to a self-hosted platform where the artifact can be accessed. You can also provide the full artifact together with a reduced input set (in addition to the full set) so that reviewers can partially reproduce your results in a shorter time.

Packaging

When packaging your artifact for submission, please make your artifact as accessible as possible to the AEC members. It should be easy for the AEC members to quickly make progress on the evaluation of your artifact. Please provide some simple scenarios describing concretely how the artifact is intended to be used; for a tool, this would include specific inputs to provide or actions to take, and expected output or behavior in response to this input.

In addition to these tightly controlled scenarios that you prepare for the AEC members to try out, it can be helpful to suggest some variations along the way, so that the AEC members can see that the artifact is robust enough to tolerate experimentation.

To avoid problems with software dependencies and installation during artifact review, we strongly encourage authors to provide their artifacts as a Docker image or as a virtual machine image in OVF/OVA format containing the artifact already installed.
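As a sketch, a Dockerfile along the following lines bundles an artifact with its dependencies. The base image, installed packages, and the run script are hypothetical placeholders; substitute whatever your artifact actually uses:

```dockerfile
# Hypothetical example; replace the base image, packages, and
# commands with the ones your artifact actually requires.
FROM ubuntu:22.04

# Install all dependencies inside the image so reviewers need
# nothing beyond Docker itself.
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential python3 \
    && rm -rf /var/lib/apt/lists/*

# Copy the artifact (sources, data, scripts) into the image.
COPY . /artifact
WORKDIR /artifact

# Default command: the short "kick-the-tires" run reviewers try first.
CMD ["python3", "run_experiments.py", "--small"]
```

Making the default command the short smoke-test run lets reviewers check the artifact with a single `docker run` before starting the full evaluation.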

Any archive file format should be widely supported (e.g., .zip, .tar.gz). Please use widely supported open formats for documents, and preferably CSV or JSON for data.

Every artifact submission must include

  • a copy of the accepted paper (a pre-camera-ready version is fine, but please indicate any intended changes relevant to the artifact evaluation)
  • a main README file describing what the artifact does, together with a clear description of how to repeat/replicate/reproduce the results presented in the paper.
    • Artifacts that focus on data should cover aspects relevant to understanding the context, data provenance, ethical and legal statements (where relevant), and storage requirements.
    • Artifacts that focus on software should cover aspects relevant to how to install and use it (and be accompanied by a small example).
    • Artifacts containing mechanized proofs should cover how the individual parts of the proof relate to formalisms presented in the paper.
  • a REQUIREMENTS file covering hardware requirements (e.g., performance, storage, or non-commodity peripherals) and software requirements (e.g., Docker, VM, and operating system). Authors are encouraged to also include machine-readable files describing dependencies, e.g., Dockerfile, Pipfile, dune-project, etc., when such files are relevant to their setup.
  • a STATUS file stating what kind of badge(s) the authors are applying for, as well as the reasons why the authors believe that the artifact deserves the badge(s)
  • a LICENSE file describing the terms of use and distribution rights
  • an INSTALL file with installation instructions. These instructions should include a very basic usage example or a method to test the installation, for instance, what output to expect to confirm that the code is installed and working, and that it is doing something interesting and useful.
  • All files must be submitted as plain text (e.g., txt, md) or PDF.

Submission

Please make your submission through the ETAPS-hosted HotCRP system: https://etaps.org/hotcrp/.

Every submission must include a URL for downloading the artifact. Please make sure to use hosting platforms for your artifacts that do not track IP addresses, as this would undermine the anonymity of artifact reviewers.

Process

The review process will proceed as follows:

  • Kick-the-tires: Reviewers will check the artifact’s packaging and integrity. In addition, they will look for any possible setup problems that may prevent it from being properly evaluated. Authors will be given a 4-day period to read and respond to reports. The author response period will resemble paper rebuttals, i.e., there will be no interactive discussions. AEC members will phrase any issues encountered as concisely as possible and authors are expected to address these issues in a single response.
  • Artifact assessment: Reviewers will evaluate the artifacts, checking if they live up to the claims made in the accompanying documentation. Based on these reports, and the ensuing discussion between AEC members, the artifact may be awarded one or two badges (see below).

Criteria

An artifact may be awarded one of the following badges.

  • Functional - The artifact is found to be documented, consistent, complete, exercisable, and to include appropriate evidence of verification and validation.
  • Reusable - The artifact is of a quality that significantly exceeds minimal functionality. That is, it has all the qualities of the Functional level, but, in addition, it is very carefully documented and well-structured to the extent that reuse and repurposing are facilitated.

Irrespective of the artifact evaluation outcome, artifacts may be awarded the “Available” badge.

  • Available - Author-created artifacts relevant to this paper have been placed on a publicly accessible archival repository. A DOI for the object is provided. Repositories used to archive data should have a declared plan to enable permanent accessibility (e.g., as in the retention policies for Zenodo or FigShare). The artifact’s DOI must be included in the camera-ready version of the paper.

Artifact Evaluation Committee

AEC Chairs

AEC Members

Conflicts of Interest

Conflicts of interest for AEC members are handled by the chairs. The AEC chairs themselves may submit papers to ESOP, FASE or FoSSaCS, but are barred from submitting artifacts.