Artifact Evaluation
ECOOP 2025
Information for Reviewers
As a member of the artifact evaluation committee (AEC), your main goal is to decide which artifact badges should be awarded to the artifacts you review. We have created a reviewing template to guide you through each artifact review.
Following last year’s tradition, we will ask authors to provide as much automation as possible in their artifacts, through “push-button evaluations”. We hope this will streamline the artifact reviewing process.
The artifact evaluation will have a single round, covering papers accepted in round #1 and round #2 of ECOOP. The review is split into two phases:
- the kick-the-tires phase quickly checks that reviewers can run the artifact. Reviewers can ask the authors questions to fix any issues they encounter.
- the artifact assessment is the main reviewing phase.
Nomination forms
Do you want to participate in ECOOP’s artifact evaluation committee? Please fill in the ECOOP’25 Artifact Evaluation Self-Nominations form. You will have to review artifacts during the submission period.
The nominations are open and the co-chairs will process them in March.
Important dates are published on the right-hand side of this page. Please respect the AEC deadlines; if you foresee any difficulty in doing so, reach out to the AEC chairs as soon as possible.
Call for Artifacts
Research artifacts denote digital objects that were either created by the authors of a research article to be used as part of their study or generated by their experiments (adopted from ACM’s recommendations).
The about page provides more background on artifacts and their seal of quality. This page describes the call for artifacts. We also make available the submission template and the template form reviewers will use.
Background
Artifacts of technical research papers can include supplementary material such as tools, datasets, models, tutorial videos, or substantial items associated with the research or study presented in the paper.
Traditionally, technical research papers are published without including any artifacts, even though the artifacts may serve as crucial and detailed evidence for the quality of the results that the associated paper offers.
These artifacts enhance transparency and support the repeatability of experiments and precise comparison with alternative approaches, thus raising the quality of the research area as a whole. Additionally, artifacts improve the overall understanding of the research by allowing others to review and build upon the work, making it easier for other researchers to perform their own experiments and thus helping the original authors disseminate their ideas in detail.
Consequently, artifacts should be taken seriously and recognized separately. We encourage you to submit relevant artifacts along with the manuscript of technical research papers.
ECOOP has a long-standing tradition of artifact evaluation, dating back to 2013. In addition to providing feedback on the artifacts, we will make evaluation results available to the technical PC to be taken into consideration in the overall assessment during the paper peer-review process, such that artifact submissions can help improve the overall review score. In addition, some artifact evaluation members will also provide paper reviews. The AE process at ECOOP 2025 is a continuation of the AE process at previous ECOOP editions and several other conferences, including ESEC/FSE, OOPSLA, PLDI, ISSTA, HSCC, and SAS: see the authoritative Artifact Evaluation for Software Conferences website.
Badges
- Functional: The artifacts associated with the research are found to be documented, consistent, complete, exercisable, and include appropriate evidence of verification and validation. In particular, all experimental claims made in the paper should be reproducible through the artifact.
- Reusable: The artifacts associated with the paper are of a quality that significantly exceeds minimal functionality. That is, they have all the qualities of the Artifacts Evaluated – Functional level; in addition, they are carefully documented and well-structured to the extent that reuse and repurposing are facilitated. For the evaluation of an artifact’s reusability, we will rely on the reuse scenarios that the authors describe in their documentation. Providing an open-source implementation with its source code, and relying on public, open-source benchmarks, are good steps toward ensuring the reusability of an artifact.
Irrespective of the artifact evaluation outcome, artifacts may be awarded the “Available” badge.
- Available - Author-created artifacts relevant to this paper have been placed on a publicly accessible archival repository. A DOI for the object is provided. Only artifacts published on DARTS will receive the available badge.
A selected number of artifacts going beyond expectations of quality will receive a Distinguished Artifact award. The selection procedure will be based on review scores and feedback from the artifact evaluation committee.
Artifact Submission
Submission will open soon via HotCRP: https://ecoop25aec.hotcrp.com.
Artifact Preparation Guidelines
At a high level, we are interested in artifacts that:
- Have no dependencies. Use of Docker images is strongly recommended. Virtual machine images in OVF/OVA format containing the artifact can also be provided.
- Have a minimal number of setup steps. Ideally, setup should amount to importing the Docker/VM image.
- Include a short-run option that reviewers can try first before carrying out the full review (running in at most 10 minutes).
- Have a push-button evaluation. Ideally, the evaluation can be run through a single script that performs the computation and generates the relevant figures/experimental data presented in the paper. The evaluation should either display progress messages or come with an expected duration. This fully automated approach may be a bit more costly to set up, but it avoids copy/paste errors in your paper and heavily simplifies regenerating data.
- Include some documentation on the code and layout of the artifact.
- Clearly indicate supported/required resource constraints (hardware, OS, …).
- Use widely supported open formats for documents, preferably CSV or JSON for data.
- Document which outputs are associated with which parts of your paper; if possible, please specify table, figure or sub-sections.
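The guidelines above can be sketched as a single top-level entry script. Everything here is illustrative, not prescribed by this call: the file names, flags, durations, and the figure/table labels are hypothetical placeholders for your own artifact.

```shell
#!/bin/sh
# Hypothetical push-button entry point for an artifact.
# Usage: ./run.sh [kick-the-tires|full]
set -eu

MODE="${1:-full}"      # short mode for the kick-the-tires phase, full otherwise
OUT_DIR="results"
mkdir -p "$OUT_DIR"

echo "[1/3] Building benchmarks (expected: ~2 min)..."
# ./scripts/build.sh    # hypothetical build step

if [ "$MODE" = "kick-the-tires" ]; then
    echo "[2/3] Running reduced input set (expected: under 10 min)..."
    # ./scripts/run_experiments.sh --subset smoke
else
    echo "[2/3] Running full experiment suite (expected duration stated in README)..."
    # ./scripts/run_experiments.sh --all
fi

echo "[3/3] Regenerating paper outputs into $OUT_DIR/ ..."
# ./scripts/make_figures.sh "$OUT_DIR"
# Document the mapping from outputs to the paper, e.g.:
#   results/figure3.pdf  -> Figure 3
#   results/table2.csv   -> Table 2 (CSV, an open format)
echo "Done. See $OUT_DIR/ and README.md for the output-to-paper mapping."
```

A script in this shape gives reviewers both the short-run option and the push-button evaluation from a single command, with progress messages along the way.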
Single-blind, post-paper acceptance artifacts: This year, artifact evaluation is exceptionally single-blind and done only for accepted papers.
Note: you must avoid using platforms that track access or require individual download permissions. We recommend using Zenodo.
Authors are strongly discouraged from:
- Downloading content over the internet during experiments or tests;
- Using closed-source software libraries, frameworks, operating systems, and container formats; and
- Providing experiments or tests that run for multiple days. If the artifact takes several days to run, we ask that you provide us with the full artifact and a reduced input set (in addition to the full set) to only partially reproduce your results in a shorter time. If the artifact requires special hardware, please get in touch with the AEC chairs, let us know of the issue, and provide us with (preferably SSH) access to a self-hosted platform for accessing the artifact.
Artifact Packaging Guidelines
When packaging your artifact for submission, please consider the following. Your artifact should be as accessible as possible to the AEC members, and it should be easy for the AEC members to quickly make progress on the evaluation of your artifact. Please provide some simple scenarios describing concretely how the artifact is intended to be used; for a tool, this would include specific inputs to provide or actions to take, and expected output or behavior in response to this input. In addition to these very tightly controlled scenarios that you prepare for the AEC members to try out, it may be very useful if you suggest some variations along the way, such that the AEC members will be able to see that the artifact is robust enough to tolerate experiments.
To avoid problems with software dependencies and installation during artifact review, artifacts must be made available either as a built Docker image (https://www.docker.com/; not just a Dockerfile) or as a virtual machine image in OVF/OVA format containing the artifact already installed. The artifact must be provided as a self-contained archive file, using a widely supported archive format (e.g., *.zip, *.tgz), with instructions on how to start the container or the VM.
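A minimal packaging flow for the Docker route might look like the following sketch; the image name and tag are hypothetical, and `--network=none` is an optional extra that also demonstrates the artifact runs without internet access:

```shell
# Author side: build locally, then export the built image (not just the
# Dockerfile) into a self-contained archive for submission.
docker build -t myartifact:ecoop25 .
docker save myartifact:ecoop25 | gzip > myartifact-ecoop25.tar.gz

# Reviewer side: import the image from the archive alone and start it.
gunzip -c myartifact-ecoop25.tar.gz | docker load
docker run --rm -it --network=none myartifact:ecoop25
```

Shipping the saved image means reviewers never hit the network or rebuild dependencies, which is exactly what the self-contained-archive requirement is after.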
Artifact Submission Guidelines
Every submission must abide by the submission template.
NEW: the artifact description must be a PDF file following the DARTS template and the structure described in our submission template. This will help streamline the publishing process of accepted artifacts.
Please make sure to use hosting platforms for your artifacts that do not track IP addresses, as this could compromise the anonymity of the reviewers.
Artifact Review Process
Submitted artifacts will go through a two-phase evaluation.
- Kick-the-tires: Reviewers check the artifact’s integrity and look for any setup problems that may prevent it from being properly evaluated (e.g., corrupted or missing files, a VM that won’t start, immediate crashes on the simplest example). Authors are informed of the outcome and are given a 48-hour discussion period with the AEC members. AEC members will try to phrase any issues they encounter with the artifact as concisely as possible, and authors are expected to settle these issues in a single response, at first.
- Artifact assessment: Reviewers evaluate the artifacts, checking whether they live up to the claims the authors make in the accompanying documentation.
Related resources
The process for evaluation is based on ACM’s Artifact Review and Badging and NISO’s guidelines for reproducibility badging. However, neither ACM nor NISO is involved in the implementation or evaluation process on behalf of ECOOP.
We will soon provide more detailed instructions by publishing the call for artifacts, artifact submission template and the artifact reviewer template. In the meantime, we advise prospective researchers to read the ACM SIGPLAN’s Empirical Evaluation Guidelines.