I was just pointed to the author guidelines for CVPR 2010. They state that reviewers will be asked to give an (indicative) rating of reproducibility, or repeatability, as it is called there:
Repeatability Criteria: The CVPR 2010 reviewer form will include the following additional criterion, with a rating and an associated comment field: “Are there sufficient algorithmic and experimental details and available datasets that a graduate student could replicate the experiments in the paper? Alternatively, will a reference implementation be provided?”

During paper registration, authors will be asked to answer the following two checkbox questions: “1. Are the datasets used in this paper already publicly available, or will they be made available for research use at the time of submission of the final camera-ready version of the paper (if accepted)? 2. Will a reference implementation adequate to replicate the results in the paper be made publicly available (if accepted)?”

If either of these boxes is checked, the authors should specify in the submitted paper the scope of such datasets and/or implementations so that the reviewers can judge the merit of that aspect of the submission’s contribution. The Program Chairs realize that for certain CVPR subfields providing such datasets, implementations, or detailed specifications is impractical, but in other areas it is reasonable and sometimes even standard, so on balance repeatability is a relevant criterion for reviewer consideration. “N.A.” will be an available reviewer score for this field, as it is for other fields.
Very exciting developments!