These are links to related work on reproducible research: people and labs, journals, papers, tools, and blogs.
Other people and labs working on RR
- Reproducible electronic documents: Jon Claerbout and his colleagues at the Stanford Exploration Project initiated (to our knowledge) the discussions about reproducible research.
- Wavelab: David Donoho and his colleagues at the Stanford Statistics Department developed Matlab code to reproduce their results on wavelets.
- Reproducible Neurophysiological Data Analysis: a page by Christophe Pouzat on reproducible research in neurophysiology using R and Sweave.
- Sensorscope: the wireless environmental sensing network developed at EPFL. Detailed descriptions of the sensor platform are available for those interested in reproducing the setup, and documented datasets are available for those interested in reusing the data.
- Xin Li's source code collection for reproducible research, with links to code for various image processing algorithms.
- Al Hero's lab applies reproducible research for their publications.
- Andrew Davison at CNRS works on facilitating reproducible simulations using Python.
- MD Anderson Cancer Center: Bioinformatics hosts supplementary material for a number of their publications including code and data.
- StatReport, a description of reproducible statistical reporting at the Department of Biostatistics, Vanderbilt University.
- Jon Wellner's page with links about reproducible research.
- VisionBib.Com contains a very large bibliography of computer vision papers, as well as listings of vision-related code and datasets.
- Neil Lawrence's lab has reproducible documents and software on machine learning.
- Literate Programming: Don Knuth's original work on how to make code human-readable and a lot of related things.
- eScience Institute: the eScience Institute at the University of Washington.
- Bob: a signal-processing and machine learning toolbox originally developed by the Biometrics Group at Idiap, and used for reproducible research.
Journals with RR initiatives
- Annals of Internal Medicine: When a paper is accepted, the authors are asked explicitly whether their paper is reproducible. If yes, links are provided to the study protocol, data, and/or statistical code.
- Biometrical Journal: Authors are strongly encouraged to submit the computer code and data sets used to illustrate new methods. These are published as supporting information on the journal's webpage once the paper has been accepted for publication.
- Biostatistics: papers are labeled with an R if they are reproducible, C if code is available online and D if data is available. Data and code are published on the journal's website.
- IEEE Transactions on Signal Processing: in the acceptance e-mail from the editor-in-chief, the authors are encouraged to make their code and data available online.
- The Insight Journal: An online, open access journal in medical imaging that requires code as an integral part of the publication. They also allow for online post-publication reviews.
- IPOL: Image Processing On Line, a journal publishing relevant image processing and image analysis algorithms.
Articles about RR (chronologically)
- D. E. Knuth, Literate Programming, The Computer Journal, vol. 27, no. 2, pp. 97–111, May 1984.
- K. Price, Anything You Can Do, I Can Do Better (No You Can’t)..., Computer Vision, Graphics, and Image Processing, Vol. 36, pp. 387-391, 1986, doi:10.1016/0734-189X(86)90083-6.
- J. Claerbout, Electronic documents give reproducible research a new meaning, in Proc. 62nd Ann. Int. Meeting of the Soc. of Exploration Geophysics, 1992, pp. 601–604.
- J. B. Buckheit and D. L. Donoho, WaveLab and Reproducible Research, Dept. of Statistics, Stanford University, Tech. Rep. 474, 1995.
- R. Koenker, Reproducible Econometric Research, Department of Econometrics, University of Illinois, Urbana-Champaign, IL, Tech. Rep., 1996.
- M. Schwab, M. Karrenbach, and J. Claerbout, Making scientific computations reproducible, Computing in Science & Engineering, vol. 2, no. 6, pp. 61–67, Nov. 2000.
- H. D. Vinod, Care and feeding of reproducible econometrics, Journal of Econometrics, vol. 100, no. 1, pp. 87–88, Jan. 2001.
- Jan de Leeuw, Reproducible Research. The Bottom Line, Department of Statistics, UCLA, Department of Statistics Papers, Paper 2001031101, March 2001.
- F. Leisch, Sweave: Dynamic generation of statistical reports using literate data analysis, in Compstat 2002 — Proceedings in Computational Statistics, W. Härdle and B. Rönz, Eds. Physica Verlag, Heidelberg, 2002, pp. 575–580, ISBN 3-7908-1517-9.
- I. Vlad, Reproducibility in computer-intensive sciences, Ad Astra, vol. 1, no. 2, 2002.
- A. J. Rossini and F. Leisch, Literate statistical practice, UW Biostatistics Working Paper Series 194, University of Washington, WA, USA, 2003.
- J. Quirk, Computational Science: "Same Old Silence, Same Old Mistakes" - "Something More Is Needed", Adaptive Mesh Refinement - Theory and Applications, Proceedings of the Chicago Workshop on Adaptive Mesh Refinement Methods, pp. 3-28, Sept. 2003, doi:10.1007/3-540-27039-6_1.
- R. Gentleman and D. Temple Lang, Statistical Analyses and Reproducible Research, Bioconductor Project Working Papers, Working Paper 2, May 2004.
- M. Ruschhaupt, W. Huber, A. Poustka, and U. Mansmann, A Compendium to Ensure Computational Reproducibility in High-Dimensional Classification Tasks, Statistical Applications in Genetics and Molecular Biology, vol. 3, no. 1, 2004.
- S. Pakin, Reproducible Network Benchmarks with coNCePTuaL, Proc. Euro-Par '04, pp. 64–71, Lecture Notes in Computer Science Vol. 3149, Sept. 2004.
- M. Barni and F. Perez-Gonzalez, Pushing Science into Signal Processing, IEEE Signal Processing Magazine, vol. 22, no. 4, pp. 119–120, July 2005.
- R. D. Peng, F. Dominici, and S. L. Zeger, Reproducible Epidemiologic Research, American Journal of Epidemiology, 2006.
- H. A. Piwowar, R. S. Day, and D. B. Fridsma, Sharing detailed research data is associated with increased citation rate, PLoS ONE, vol. 2, no. 3, p. e308, March 2007.
- M. Barni, F. Perez-Gonzalez, P. Comesaña, and G. Bartoli, Putting reproducible signal processing into practice: A case study in watermarking, in Proc. IEEE International Conference on Acoustics, Speech and Signal Processing, vol. 4, April 2007, pp. 1261–1264.
- S. Fomel and G. Hennenfent, Reproducible computational experiments using SCons, in Proc. IEEE International Conference on Acoustics, Speech and Signal Processing, vol. 4, pp. 1257–1260, April 2007.
- J. Kovacevic, How to encourage and publish reproducible research, in Proc. IEEE International Conference on Acoustics, Speech and Signal Processing, vol. 4, April 2007, pp. 1273–1276.
- P. Marziliano, Reproducible research: A case study of sampling signals with finite rate of innovation, in Proc. IEEE International Conference on Acoustics, Speech and Signal Processing, vol. 4, April 2007, pp. 1265–1268.
- J. Vandewalle, J. Suykens, B. De Moor, and A. Lendasse, State of the art and evolutions in public data sets and competitions for system identification, time series prediction and pattern recognition, in Proc. IEEE International Conference on Acoustics, Speech and Signal Processing, vol. 4, April 2007, pp. 1269–1272.
- P. Vandewalle, G. Barrenetxea, I. Jovanovic, A. Ridolfi, and M. Vetterli, Experiences with reproducible research in various facets of signal processing research, in Proc. IEEE International Conference on Acoustics, Speech and Signal Processing, vol. 4, April 2007, pp. 1253–1256.
- D. Ramage and A. J. Oliner, RA: ResearchAssistant for the Computational Sciences, Proc. Workshop on Experimental Computer Science (ExpCS), June 2007, doi:10.1145/1281700.1281719.
- S. Fomel and J. F. Claerbout, Reproducible Research (Guest Editors' Introduction), Computing in Science and Engineering, vol. 11, no. 1, pp. 5-7, Jan./Feb. 2009, doi:10.1109/MCSE.2009.14.
- D. L. Donoho, A. Maleki, I. U. Rahman, M. Shahram and V. Stodden, Reproducible Research in Computational Harmonic Analysis, Computing in Science and Engineering, vol. 11, no. 1, pp. 8-18, Jan./Feb. 2009, doi:10.1109/MCSE.2009.15.
- R. J. LeVeque, Python Tools for Reproducible Research on Hyperbolic Problems, Computing in Science and Engineering, vol. 11, no. 1, pp. 19-27, Jan./Feb. 2009, doi:10.1109/MCSE.2009.13.
- R. D. Peng and S. P. Eckel, Distributed Reproducible Research Using Cached Computations, Computing in Science and Engineering, vol. 11, no. 1, pp. 28-34, Jan./Feb. 2009, doi:10.1109/MCSE.2009.6.
- V. Stodden, The Legal Framework for Reproducible Scientific Research: Licensing and Copyright, Computing in Science and Engineering, vol. 11, no. 1, pp. 35-40, Jan./Feb. 2009, doi:10.1109/MCSE.2009.19.
- P. Vandewalle, J. Kovacevic and M. Vetterli, Reproducible Research in Signal Processing - What, why, and how, IEEE Signal Processing Magazine, vol. 26, no. 3, May 2009, pp. 37-47.
- A. Swan, The Open Access citation advantage: Studies and results to date, Technical Report, School of Electronics & Computer Science, University of Southampton, 2010.
- J. P. Mesirov, Accessible Reproducible Research, Science, Vol. 327, no. 5964, pp. 415-416, Jan. 2010, doi:10.1126/science.1179653.
- P. Van Gorp and S. Mazanek, SHARE: a web portal for creating and sharing executable research papers, Proc. International Conference on Computational Science, pp. 589-597, 2011, doi:10.1016/j.procs.2011.04.062.
- M. Gavish and D. Donoho, A Universal Identifier for Computational Results, Proc. International Conference on Computational Science, pp. 637-647, 2011, doi:10.1016/j.procs.2011.04.067.
- B. R. Jasny, G. Chin, L. Chong, and S. Vignieri, Again, and Again, and Again ... Introduction to special section on Reproducible Research, Science, Vol. 334, p. 1225, 2011, doi:10.1126/science.334.6060.1225.
- R. D. Peng, Reproducible Research in Computational Science, Science, Vol. 334, pp. 1226-1227, 2011, doi:10.1126/science.1213847.
- M. Tomasello and J. Call, Methodological Challenges in the Study of Primate Cognition, Science, Vol. 334, pp. 1227-1228, 2011, doi:10.1126/science.1213443.
- B. D. Santer, T. M. L. Wigley and K. E. Taylor, The Reproducibility of Observational Estimates of Surface and Atmospheric Temperature Change, Science, Vol. 334, pp. 1232-1233, 2011, doi:10.1126/science.1216273.
- K. Diethelm, The Limits of Reproducibility in Numerical Simulation, IEEE Computing in Science and Engineering, 2012, doi:10.1109/MCSE.2011.21.
- E. Schulte, D. Davison, T. Dye, C. Dominik, A Multi-Language Computing Environment for Literate Programming and Reproducible Research, Journal of Statistical Software, Vol. 46, no. 3, 2012.
Talks and other 'informal' write-ups about reproducible research
- K. Coombes, Sweave: First steps toward reproducible analyses, presentation given at UT M. D. Anderson Cancer Center, Feb. 2007.
- G. Wilson, High-Performance Computing Considered Harmful, May 2008, an interview with Greg Wilson.
- Interview with Roger Barga on Trident, a workbench for scientific workflow for oceanography, Aug. 2008.
- Presentations mini-symposium Store-Share-and-Cite, TU Delft, The Netherlands, June 2011.
Tools for reproducible research
- AMRITA: a cross between a document preparation system, a computational engine, and a programming language.
- Article Authoring Add-in for Microsoft Word: a Word add-in that captures and stores additional metadata at the authoring stage and preserves semantic information through the publishing process, which is essential for search and semantic analysis once the articles are archived in information repositories.
- Cacher: this package provides tools for caching statistical analyses in key-value databases which can subsequently be distributed over the web.
- CDE: a Linux software packaging tool that enables users to easily reproduce computational experiments and deploy prototype software.
- Clawpack: a reproducible research tool in the development of numerical methods for hyperbolic partial differential equations (PDE) by R. J. LeVeque and others.
- coNCePTuaL: A Network Correctness and Performance Testing Language.
- CWEB: a system for literate programming: structured documentation of code to obtain human-readable programs, by D. Knuth and S. Levy.
- DataCite: helping you to find, access and reuse data.
- Knitr: a package designed to be a transparent engine for dynamic report generation with R.
- Madagascar: an open-source software package for multidimensional data analysis and reproducible computational experiments.
- Noweb: a system for literate programming: structured documentation of code to obtain human-readable programs, by N. Ramsey and others.
- ORCID: Open Researcher and Contributor ID, a community effort to establish an open, independent registry that is adopted and embraced as the industry's de facto standard.
- Org-mode: a tool for keeping notes, maintaining TODO lists, planning projects, and authoring documents with a fast and effective plain-text system. See also Babel for its ability to have executable source code in a document.
- RA (ResearchAssistant): a Java library by Daniel Ramage for creating reproducible experiments in Java.
- RRepository: a repository setup for making reproducible research publications available online, based on EPrints.
- RunMyCode: a web platform where you can create a companion site with a paper to allow others to run the corresponding code.
- SHARE: Sharing Hosted Autonomous Research Environments, a method to provide access to a tool that is otherwise cumbersome to install or configure.
- SQLShare: Database-as-a-Service for Researchers.
- Sumatra: a Python-based tool for managing and tracking projects based on numerical simulation or analysis.
- StatWeave: software whereby you can embed statistical code (e.g., SAS, R, Stata, etc.) into a LaTeX or OpenOffice document. A bit like Sweave, but for more languages, developed by Russell V. Lenth.
- TeXmacs: an editing platform with special features for scientists, giving a unified and user friendly framework for editing structured documents with different types of content (text, graphics, mathematics, interactive content, etc.).
- ThePub: an alternative setup for making reproducible research publications available online, using Java.
- Trident: A Scientific Workflow Workbench providing a set of tools based on the Windows Workflow Foundation that addresses scientists’ need for a flexible, powerful way to analyze large, diverse datasets. It includes graphical tools for creating, running, managing, and sharing workflows.
- VCR: Verifiable Computational Research, a tool to label and reproduce results.
- Version Control Systems: tools such as Git, Subversion, and Mercurial for tracking the history of code, data, and documents.
- VisTrails: an open-source scientific workflow and provenance management system that provides support for data exploration and visualization.
- Zentity: a research output repository platform that provides a suite of building blocks, tools, and services that help you create and maintain an organization’s digital library ecosystem.
- Inference: a tool for performing reproducible research from within Microsoft Office (Word, Excel) documents, with links to scripts in Matlab, R, etc.
Blogs about reproducible research
- Reproducible Research Ideas: this site's blog about reproducible research.
- RRPlanet Blog: the RRPlanet blog about reproducible research.
- The Endeavour: John D. Cook's blog about statistics, programming, and reproducible research.
- EPrints News: latest news from the developers of the EPrints repository software.
- Open Access Archivangelism: Stevan Harnad's blog about Open Access and related issues.
- The Third Bit: Greg Wilson's blog, also containing comments on reproducible research, life in academia, software engineering, etc.
- Victoria Stodden: blog about internet and democracy, open science, intellectual property, etc.