Dagstuhl Perspectives Workshop on Artifact Evaluation for Publications

I’m pleased to have been invited to a Dagstuhl Perspectives Workshop in November on “Artifact Evaluation for Publications”, in recognition of my work (with colleagues) on computational reproducibility and software sustainability.

Schloss Dagstuhl, Leibniz-Zentrum für Informatik GmbH (Schloss Dagstuhl, Leibniz Center for Informatics) is the world’s premier venue for informatics; the center promotes fundamental and applied research, continuing and advanced academic education, and the transfer of knowledge between those involved in the research and application sides of informatics. The aim of its Seminar and Perspectives Workshop series is to bring together internationally renowned leading scientists to explore a cutting-edge informatics topic; in this case, how we can define a roadmap for artifact evaluation in computer systems research (with wider application across computational science and engineering). This involves defining an actionable research roadmap for increased accountability; rethinking how we evaluate research outputs (particularly software) and document research processes and associated e-infrastructure; and considering how best to change culture, behaviour and, perhaps more importantly, incentivisation structures for researchers, institutions and governments:

The computer systems research (CSR) community has developed numerous artifacts that encompass a rich and diverse collection of compilers, simulators, analyzers, benchmarks, data sets and other software and data. These artifacts are used to implement research innovations, evaluate trade-offs and analyze implications. Unfortunately, the evaluation methods used for computing systems innovation can be at odds with sound science and engineering practice. In particular, ever-increasing competitiveness and expediency to publish more results poses an impediment to accountability, which is key to the scientific and engineering process. Experimental results are not typically distributed with enough information for repeatability and/or reproducibility to enable comparisons and building on the innovation. Efforts in programming languages/compilers and software engineering, computer architecture, and high-performance computing are underway to address this challenge.


This Dagstuhl Perspectives Workshop brings together leaders of these efforts and senior stakeholders of CSR sub-communities to determine synergies and to identify the promising directions and mechanisms to move the broader community toward accountability. The workshop assesses current efforts, shares what does and doesn’t work, identifies additional processes, incentives and mechanisms, and determines how to coordinate and sustain the efforts. The workshop’s outcome is a roadmap of actionable strategies and steps to improving accountability, leveraging investment of multiple groups, educating the community on accountability, and sharing artifacts and experiments.

 
Organised by Bruce R. Childers (University of Pittsburgh, USA), Grigori Fursin (cTuning, France), Shriram Krishnamurthi (Brown University, USA) and Andreas Zeller (Universität des Saarlandes, Germany), Dagstuhl Perspectives Workshop 15452 takes place from 1-4 November 2015 (see the full list of invited attendees); I’m looking forward to reporting back in November.


One thought on “Dagstuhl Perspectives Workshop on Artifact Evaluation for Publications”

  1. Tom says:

    Our Dagstuhl Perspectives Workshop (15452) report now published: “Artifact Evaluation for Publications”: http://drops.dagstuhl.de/opus/volltexte/2016/5762/
