Publish Now, Judge Later

VIEWPOINT
By Douglas B. Terry, Microsoft Research Silicon Valley

Abstract

Conferences these days face a reviewing crisis, with too many submissions and not enough time for reviewers to carefully evaluate each one. Numerous good papers get rejected. One possible solution is for conferences to accept any paper that extends our body of knowledge and let the community judge the paper's significance.

The Problem

Conferences in the computing field have large numbers of submissions, overworked and overly critical reviewers, and low acceptance rates. Conferences boast about their low acceptance rates as if this were the main metric for evaluating a conference's quality. With strict limits placed on the number of accepted papers, conference program committees face a daunting task in selecting the top papers, and even the best committees reject papers from which the community could benefit. Rejected papers get resubmitted many times over to different conferences before they are eventually accepted or the authors give up in frustration. Good ideas go unpublished or have their publication delayed, to the detriment of the research community. Poor papers receive little attention and do not get the constructive feedback necessary to improve the paper or the work.

Because reviewers approach their job knowing that they must eventually reject four out of five submissions (or more), they often focus on finding reasons to reject a paper. Once they formulate such a reason, correctly or incorrectly, they give less thought to the rest of the paper. They do not adequately consider whether the flaws could be corrected through modest revisions or whether the good points outweigh the bad. Papers with the potential for long-term impact get rejected in favor of papers with easily evaluated, hard-to-refute results. Program committees spend considerable time trying to agree on the best 20% of the submitted papers rather than providing comments to improve the papers for the good of all. Even if committees could perfectly order submissions by quality, which they cannot, papers that are close in quality may receive different outcomes, since the line needs to be drawn somewhere. People do not always get the credit they deserve for inventing a new technique when their submission is rejected and some later work is published first.

A Proposal

My proposed solution is simple. Conferences should accept and publish all reasonable submissions. Some fields, such as physics, I'm told, hold large annual conferences where anyone can talk about almost anything. I am not suggesting that our conferences accept every submission. I believe that computing conferences should enforce some standards for publication quality, but our current standards are far too stringent.

We might argue about what constitutes a reasonable publication. Keeping in mind that the main purpose of publication is to teach others, here's what I suggest. A submission is reasonable, and hence publishable, if it contains something new (a novel idea, a new experimental result, a validation of previous results, a new way of explaining something, and so on), is based on sound methodology, explains the novelty clearly enough for others to learn from it, and puts the new results in proper context, that is, compares them fairly to previous work.

Rather than looking for reasons to reject a paper or spending time comparing papers, the role of conference reviewers would be (a) to assess whether each submission is reasonable according to these criteria and, perhaps more importantly, (b) to offer concrete suggestions for improvement. Any paper meeting these criteria should be accepted for publication, perhaps with shepherding to ensure that the reviewers' suggestions are properly followed.

Ultimately, papers will be judged in the fullness of time by accepted bibliometrics, such as citation counts, and, more importantly, by their impact on the field and on industry. The importance of a published paper is often not known for many years. "Ten years after" or "hall of fame" awards should be the way to honor the best papers, and these awards should be noted in the ACM Digital Library. Search engines, along with collaborative filtering and public recommendations, could direct researchers to high-quality, relevant work.

Practical Issues

What if a conference accepts more papers than can be presented during the length of the conference? In the steady state, this may not be a serious problem, since there are lots of conferences and not that many new papers. If papers stop being submitted to (and rejected by) half a dozen conferences before finally being accepted, we will end up with far fewer submissions overall. To deal with large numbers of papers, conferences may need to have parallel sessions, shorter presentations, or both. Personally, I am a fan of shorter presentations. An author should be able to present the key idea behind his work in 10 to 15 minutes and let people read the paper for more detail. Some papers could be presented as posters only, but I am not a fan of this approach. I would prefer to see all accepted papers treated equally. Let the community judge the papers.

How do authors decide where to submit their papers? Conferences will still have topics of focus. For example, we'll still have conferences on databases, algorithms, systems, networks, and so on. One additional criterion for acceptance is that the paper fits the topical scope of the conference. Some papers may fit into multiple conferences. For example, a paper on distributed storage systems could be both a database paper and a systems paper, i.e., suitable for presentation at SIGMOD or SOSP. In this case, since the criteria for accepting papers are the same for all conferences, it does not matter much to which conference the paper is submitted. In either case, assuming they are ACM conferences, the paper will end up in the Digital Library. Most likely, an author will submit his paper to the conference that attracts the community with which he most closely aligns, such as a conference sponsored by a Special Interest Group (SIG) to which he belongs. Low-quality conferences will likely go away, leaving one top conference in each technical area or for each technical community. To me, having fewer conferences would be a good thing.

What prevents people from submitting papers containing the least publishable unit? Authors can decide for themselves when they have a significant result that they want to share with the community. Getting ideas and results published quickly is a good thing. There's no reason that someone should wait until they have a full paper's worth of results before submitting their work. The length of the paper can be commensurate with its contributions. People who submit lots of short papers with very marginal contributions risk harming their reputations and will likely receive fewer "test of time" awards than those who submit more major results. That may be sufficient incentive to discourage overly incremental submissions.

How would this affect journals? I suspect that journal submissions would go up and more emphasis would be placed on journal publications. Journals would continue to have distinguished review boards that accept and reject papers based on quality. Thus, a journal publication would be viewed as more prestigious than a conference paper. Papers with early results that are presented at conferences may later become journal articles with more substantial results, refined ideas, or practical experiences. Results from multiple conference papers may be combined into more comprehensive journal papers. This could make the publication practices for computing research more similar to those of other scientific disciplines.

Alternative Proposals

I am certainly not the first to observe flaws in our current publication practices or to suggest changes [7,8]. Attendees at a recent Dagstuhl Perspectives Workshop on the Publication Culture in Computing Research spent days debating alternatives. That workshop prompted this position statement. Others have suggested modifications to our publication processes, such as open access [3] and post-publication peer review [5], and a number of these viewpoints have already appeared in CACM [4,6,9]. New services have been deployed for some communities, such as PubZone [2], which fosters public discussion of published papers in the database field. These practices and systems merit consideration, but are mostly orthogonal to what I propose.

Public web sites, like the Computing Research Repository (CoRR) [1], have been established to encourage the rapid dissemination of new ideas. Authors may choose to make their papers immediately available by depositing them in such a repository. This approach addresses some of the problems that I raise, but differs in three fundamental ways. First, the authors do not get the thrill or experience of presenting their work in front of a live conference audience. Second, the deposited papers generally are later submitted for publication in a more established conference or journal, so concerns remain about repeated submissions and their load on reviewers. Third, and most importantly, the papers are not peer-reviewed. My proposal retains pre-publication peer review. Thus, authors benefit from receiving constructive feedback that should be considered when revising their papers in advance of publication, and readers benefit from the knowledge that the work was vetted by a distinguished program committee.

How to Get There

Adopting new publication policies is not simple. I do not expect established conferences to change their practices overnight. Conferences have a vested interest in protecting their hard-earned reputations by maintaining low acceptance rates. University computer science departments have succeeded at getting promotion committees to value conference publications and are reluctant to make changes that might damage that position. Nevertheless, I believe that gradual steps are possible.

As an encouraging trend, I know of a couple of recent systems conferences that accepted more papers than usual while continuing as single-track conferences. Serving as a program committee member for one of those conferences (MobiSys 2012), I observed first-hand the difficulty of getting reviewers to alter their mindsets and accept even marginally more submissions.

One way to move forward is to establish new high-acceptance conferences in addition to the existing low-acceptance conferences. Adding more conferences is not a good long-term solution, but it could nudge the community in the right direction, provide experimental data, and spark discussion. For example, this year SIGOPS is running a new conference, the Conference on Timely Results in Operating Systems (TRIOS), in conjunction with its highly regarded Symposium on Operating Systems Principles (SOSP). This experimental conference will accept papers that were rejected from SOSP but still make a significant contribution. Lessons learned from this experiment will feed into a broader discussion of publication practices at the SIGOPS business meeting. We expect it to provide insights into whether the community values conferences with less-constrained acceptance rates and whether authors will choose to present their work at such a conference or wait for publication opportunities that might look better on their resumes.

Concluding Remarks

My main proposal is that conferences accept and publish any submission that contributes something new to our body of knowledge and that conveys its contribution in a clear and fair manner. The benefits of accepting any reasonable conference submission and abandoning low acceptance rates are clear:
- Research results get published in a timelier manner.
- Reviewers focus on providing constructive feedback.
- Program committees do not waste time reviewing the same submissions over and over again.
- Credit goes to those who first conceive of an idea and to groups that develop similar ideas in parallel.
- The community judges work by its long-term impact.

However, this does require a fundamental shift in how the research community, as well as tenure committees and other review boards, evaluates conference publications. I believe that some kind of shift is needed.

References

1. CoRR: Computing Research Repository. http://arxiv.org/corr/home.
2. PubZone Scientific Publication Discussion Forum. http://pubzone.org/.
3. Michel Beaudouin-Lafon. Open Access to Scientific Publications. Communications of the ACM, February 2012.
4. Bertrand Meyer, Christine Choppy, Jorgen Staunstrup, and Jan van Leeuwen. Research Evaluation for Computer Science. Communications of the ACM, April 2009.
5. Cameron Neylon. Reforming Peer Review: What Are the Practical Steps? March 8, 2011. http://cameronneylon.net/blog/reforming-peer-review-what-are-the-practical-steps/.
6. David Roman. Scholarly Publishing Model Needs an Update. Communications of the ACM, January 2011.
7. Jack Rosenberger. Should Computer Scientists Change How They Publish? BLOG@CACM, July 29, 2012.
8. Moshe Y. Vardi. Revisiting the Publication Culture in Computing Research. Communications of the ACM, March 2010.
9. Dan S. Wallach. Rebooting the CS Publication Process. Communications of the ACM, October 2011.