CSCW 2012 Update (Guest post by Jonathan Grudin)
A summary of events since CSCW 2012 as we head toward CSCW 2013.
The publication culture of computer science received unparalleled attention in 2012. Experiments in conference reviewing and a new ACM policy signal that past approaches are no longer seen as working well. Some changes are more radical than the CSCW 2012 revision cycle, which has since been adopted by at least four other conferences, including SIGMOD.
How is CSCW 2012 looking? Downloads and citations are imperfect measures of impact, but they are the available lamppost. ACM’s CSCW 2012 downloads exceed 25,000, more than CSCW 2011 has accumulated over two years or CSCW 2010 over three years. This is a dramatic increase in conference impact. (In comparison, CHI 2012 has about half as many downloads as CHI 2011.)
Citations accumulate more heavily in the second and third years after a conference. That said, there are now five times as many ACM-recorded CSCW 2012 citations as CSCW 2011 citations a year ago (with a higher per-paper citation rate than CHI 2012, though the latter appeared a couple of months later). In a couple of years, frequency distributions will provide a more nuanced sense of the effects of doubling the CSCW acceptance rate to 40%.
A panel at the biennial Computing Research Association conference at Snowbird discussed conference-journal and open access issues. In November, 30 senior researchers attended an invited three-day Dagstuhl workshop on the same issues. ACM released a policy aimed at preventing certain kinds of conference-journal hybrids and encouraging others. CACM published many commentaries on these topics, including one this month by Gloria Mark, John Riedl, and me that describes CSCW 2012 and other experiments. I have an essay in the current Interactions that suggests an evolutionary biology analogy. Links to all of these are below.
Some core CS areas can experiment more easily than SIGCHI because their prestigious conferences are smaller and they have members who like building tools. At Snowbird and Dagstuhl there was appreciation for the CSCW 2012 experiment and negligible concern about acceptance rates. Some conferences keep acceptance rates low primarily to preserve a single-track format; their organizers realize that they reject high-quality work, at a price. My view is that low conference acceptance rates are a cancer with a higher mortality rate than most of us survivors realize, but not all of our community agrees.
I hope to see you in San Antonio. — Jonathan
Below are links to event materials and documents. The first two papers will be freely accessible at http://research.microsoft.com/~jgrudin/ when ACM Author-izer permits, in about a month.
Conference-journal hybrids. Grudin, J., Mark, G. & Riedl, J. January 2013 CACM. http://dx.doi.org/10.1145/2398356.2398371
Journal-conference interaction and the competitive exclusion principle. Grudin, J. Jan-Feb 2013 Interactions. http://dx.doi.org/10.1145/2405716.2405732
Snowbird panel (6/24/2012) on publication models in computing research. http://www.cra.org/events/snowbird-2012/
Dagstuhl workshop (11/2012) on the publication culture of computing research. http://www.dagstuhl.de/12452
List of CACM articles with links (2009-2013) on publication issues, and other resources. http://research.microsoft.com/~jgrudin/CACMviews.pdf
2012 ACM policy on conference and journal reviewing practices. http://www.acm.org/sigs/volunteer_resources/conference_manual/acm-policy-on-the-publication-of-conference-proceedings-in-acm-journals