ABSTRACT
The CHI conference has grown rapidly over the last 26 years. We present a quantitative analysis of the countries and organizations that contribute to its success. Only 7.8 percent of countries are responsible for 80 percent of the papers in the CHI proceedings, and the USA is clearly the country with the most papers. The success of a country or organization, however, depends not only on the number of accepted papers but also on their quality. We present a ranking of countries and organizations based on the h index, an indicator that balances the quantity and quality of scientific output, derived from a bibliometric analysis. The bibliometric analysis also allowed us to demonstrate the difficulty of judging quality: papers recognized by the best paper award committee were not cited more often than a random sample of papers from the same years. The merit of the award is therefore unclear, and it might be worthwhile to allow conference attendees to vote for the best paper.
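The h index mentioned above has a simple operational definition (Hirsch, 2005): a set of papers has index h if h of the papers have at least h citations each. A minimal sketch of that computation, applied here to a hypothetical list of per-paper citation counts for a country or organization:

```python
def h_index(citations):
    """Return the largest h such that h papers each have
    at least h citations (Hirsch's definition)."""
    counts = sorted(citations, reverse=True)
    h = 0
    # Walk down the ranked list; rank i qualifies while the
    # i-th most-cited paper still has >= i citations.
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Hypothetical citation counts for five papers:
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Four papers have at least 4 citations each, but not five papers with at least 5, so h = 4. The same procedure applies whether the unit of analysis is an author, an organization, or a country.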
Supplemental Material
Slides from the presentation (available for download)