DOI: 10.1145/1518701.1518810
research-article

Scientometric analysis of the CHI proceedings

Published: 4 April 2009

ABSTRACT

The CHI conference has grown rapidly over the last 26 years. We present a quantitative analysis of the countries and organizations that contribute to its success. Only 7.8 percent of the countries are responsible for 80 percent of the papers in the CHI proceedings, and the USA is clearly the country with the most papers. But the success of a country or organization depends not only on the number of accepted papers but also on their quality. We present a ranking of countries and organizations based on the h index, an indicator that tries to balance the quantity and quality of scientific output, derived from a bibliometric analysis. The bibliometric analysis also allowed us to demonstrate the difficulty of judging quality. The papers acknowledged by the best paper award committee were not cited more often than a random sample of papers from the same years. The merit of the award is therefore unclear, and it might be worthwhile to allow conference attendees to vote for the best paper.
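The abstract ranks countries and organizations by the h index. As an illustration (not taken from the paper itself), the h index of a set of papers is the largest number h such that at least h of the papers have each received at least h citations. A minimal sketch:

```python
def h_index(citations):
    """Return the h index: the largest h such that at least h papers
    have at least h citations each."""
    counts = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the first `rank` papers all have >= rank citations
        else:
            break
    return h

# Five papers with these citation counts: four of them have >= 4 citations.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```

The same computation applies whether the unit of analysis is an individual, an organization, or a country; only the set of papers pooled into `citations` changes.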

Supplemental Material

1518810.mp4 (video, 343.4 MB)


Published in

CHI '09: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2009, 2426 pages
ISBN: 9781605582467
DOI: 10.1145/1518701

      Copyright © 2009 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States




Acceptance Rates

CHI '09 paper acceptance rate: 277 of 1,130 submissions, 25%
Overall acceptance rate: 6,199 of 26,314 submissions, 24%
