DOI: 10.1145/1148170.1148239
Article

On ranking the effectiveness of searches

Published: 06 August 2006

ABSTRACT

There is a growing interest in estimating the effectiveness of search. Two approaches are typically considered: examining the search queries and examining the retrieved document sets. In this paper, we take the latter approach. We use four measures to characterize the retrieved document sets and estimate the quality of search. These measures are (i) the clustering tendency as measured by the Cox-Lewis statistic, (ii) the sensitivity to document perturbation, (iii) the sensitivity to query perturbation and (iv) the local intrinsic dimensionality. We present experimental results for the task of ranking 200 queries according to their search effectiveness over the TREC (discs 4 and 5) dataset. Our ranking of queries is compared with the ranking based on the average precision using the Kendall τ statistic. The best individual estimator is the sensitivity to document perturbation and yields a Kendall τ of 0.521. When combined with the clustering tendency based on the Cox-Lewis statistic and the query perturbation measure, it results in a Kendall τ of 0.562, which to our knowledge is the highest correlation with average precision reported to date.
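
The evaluation protocol described above can be made concrete. Below is a minimal sketch in Python (not the authors' code) of how one might rank a set of queries by a predicted-effectiveness score, rank the same queries by their measured average precision, and report the Kendall τ correlation between the two rankings; the kendall_tau helper and all per-query numbers are illustrative placeholders, not values from the paper.

# Minimal sketch of the Kendall-tau evaluation described in the abstract.
# All per-query numbers below are hypothetical placeholders.

def kendall_tau(x, y):
    """Kendall tau-a: (concordant - discordant) pairs over all pairs."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical predictor scores (e.g. sensitivity to document perturbation)
# and the measured average precision for the same five queries.
predicted = [0.42, 0.10, 0.77, 0.31, 0.55]
average_precision = [0.38, 0.05, 0.61, 0.40, 0.49]

print(f"Kendall tau = {kendall_tau(predicted, average_precision):.3f}")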


Published in

SIGIR '06: Proceedings of the 29th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval
August 2006, 768 pages
ISBN: 1595933697
DOI: 10.1145/1148170

Copyright © 2006 ACM


Publisher: Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall acceptance rate: 792 of 3,983 submissions, 20%
