DOI: 10.1145/2556288.2557081

What if we ask a different question?: social inferences create product ratings faster

Published: 26 April 2014

ABSTRACT

Consumer product reviews are the backbone of commerce online. Most commonly, sites ask users for their personal opinions on a product or service. I conjecture, however, that this traditional method of eliciting reviews often invites idiosyncratic viewpoints. In this paper, I present a statistical study examining the differences between traditionally elicited product ratings (i.e., "How do you rate this product?") and social inference ratings (i.e., "How do you think other people will rate this product?"). In 5 of 6 trials, I find that social inference ratings produce the same aggregate product rating as the one produced via traditionally elicited ratings. In all cases, however, social inferences yield less variance. This is significant because social inference ratings 1) converge on the true aggregate product rating faster, and 2) are a cheap design intervention for existing sites.
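The "converges faster" claim follows from the standard error of the mean: an estimate's precision scales as σ/√n, so a lower-variance elicitation needs fewer ratings to pin down the same aggregate score. A minimal sketch of that arithmetic, using hypothetical per-rating standard deviations (1.0 vs. 0.5 stars) that are illustrative only, not values from the paper:

```python
import math

def ratings_needed(sigma, se_target=0.05):
    """Number of ratings required for the standard error of the mean
    (sigma / sqrt(n)) to fall below se_target: n >= (sigma / se_target)^2."""
    return math.ceil((sigma / se_target) ** 2)

# Hypothetical spreads for the two question forms: the traditional
# "How do you rate this product?" vs. the lower-variance social-inference
# "How do you think other people will rate this product?"
print(ratings_needed(1.0))  # 400 ratings for the traditional question
print(ratings_needed(0.5))  # 100 ratings for the social-inference question
```

Because n grows with the square of σ, halving the per-rating standard deviation cuts the ratings required for a given precision by a factor of four.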


Published in
CHI '14: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, April 2014, 4206 pages
ISBN: 9781450324731
DOI: 10.1145/2556288

      Copyright © 2014 ACM


Publisher: Association for Computing Machinery, New York, NY, United States



      Qualifiers

      • research-article

Acceptance Rates
CHI '14 Paper Acceptance Rate: 465 of 2,043 submissions, 23%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%
