Abstract
Online reviews, as a major form of user-generated content, play a strategic role in eliminating information asymmetry and facilitating transactions in electronic markets. To improve market efficiency, many online retailers use helpfulness voting as a crowdsourcing strategy to identify and rank online reviews. In this study, we analysed a data set of 2187 reviews, as well as more than two months’ review ranking data, collected from six best-selling products on amazon.com. We found that a disproportionately higher percentage of votes went to early-posted lengthy reviews due to the Matthew effect. We also found that early reviews, once identified as most helpful, could maintain their top ranking status throughout the product life cycle because of the Ratchet effect. The implications of these findings are discussed, and strategies by which online retailers can mitigate the negative impacts of such effects are suggested.
Notes
O2O, or online to offline, is an e-commerce model popular in China since 2010. It essentially means attracting users online and directing them to a physical store to complete the purchase. It originated as Tuan Gou, or Group Buy, in China; the model later caught on in North America and spawned popular startups such as Groupon and LivingSocial.
The ranking also adjusts for the base number of votes. For example, a review with more than 15 votes and a helpful/total ratio above 0.5 may be ranked higher than a review with a slightly better ratio but fewer than 15 total votes.
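The adjustment described in this note can be illustrated with a minimal sketch. The 15-vote threshold is taken from the note above, but the sort-key design (and the function name `rank_key`) is a hypothetical reconstruction, not Amazon’s actual algorithm:

```python
# Hypothetical sketch of a vote-count-adjusted helpfulness ranking.
# The 15-vote threshold comes from the note above; everything else
# (tie-breaking order, key structure) is an assumption for illustration.

def rank_key(helpful_votes, total_votes, min_votes=15):
    """Sort key: reviews that clear the vote threshold with a
    majority-helpful ratio outrank low-volume reviews."""
    ratio = helpful_votes / total_votes if total_votes else 0.0
    established = total_votes >= min_votes and ratio > 0.5
    return (established, ratio, total_votes)

reviews = [
    ("A", 9, 10),   # 90 % helpful, but only 10 total votes
    ("B", 12, 20),  # 60 % helpful on 20 total votes
]
ranked = sorted(reviews, key=lambda r: rank_key(r[1], r[2]), reverse=True)
# Review B outranks A despite A's better ratio, because B clears the threshold.
```

Under this sketch, a well-established review with a merely good ratio can stay above a newer review with a better ratio but a smaller vote base, which is consistent with the Ratchet effect discussed in the paper.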
The early Amazon review system used a “newest first” ranking criterion. That is, when a new review was posted, it appeared at the very beginning of the review page and earlier reviews moved down accordingly. Thus, for a popular product, many early-posted helpful reviews were quickly pushed to later pages without being noticed by shoppers. To overcome this disadvantage, Amazon switched to the current “most helpful first” method. There have been a few insightful discussions about Amazon’s change of review management practice from chronological ordering to ranking by votes. An influential discussion, “The Magic Behind Amazon’s 2.7 Billion Dollar Question”, can be found at http://www.uie.com/articles/magicbehindamazon/
For example, if many previous visitors have voted a review as most helpful, a new shopper will read that review first and then most likely vote it as helpful too. Meanwhile, many subsequent reviews may be equally or even more helpful, offering different product perspectives, yet go unnoticed and unappreciated by the shopper.
We re-examined the ranking status in March 2014 and found that all of the reviews were still ranked as the most helpful reviews for their respective products. To check a review on amazon.com, use URL http://www.amazon.com/review/(add review id here).
As recently as March 24, 2014, this most helpful favourable review (ID: R1KEPVFNKQHP5W) was still ranked as most helpful on the amazon.com website.
Amazon.com founder Jeff Bezos mentioned in an HBR interview that his peers initially rejected his idea of allowing consumers to post product reviews on the Amazon website because negative reviews would dampen sales (Kirby and Stewart 2007). Though Bezos chose to side with consumers and encourage customers to post reviews, his team may still have had concerns about negative comments. Thus, Amazon may adopt the above strategy to mitigate the potential negative impact created by critical reviews.
References
Agrawal, D., Das, S., & Abbadi, A. E. (2011). Big data and cloud computing: current state and future opportunities. Paper presented at the Proceedings of the 14th International Conference on Extending Database Technology, Uppsala, Sweden.
Alt, R., & Zimmermann, H.-D. (2001). Preface: introduction to special section–business models. Electronic Markets, 11(1), 3–9.
Amblee, N., & Bui, T. (2011). Harnessing the influence of social proof in online shopping: the effect of electronic word of mouth on sales of digital microproducts. International Journal of Electronic Commerce, 16(2), 91–114.
Anderson, C. R., & Zeithaml, C. P. (1984). Stage of the product life cycle, business strategy, and business performance. Academy of Management Journal, 27(1), 5–24.
Back, E. (2010). Does amazon vine bias reviews. Retrieved from http://elliottback.com/wp/does-amazon-vine-bias-reviews/.
Bae, S., & Lee, T. (2011). Product type and consumers’ perception of online consumer reviews. Electronic Markets, 21(4), 255–266.
Baek, H., Ahn, J., & Choi, Y. (2012). Helpfulness of online consumer reviews: readers’ objectives and review cues. International Journal of Electronic Commerce, 17(2), 99–126.
Baye, M. R., Morgan, J., & Scholten, P. (2003). The value of information in an online consumer electronics market. Journal of Public Policy & Marketing, 22(1), 17–25.
Cao, Q., Duan, W., & Gan, Q. (2011). Exploring determinants of voting for the “helpfulness” of online user reviews: a text mining approach. Decision Support Systems, 50(2), 511–521.
Chevalier, J. A., & Mayzlin, D. (2006). The effect of word of mouth on sales: online book reviews. Journal of Marketing Research, 43(3), 345–354.
Cui, G., Lui, H.-K., & Guo, X. (2012). The effect of online consumer reviews on new product sales. International Journal of Electronic Commerce, 17(1), 39–58.
Darby, M. R., & Karni, E. (1973). Free competition and the optimal amount of fraud. Journal of Law and Economics, 16(1), 66–86.
Day, G. S. (1981). The product life cycle: analysis and applications issues. The Journal of Marketing, 45(4), 60–67.
Duan, W., Gu, B., & Whinston, A. B. (2008). Do online reviews matter? — an empirical investigation of panel data. Decision Support Systems, 45(4), 1007–1016.
Freixas, X., Guesnerie, R., & Tirole, J. (1985). Planning under incomplete information and the ratchet effect. The Review of Economic Studies, 52(2), 173–191.
French, S. (2007). Web-enabled strategic GDSS, e-democracy and arrow’s theorem: a Bayesian perspective. Decision Support Systems, 43(4), 1476–1484.
Garrett, J. J. (2005). Ajax: a new approach to web applications. Retrieved from http://www.adaptivepath.com/ideas/ajax-new-approach-web-applications/.
Hess, T., Lang, K. R., & Xu, S. X. (2011). Social embeddedness and online consumer behavior. Electronic Markets, 21(3), 157–159.
Hu, Y., & Li, X. (2011). Context-dependent product evaluations: an empirical analysis of internet book reviews. Journal of Interactive Marketing, 25(3), 123–133.
Hu, N., Liu, L., & Zhang, J. (2008). Do online reviews affect product sales? The role of reviewer characteristics and temporal effects. Information Technology and Management, 9(3), 201–214.
Hu, N., Zhang, J., & Pavlou, P. A. (2009). Overcoming the J-shaped distribution of product reviews. Communications of the ACM, 52(10), 144–147.
Kapoor, G., & Piramuthu, S. (2009). Sequential bias in online product reviews. Journal of Organizational Computing & Electronic Commerce, 19(2), 85–95.
Kim, S.-M., Pantel, P., Chklovski, T., & Pennacchiotti, M. (2006). Automatically assessing review helpfulness. Paper presented at the EMNLP 2006, Sydney, Australia.
Kirby, J., & Stewart, T. A. (2007). The institutional yes: the HBR interview with Jeff Bezos. Harvard Business Review, 85(10), 75–82.
Korfiatis, N., García-Bariocanal, E., & Sánchez-Alonso, S. (2012). Evaluating content quality and helpfulness of online product reviews: the interplay of review helpfulness vs. review content. Electronic Commerce Research and Applications, 11(3), 205–217.
Kornish, L. J. (2009). Are user reviews systematically manipulated? Evidence from the helpfulness ratings. Leeds School of Business Working Paper. Retrieved from http://leeds-faculty.colorado.edu/kornish/lkpapers/kornish-manipulation-of-reviews-dec15-09.pdf.
Liu, J., Cao, Y., Lin, C.-Y., Huang, Y., & Zhou, M. (2007). Low-quality product review detection in opinion summarization. Proceedings of the 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, 334–342.
McMahon, M. J. (2004). The Matthew effect and federal taxation. Boston College Law Review, 45, 993.
Merton, R. K. (1968). The Matthew effect in science. Science, 159(3810), 56–63.
Miller, N., & Campbell, D. T. (1959). Recency and primacy in persuasion as a function of the timing of speeches and measurements. Journal of Abnormal and Social Psychology, 59(1), 1.
Mudambi, S., & Schuff, D. (2010). What makes a helpful online review? A study of customer reviews on amazon.com. MIS Quarterly, 34(1), 185–200.
Nelson, P. (1970). Information and consumer behavior. Journal of Political Economy, 78(2), 311–329.
Nelson, P. (1974). Advertising as information. Journal of Political Economy, 82(4), 729–754.
Purnawirawan, N., Dens, N., & De Pelsmacker, P. (2012). Balance and sequence in online reviews: the wrap effect. International Journal of Electronic Commerce, 17(2), 71–98.
Rabin, M., & Schrag, J. L. (1999). First impressions matter: a model of confirmatory bias. The Quarterly Journal of Economics, 114(1), 37–82.
Sarwar, B., Karypis, G., Konstan, J., & Riedl, J. (2000). Analysis of recommendation algorithms for e-commerce. Paper presented at the Proceedings of the 2nd ACM Conference on Electronic Commerce.
Shapiro, C., & Varian, H. R. (1999). Information rules : a strategic guide to the network economy. Boston: Harvard Business School Press.
Shaywitz, B. A., Holford, T. R., Holahan, J. M., Fletcher, J. M., Stuebing, K. K., Francis, D. J., & Shaywitz, S. E. (1995). A Matthew effect for IQ but not for reading: results from a longitudinal study. Reading Research Quarterly, 30(4), 894–906.
Sikora, R. T., & Chauhan, K. (2012). Estimating sequential bias in online reviews: a Kalman filtering approach. Knowledge-Based Systems, 27, 314–321.
Tversky, A., & Kahneman, D. (1973). Availability: a heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232.
Wan, Y., & Nakayama, M. (2014). The reliability of online review helpfulness. Journal of Electronic Commerce Research, 15(3), 179–189.
Weitzman, M. L. (1980). The “Ratchet principle” and performance incentives. The Bell Journal of Economics, 11(1), 302–308.
Wirtz, B. W., Schilke, O., & Ullrich, S. (2010). Strategic development of business models: implications of the Web 2.0 for creating value on the internet. Long Range Planning, 43(2), 272–290.
Ye, Q., Law, R., Gu, B., & Chen, W. (2011). The influence of user-generated content on traveler behavior: an empirical investigation on the effects of e-word-of-mouth to hotel online bookings. Computers in Human Behavior, 27(2), 634–639.
Zhang, J. Q., Craciun, G., & Shin, D. (2010). When does electronic word-of-mouth matter? A study of consumer product reviews. Journal of Business Research, 63(12), 1336–1341.
Zhou, L., Zhang, P., & Zimmermann, H.-D. (2013). Social commerce research: an integrated view. Electronic Commerce Research and Applications, 12(2), 61–68.
Zhu, F., & Zhang, X. (2010). Impact of online consumer reviews on sales: the moderating role of product and consumer characteristics. Journal of Marketing, 74(2), 133–148.
Acknowledgments
This paper is a revised and expanded version of a conference paper entitled “The Matthew effect in online review helpfulness” presented at the 15th International Conference on Electronic Commerce in Turku, Finland, August 13–15, 2013, and published in the conference proceedings, “Co-created Effective, Agile, and Trusted eServices,” Lecture Notes in Business Information Processing, Volume 155, 2013, pp. 38–49. The author is grateful for the suggestions and comments from conference participants, two anonymous reviewers, one senior editor of EM, and associate editor Harry Bouwman that greatly improved the manuscript.
Additional information
Responsible Editor: Harry Bouwman
About this article
Cite this article
Wan, Y. The Matthew Effect in social commerce. Electron Markets 25, 313–324 (2015). https://doi.org/10.1007/s12525-015-0186-x