
Understanding the benefits of providing peer feedback: how students respond to peers’ texts of varying quality

Published in Instructional Science

Abstract

Prior research on peer assessment often overlooks how much students learn from providing feedback to peers. By practicing revision skills, students might strengthen their ability to detect, diagnose, and solve writing problems. However, both reviewer ability and the quality of peers’ texts affect the amount of practice available to learners. The goal of the current study is therefore to take a first step toward a theoretical understanding of why students learn from peer assessment, and more specifically from providing feedback to peers. Students in a large Introduction to Psychological Science course were each assigned four peers’ papers to review. Each student’s reviewing ability was assessed, and whose texts each student reviewed was manipulated. The features and focus of the comments from a sample of 186 participants were coded, and the amount of each type was analyzed. Overall, reviewer ability and text quality did not affect the amount of feedback provided; instead, reviewer ability affected the content of the feedback. Low reviewers provided more praise than high reviewers, whereas high reviewers provided more criticism than low reviewers. This criticism from high reviewers described more problems and offered more solutions, and it focused more often on high prose and substance. In the only significant reviewer ability × text quality interaction, high reviewers described more problems in low-quality texts than in high-quality texts, whereas low reviewers did not make this distinction. These results suggest that high reviewers and low reviewers may use different commenting styles, which could substantially affect the benefits of peer assessment.


Notes

  1. The SAT (Scholastic Assessment Test) is a standardized test used for college admissions in the United States. It consists of three sections: the verbal section tests critical reading skills, the writing section tests problem-detection skills and knowledge of grammar and usage, and the mathematics section tests knowledge of arithmetic, algebra, geometry, statistics, and probability.

  2. Universities in the U.S. typically require a first year composition course, and the university in the present study requires two semesters of composition.

  3. The turnitin.com peer-review functions focused primarily on generating end comments rather than marginalia. Reviewers could tag specific locations in the text and reference them in an end comment to indicate where a particular problem existed; however, this function was not obvious, and most students did not use it. In addition, the commenting prompts were separate from the ratings prompts, which made it possible to create a reviewing assignment with more fine-grained evaluation dimensions and broader commenting dimensions. Finally, the reviews were anonymous—that is, pseudonyms were used to identify both the writer and the reviewer.


Corresponding author

Correspondence to Melissa M. Patchan.

Appendices

Appendix 1

See Table 2.

Table 2 Peer feedback coding scheme

Appendix 2

See Table 3.

Table 3 Example of segmentation and coding of one piece of feedback

Appendix 3

See Table 4.

Table 4 Descriptive & Inferential Statistics: Amount, Features, and Focus of Comments

Cite this article

Patchan, M.M., Schunn, C.D. Understanding the benefits of providing peer feedback: how students respond to peers’ texts of varying quality. Instr Sci 43, 591–614 (2015). https://doi.org/10.1007/s11251-015-9353-x
