Peer evaluation in online anchored discussion for an increased local relevance of replies
Introduction
Computer-supported collaborative learning (CSCL) offers students possibilities for deep and active processing of their subject matter, especially when it complements face-to-face (F-2-F) interaction (Dietz-Uhler & Bishop-Clark, 2001; Järvelä & Häkkinen, 2002). As Lapadat (2002) states, asynchronous online discussion in particular facilitates reflection, conceptual change and the collaborative construction of meaning, making it especially suitable for the collaborative processing of academic literature. This study – which aims to facilitate students’ collaborative processing of literature – uses a specialised form of online discussion called “anchored discussion” (Bernheim Brush, Bargeron, Grudin, Borning, & Gupta, 2002), which integrates students’ online discussion with the subject matter being discussed. Van der Pol, Admiraal, and Simons (2006) demonstrated that anchored discussion is better suited to supporting the early stages of collaborative text processing than regular forum discussion, in that it affords more efficient and meaning-oriented collaboration by relating the discussion more closely to the subject matter.
However, anchored discussion may still constrain the coherence of students’ interaction and the learning potential of their peer-directed replies. In this article, we develop and investigate an enhanced tool for anchored discussion that aims to increase the quality of students’ replies.
Section snippets
Constructive learning conversations and the relevance of replies
While the level of interactional or “cross-turn” coherence of online learning conversations seems to be an important factor in determining the effectiveness of students’ online collaboration (Hoadley & Enyedy, 1999; Hsi, 1997), it is also often identified as a problematic element (Herring, 1999; Reyes & Tchounikine, 2003; Van der Meij et al., 2005). Students’ difficulties with preserving interactional coherence are understandable, because students – who do not yet fully master the …
Obstacles for achieving local relevance in online learning conversations
The importance and difficulty of maintaining the local relevance of replies in online learning conversations may be related not only to students’ limited understanding of the subject matter, but also to some of the medium’s basic communicative constraints. Not only do online discussions generally lack non-verbal information, but their delayed nature and the relatively high investment required to write messages (compared to synchronous or F-2-F communication) generally produce fewer messages, or ‘turns’, than …
Increasing local relevance by providing an evaluation function
Having identified the local relevance of replies as a crucial but problematic element in creating constructive online theoretical learning conversations, we may ask ourselves how to facilitate or stimulate the creation of relevant replies. A known approach to improving the quality of students’ online interaction is to develop tools that provide the participants with information about their collaboration, described by Soller, Martinez, Jermann, and Mühlenbrock (2005) as mirroring and awareness …
Method
We set up a comparative study with three types of anchored discussion: two versions with an additional evaluation function and one regular version without an evaluation function, serving as the control condition. Our hypothesis is that replies in anchored discussion with an evaluation function will be more locally relevant than in discussion groups without one. Because we do not yet know which specific design of the evaluation function can be expected to …
Results
After describing students’ participation in the online learning conversations in general, we quantitatively compare the local relevance scores in the experimental and control conditions. These scores result from coding the messages according to the developed coding scheme and function as the dependent variable in the analyses. Finally, we examine how students used the evaluation functions by looking at the argumentations they provided.
Conclusion and discussion
In response to the original research question, we can conclude that the presence of an evaluation function in a tool for anchored discussion can indeed increase the local relevance of students’ replies. However, this effect was found only for the “receiver evaluates” version of the two evaluation functions tested in this study. In the next paragraphs, we address to which aspect of (the use of) the two evaluation versions this difference in effectiveness can be attributed, and why …
References (32)
- et al. Improving quality and quantity of contributions: Two models for promoting knowledge exchange with shared databases. Computers and Education (2007)
- et al. The use of computer-mediated communication to enhance subsequent face-to-face discussions. Computers in Human Behavior (2001)
- et al. The assessment tool: A method to support asynchronous communication between computer experts and laypersons. Computers in Human Behavior (2006)
- The use of communication strategies in computer-mediated communication. System (2003)
- et al. An examination of interactional coherence in email use in elementary school. Computers in Human Behavior (2005)
- et al. Promoting effective helping behavior in peer-directed groups. International Journal of Educational Research (2003)
- et al. Where is education heading and how about AI. International Journal of Artificial Intelligence in Education (1999)
- Bellamy, R. (1997). Support for learning conversations. Abstract retrieved November 11, 2002, from...
- et al. Supporting interaction outside of class: Anchored discussion vs. discussion boards (2002)
- et al. Hierarchical linear models (1992)