Introduction
Assessment and physical attendance have traditionally played a central role in measuring students’ progress and engagement with their studies. These days, following a progressive move to blended and online learning, many universities are starting to collect and store other kinds of real-time evidence on students’ day-to-day learning and engagement, including their interactions with the course website, usage of the library and learning materials, past grades, and the timeliness of assignment submissions (Broughan & Prinsloo,
2020). Learning Analytics (LA), which provides means for collecting and analysing such evidence, has the potential to identify at-risk students and provide insights into how teaching and learning may be improved (Jones,
2019).
LA data are increasingly aggregated and presented to the end user in the form of a Learning Analytics Dashboard (LAD) (e.g., Matcha et al.,
2019). Using real-time LA evidence, LADs provide a single focal point and a snapshot of students’ learning progress through one or more visualisations and learning recommendations (Sedrakyan et al.,
2020). LADs are viewed as a promising way of facilitating self-regulated learning (e.g., Rienties et al.,
2018; Schumacher & Ifenthaler,
2018), which refers to the students’ ability to set goals for their own learning, as well as monitor, control and adjust their behaviour to achieve them (Sedrakyan et al.,
2020). Additional emerging evidence suggests that having access to a LAD has a positive impact on students’ grades, motivation to study and retention behaviour (de Quincey et al.,
2019; Jivet et al.,
2021).
Despite the promising potential of LADs to support students’ learning, these tools have been criticised for being too focused on the ‘quantified self’ and for lacking students’ feedback (de Quincey et al.,
2019; Jivet et al.,
2021; Zawacki-Richter et al.,
2019). LADs are often designed with educators and university staff members as the target audience. Very few LADs are developed for use by the students whose data are aggregated in the tool, or informed by direct and comprehensive student evaluation. A systematic literature review by Bodily and Verbert (
2017) showed that out of 94 articles identified on student-facing LADs, only 6% included some form of student needs assessment.
Such a salient gap in research, namely the analysis of students’ reactions to LADs and of their impact on learning, reflects more generally some of the inherent tensions in the LA field. On the one hand, universities are committed to collecting, measuring, analysing and using student data to improve learning and to act as custodians of academic standards. On the other hand, students, and the student voice, are often not part of the conversation (e.g., de Quincey et al.,
2019; Zawacki-Richter et al.,
2019). As Broughan and Prinsloo (
2020) explained: ‘In the social imaginary of LA, students are habitually seen as the producers of data and as data-objects, but not as equals’ (p. 619). Given this acknowledgement that students can become ‘objectified’ by LA, and the increasing number of studies showing that technology used to support learning does not fit all learners (Rets et al.,
2020; Rienties et al.,
2020), there is a clear need to move towards a more student-centred approach to LA.
Exploring student voices in the context of distance learning is particularly important. Online distance education has seen huge increases in enrolments during the COVID-19 pandemic (Blackman,
2020), mainstreaming online teaching and learning across the different levels of education. Yet this is not without challenges. Online teachers often rely on cues to assess whether their students are engaging and progressing, while distance learners studying at their own pace and in their own time may or may not have the necessary skills to regulate their learning. LADs could be an important tool in the hands of teachers and students, helping them reflect on their learning, their learning progress, and any support needed. Distance universities, given their size and the fact that more of their learning interactions happen online, often have a greater capacity than traditional campus-based universities to collect student data in larger quantities and of higher quality. Despite this, and despite the current relevance of LADs, very few studies have analysed distance learners’ responses to, and perceived usefulness of, these tools.
Taking these two gaps into account (the lack of LA research with the direct involvement of students in general, and of distance learners in particular), this study details the perceptions of undergraduate students at a large distance university in the UK of the various elements of a LAD. It investigates students’ perceived usefulness of the tool and the critical factors that influence their responses.
Discussion
Despite the increased use of learning analytics dashboards (LADs) by higher education institutions, there is a lack of in-depth research on how students evaluate and respond to these tools. This gap in student-facing learning analytics (LA) research, together with the fact that the few studies on this topic have been conducted in traditional campus-based settings using mainly quantitative approaches, inspired this study. To that end, this qualitative study examined distance university students’ perceived usefulness of a LAD and investigated what critical factors influenced their responses.
As part of RQ1, we explored which element(s) of the LAD were perceived as most and least useful. Our study showed that the Study Recommender (element 5 in Fig.
3) was commended by all participants. Currently, most LADs used by universities are outcome-oriented (e.g., they help answer the question ‘How do I perform?’) (e.g., Sedrakyan et al.,
2020). Our study supports the suggestions put forward by previous studies that LADs should provide more process-oriented feedback (e.g., ‘How can I do better?’) (e.g., Herodotou et al.,
2020a; Schumacher & Ifenthaler,
2018; Sedrakyan et al.,
2020).
Our study further showed that the Similar Students graph (element 4) was perceived as the least useful element of the LAD. While many participants still welcomed the social comparison functionality of the LAD in the VLE Engagement graph (element 1), our study demonstrated that for social comparison to be informative for students, it needs to provide some information links between the individual student and the students they compare themselves with. Such information links may include other students’ exam scores, their engagement with the course, or insights into why they are behind or ahead in their studies. This finding corroborates social comparison theory, which highlights perceived closeness and relatedness to peers as important attributes of social comparison (Festinger,
1954).
In contrast to evidence from previous studies that recorded students’ concerns over the control of their data, which undermined their trust in the LAD (Bodily & Verbert,
2017; de Quincey et al.,
2019; Klein et al.,
2019), our study revealed a uniformly positive attitude across the sample towards the university ‘surveilling’ their learning and designing the LAD based on the data it collects. On the one hand, this finding might be due to the fact that the OU was among the first higher education institutions to adopt clear ethical principles on the processing of student data, and, thus, our participants might have been more aware of the ethical implications of LA. On the other hand, participants in this study emphasised the usefulness of the LAD particularly in the distance learning context, where they do not have a chance to meet their cohort and benchmark their study progress, or where they can lose sight of their progress due to studying on a part-time basis. This finding constitutes an important insight, as, to our knowledge, there have been very few studies on student-facing LADs in distance education.
At the same time, our findings for RQ2 on the factors that explain the perceived usefulness of the LAD showed that not all distance learners might benefit equally from the LAD. Their responses to the tool were influenced by the extent to which they trusted the information in the LAD, their attitudes towards peer comparison, and the extent to which they found the LAD informative and motivating for reflecting on their approaches to learning. Their overall enjoyment of working with the LAD, and whether they highlighted its advantages or suggested improvements, were also substantial factors of influence.
Linking the interview data with the demographic and academic achievement data available on participants revealed that, in its current design, the LAD seemed to appeal most to younger students (< 40 years old) with low self-efficacy, who were successful in terms of their course pass rate but had medium-range scores.
These participants’ low self-efficacy manifested itself in the difficulties they had with evaluating their learning progress, comprehending their scores and, more generally, understanding whether they were on track. The functionality of the LAD to pull information on their learning from many sources into one view and to benchmark their progress against that of their peers enhanced their academic self-confidence, and for some participants it reduced their anxiety about the scores for upcoming assignments.
This preliminary finding on the relationship between academic achievement, self-efficacy and the extent to which students find the LAD motivating supports the earlier study of Kim et al. (
2016), who found that high-achieving students reported lower satisfaction with the LAD than low academic achievers. Furthermore, participants in our study who commented that the LAD could prompt them to adjust their study patterns attributed this to seeing a drop in their engagement or predicted grades. This finding is in line with de Quincey et al. (
2019), where 80% of the students commented on a similar trend. In contrast to de Quincey et al. (
2019), our analysis showed that such a response to the LAD is not uniform among students; it is students with low self-efficacy who are more likely to be prompted by it.
Our study further revealed that more mature students (> 40 years old) with high self-efficacy, who passed the course with distinction, perceived the LAD as less useful. Besides commenting that the LAD was not very informative, these participants, and particularly the older students, displayed low levels of appreciation of and trust in the LAD. They tended to probe, and at times disagree with, the design of the LAD, and were sensitive to off-track score predictions. This finding is in line with previous studies showing that people are generally less forgiving towards an algorithm than towards humans when the former makes a mistake, even when its overall performance is fine, with older people being less lenient (e.g., Araujo et al.,
2020; Staddon,
2020). Since age has been shown to be negatively related to technology acceptance, this might explain the low levels of LAD appreciation among these students.
Our finding concerning students’ mixed attitudes towards the usefulness of the peer comparison information presented in the LAD supports previous studies, which interpreted this finding from the perspective of achievement goal theory (e.g., Beheshitha et al.,
2016). It was beyond the scope of this study to record participants’ goal orientations. However, the analysis of participants’ comments about peer comparison suggests that many of those who were sceptical about this element might have had a tendency towards mastery goals (‘learning as an end in itself’) rather than performance goals (Pintrich,
2000; Sedrakyan et al.,
2020). Although previous research has shown that performance goals are not always maladaptive (Senko,
2016), our study supports the implications from Jivet et al. (
2017) and Sedrakyan et al. (
2020) in that the design of LADs should accommodate students with different goal orientation types. It should show awareness of the learning outcomes students target and support students both in raising their social awareness and performing on a par with their peers (e.g., reaching the average score), and in achieving the desired skills and competences.
Finally, our study showed that the improvements participants said would increase the perceived usefulness of the LAD included greater personalisation; tracking of their progress against their own learning goals, across the curriculum, and in their learning with printed materials; more process-oriented feedback; and alerts if their performance falls below its usual level. Some of these suggested improvements support the findings of previous studies on the sought-after functionality of LADs (e.g., Schumacher & Ifenthaler,
2018; Sedrakyan et al.,
2020).
Limitations
This study detailed the results of the first attempt of the university under study, the Open University (OU), to expose its students to the LAD and enable them to see and comment on their own LA data. While this study provided emerging evidence on how different students perceive the usefulness of this tool and why, future research on students who have routine access to a LAD might build a more naturalistic account of their perceptions of the tool. Furthermore, due to the small number of participants in this study, we treat the findings on the elicited differences in the perceived usefulness of the LAD between ‘distinction’ and ‘pass’ students, and between the more mature and younger students, as emerging and exploratory. Future research in the form of a large-scale quantitative study should further explore the relation between demographic data and preferences related to the LAD.
Secondly, our study showed that it was mainly successful students who responded to the invitation to take part. This result corroborates the finding of Broos et al. (
2017), who revealed that most students who clicked through from the email research invitation to the LAD were high-achieving students. This evidence is informative, considering that many efforts in LA are focused on at-risk students and drop-out prevention (Broos et al.,
2017; Klein et al.,
2019; Matcha et al.,
2019). However, future studies should also explore how at-risk students interact with the LAD.
Finally, as this study was conducted at a large distance university, the generalisability of its outcomes to other (campus-based) universities may be limited. At the same time, considering the current relevance of this context in the aftermath of the global pandemic, and the fact that distance learning provides a unique setting in which students with very diverse profiles (e.g., in age) study together, the insights from this study have important implications for the design of LADs.
Practical implications
One practical implication stems from the evidence that there were no ‘surveillance’ concerns about LA in this study and that many participants wanted the LAD to be more personalised to them as learners. Thus, universities might consider collecting additional data about students in close consultation with them, such as information about their offline learning and their personal learning goals, as well as psychometric data, such as students’ levels of self-efficacy and anxiety, which could increase the perceived usefulness of this tool. In scenarios where such collection and storage of fine-grained personal data are not possible, it would be useful to incorporate elements of a content management system into the LAD. With the latter functionality, the end user, the student, would be able to edit and personalise the LAD by, for example, logging their offline study hours or setting up alerts and notifications about their study performance.
Secondly, the LAD could be used as part of teachers’ feedback on marked assignments to help students reflect on their learning progress. As our study showed, such reflection is important both for students who feel they are behind and for students who are so far ahead of the learning material that it becomes counter-productive for their learning. This implication concerning the use of the LAD as part of teachers’ feedback is also supported by previous research. For example, Herodotou et al. (
2020b) showed that students whose teachers used the LAD performed significantly better than their peers from the previous year, who were taught by the same teachers without the LAD.
The third implication is that the LAD should have the functionality to point students to additional information or clarification, written in an accessible, non-technical format, about how the different elements of the LAD are designed, why some predictions can be off-track, and why mouse clicks can be considered a good indication of students’ engagement and, thus, learning. Such clarifications have the potential to increase the trustworthiness, and thus the perceived usefulness, of the LAD.
Finally, while the Study Recommender was perceived as most useful in this study, our study also showed that for the LAD to be useful for the more mature, high-achieving students, it needs to provide more features centred on insights into effective learning, guiding students in their development as life-long learners.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.