Published in: Education and Information Technologies 3/2023

Open Access 08.09.2022

Supporting self-regulated learning with learning analytics interventions – a systematic literature review

Authors: Sami Heikkinen, Mohammed Saqr, Jonna Malmberg, Matti Tedre


Abstract

In recent years, scholars have shown an increasing interest in supporting students' self-regulated learning (SRL). Learning analytics (LA) can be applied in various ways to identify a learner's current state of self-regulation and support SRL processes. It is important to examine how LA has been used to identify the need for support in different phases of the SRL cycle, which channels are used to mediate the intervention, and how efficient and impactful the intervention is. This will help learners achieve the anticipated learning outcomes. This systematic literature review followed the PRISMA 2020 statement to examine studies that applied LA interventions to enhance SRL. The search terms used for this research identified 753 papers in May 2021. Of these, 56 studies included the elements of LA, SRL, and intervention. The reviewed studies contained various LA interventions aimed at supporting SRL, but only 46% of them revealed a positive impact of an intervention on learning. Furthermore, only four studies reported positive effects for SRL and covered all three SRL phases (planning, performance, and reflection). Based on the findings of this literature review, the key recommendation is for all phases of SRL to be considered when planning interventions to support learning. In addition, more comparative research on this topic is needed to identify the most effective interventions and to provide further evidence on the effectiveness of interventions supporting SRL.

1 Introduction

Self-regulated learning (SRL) is often described as a process that includes the setting of goals, the monitoring of progress, and the regulating of learning (Järvelä et al., 2019; Pintrich, 2000). Empirical studies have continuously demonstrated that students do not use efficient self-regulating strategies when learning in virtual environments (Akyildiz & Kaya, 2021; Azevedo et al., 2019; Broadbent & Poon, 2015; Pedrotti & Nistor, 2019). Therefore, learners may not achieve the expected learning outcome (Lee et al., 2019; Shyr & Chen, 2018; van Alten et al., 2020). This situation has motivated both teachers and researchers to design ways to support SRL.
Panadero (2017) reviewed six different SRL models and concluded that a common feature of the models is the division of SRL into three phases: planning, performance, and reflection. This three-phase model was initially introduced by Puustinen and Pulkkinen (2001). Although the SRL models reviewed by Panadero share commonalities, they vary depending on their focus. For example, Boekaerts and Corno’s (2005) model focuses on motivational aspects of regulated learning, while Efklides’s (2011) model focuses on metacognitive aspects of regulated learning, such as metacognitive experiences of students’ learning regulation. Winne and Hadwin (1998) built their model based on information processing theory and focused on the effective use of cognitive strategies to enhance learning. Hadwin et al.’s (2011) model not only considers individual regulation, but also the ways regulation of learning can be shared between group members. Finally, the models of Pintrich (2000) and Zimmerman (2000) can be described as general models of regulated learning since they consider motivation and metacognition to be central to SRL.
SRL phases comprise several subprocesses that differ by model but are still conceptually close. For instance, Boekaerts and Corno’s (2005) model includes identification, interpretation, primary and secondary appraisal, and goal setting in the planning phase of SRL. Efklides’s (2011) model includes task presentation in this phase. Hadwin et al. (2011) have included planning in this phase. Pintrich (2000) has placed forethought, planning, and activation in this phase. Winne and Hadwin (1998) have allocated task definition, goal setting, and planning to this phase. Finally, Zimmerman (2000) has located forethought (task analysis and self-motivation) in the planning phase of SRL. These subprocesses identified by the researchers are actions learners take at the early phase of the learning process, and they involve estimating how the learning task can be accomplished.
After planning, the learner continues to the performance phase and implements their planned actions. Depending on the model used, the subprocesses of this phase may include goal-striving (Boekaerts & Corno, 2005), cognitive processing and performance (Efklides, 2011), monitoring and control (Hadwin et al., 2011; Pintrich, 2000), applying tactics and strategies (Winne & Hadwin, 1998), and performance (with self-control and self-observation; Zimmerman, 2000). After the performance phase, it is time to reflect on the learning process. Within this reflection phase, subprocesses may include performance feedback (Boekaerts & Corno, 2005), regulating (Hadwin et al., 2011; Pintrich, 2000; Winne & Hadwin, 1998), reaction and reflection (Pintrich, 2000), adapting metacognition (Winne & Hadwin, 1998), and self-reflection (including self-judgement and self-reaction; Zimmerman, 2000). According to Efklides’s (2011) model, the reflection phase includes no explicit subprocesses.
Many studies have suggested that SRL could be enhanced by interventions (Ceron et al., 2021; Lee et al., 2019; Wong et al., 2019a). Interventions can assist at-risk learners in improving their learning performance (Espinoza & Genna, 2021) or can help reduce undesirable learning behaviour (Lodge et al., 2019). Furthermore, interventions can be used to increase students’ mastery goals (Zheng et al., 2020), self-efficacy (Samuel & Warner, 2021), and perceived value of a course (Alamri et al., 2020). Interventions have also been found to decrease test anxiety among students (Putwain & von der Embse, 2021). However, the numerous SRL subprocesses need different kinds of interventions performed at the right time of the learning process. These needs set the requirements for methods used to create effective interventions.
The rest of this paper is organised as follows. The second section explains how LA is applied to enhance SRL. The third section presents previous reviews focusing on the themes of this review. The fourth section explains the method used to conduct this review. The fifth section reports the results of our study, and the sixth section discusses them. Finally, the last section concludes this article.

2 Learning analytics as a tool to support SRL

Learning analytics (LA) is “the measurement, collection, analysis, and reporting of data about learners and their contexts for the purposes of understanding and optimising learning and the environments in which it occurs” (Siemens, 2013, p. 1382). In this paper, LA methods refer to the set of methods used to measure, collect, analyse, and report those data. The reported data is mediated to learners through different means, which are referred to as channels.
LA can be used to analyse and report learner performance and to highlight the parts of learning where SRL processes could be improved (Lodge et al., 2019; Winne, 2022). LA for SRL has two elements, calculation and recommendation (Winne, 2017). These two elements are built within the LA model's four stages: data generation, tracking, analysis, and action (Romero & Ventura, 2020). Data is generated when the learner does different activities in an online learning environment. The trace data from the activities can be used as an input for calculation, analysis, and tracking of the learner’s process. Next, the action stage is when the instructor executes interventions to support learners’ activities, such as SRL subprocesses. These LA interventions support changes a learner can take in their learning process to achieve better outcomes (Ifenthaler & Yau, 2020).
An LA intervention is a surrounding frame of activity used to mediate data and reports created with the help of analytics (Wise, 2014). An LA dashboard is a typical channel for LA interventions (Jivet et al., 2018), and according to Sønderlund et al. (2019), most interventions aim to identify at-risk students (Shafiq et al., 2022). Sønderlund et al. (2019) have stated that early LA methods relied on fixed factors and could generate predictions only within limited settings, whereas contemporary dynamic models can be used in more complex learning contexts. However, Rienties et al. (2017) have argued that implementing an intervention remains the challenging part of the LA cycle, and the design of an intervention must be well established to succeed in its endeavour to support learning (Sedrakyan et al., 2020; Wise, 2014).

3 Previous literature reviews

A number of literature reviews have already mapped the state of the art in research on SRL, LA, and interventions. Bodily and Verbert (2017) studied the LA dashboards and recommender systems used to intervene in students’ learning processes. However, their study did not include the perspective of SRL. Another study focusing on LA dashboards was conducted by Matcha et al. (2020), who studied LA dashboard interventions aimed at supporting SRL. Their study did not consider channels other than LA dashboards. Valle et al. (2021) studied learner-facing LA dashboards and found that the intended outcomes of LA dashboards are not matched by their evaluation in terms of effective measures.
The literature review by Pérez-Álvarez et al. (2018) explored tools supporting SRL in online environments. Their study found that most of the tools were used in higher education or general education. The most common means of supporting SRL the researchers observed was visualisation, and the most used indicators were either action-related or content-related. The most supported SRL strategies were goal setting, self-evaluation, help-seeking, and organisation. The studies examined by the researchers rarely explained the relationships between SRL strategies and tools. Their review found that studies measure the value of the tools in terms of usability (easiness of usage), usefulness (appropriateness of use), user satisfaction (acceptance by stakeholders), and learning outcomes.
Wong et al. (2019a) reviewed 35 records supporting SRL in online learning environments and MOOCs. Their review studied the effectiveness of approaches to support SRL strategies and whether human factors influenced the approaches. The study did not include LA. The most used intervention types to support SRL identified by the researchers were prompts (14 studies), integrated support systems (10 studies), and prompts with feedback (four studies). The rest of the reviewed studies used various, rarely applied approaches. Despite the investigation, how well the different approaches support SRL remained unclear.
Sønderlund et al. (2019) explored the efficacy of LA interventions in higher education but did not include the perspective of SRL within its scope. Specifically, they reviewed studies examining LA interventions’ efficacy for student retention and academic success. The results they found were promising, as the studies identified that LA interventions increased grades and led to higher retention. However, the researchers found only a few studies critically assessing the effectiveness of LA interventions for student success and retention.
A literature review by Viberg et al. (2020) focused on studies of SRL and LA in online learning environments. The researchers examined the current state of LA applications and how it measures and supports students’ SRL in online learning. Their results showed that most studies focused on the forethought and performance phase of SRL and focused less on the reflection phase. Viberg et al. (2020) concluded that LA was not used to support SRL in their sample of records. They argued that there was little evidence of LA improving learning outcomes or improving support for learning and teaching. Instead of supporting SRL, they found that LA research was focused on measuring SRL. Viberg et al. (2020) thus recommended LA support mechanisms be exploited to foster student SRL within online learning.
Araka et al. (2020) examined the research trends regarding measurement and intervention tools for supporting SRL in online learning environments between 2008 and 2018. The review findings indicated that current tools simultaneously measure and implement interventions to support SRL. Araka et al. (2020) identified LA as one way to measure the efficiency of interventions. However, the LA methods in use were not extensively explored, and this perspective remains unstudied.
The previous literature reviews have examined LA, SRL, and interventions, but none of them connected the three concepts. Researchers have focused on tools to support SRL (Pérez-Álvarez et al., 2018) and on learning analytics interventions in online education (Sønderlund et al., 2019). Still, the combination of these elements remains unstudied. This study connects these concepts for the first time. First, it is important to examine how LA has been used to identify the need for SRL support. Then, the channels to mediate the intervention have to be selected to target the phases of the SRL cycle. Finally, the efficiency and the impact of the intervention have to be evaluated to find out how the selection of LA methods and channels for LA interventions fosters SRL. This will help learners achieve the anticipated learning outcomes (Azevedo & Hadwin, 2005; Lee et al., 2019; Shyr & Chen, 2018; van Alten et al., 2020). The research questions for this research are the following:
  • RQ1: What learning analytics methods have been applied to identify the needs for SRL support?
  • RQ2: What channels have been applied in LA interventions to foster SRL?
  • RQ3: Which phases of the SRL cycle were targeted by LA interventions?
  • RQ4: How do studies evaluate the SRL support efficiency and impact, and what kind of results were achieved?

4 Method

This research followed the PRISMA 2020 statement. PRISMA was chosen because it is widely used and suitable for studies reviewing educational interventions. PRISMA 2020 is also suitable for systematic reviews that include qualitative and quantitative methods, thus allowing a meta-analysis of effect estimates and their variances when they are available (Page et al., 2021).
We selected a list of keywords (Table 1) to reduce the possibility of reviewer bias. While doing preliminary searches, we found that only a few studies used the exact term LA intervention or intervention to describe the supporting acts for SRL. Instead, these acts were described using various terms or keywords when presented as a study’s experiment. Therefore, we decided to perform a broad search and include all records about LA and SRL. After the records were collected, they were checked manually for evidence of an intervention, since it is hard to find a keyword or combination of keywords that would ensure that all intervention-related records were captured. To reduce author bias, we discussed the records in meetings until we reached unanimous decisions on which records to include in the review.
Table 1
Systematic literature review search terms

Topic and cluster          Search terms
self-regulated learning    “self-regulat*” OR “self regulat*”
                           AND
learning analytics         “educational data mining” OR “learning analytics”
The keywords used to find relevant research identified records focusing on SRL and learning analytics. There are two ways to write self-regulation (i.e., with or without a hyphen), and both ways were used to identify the records including this topic. “Self-regulat*” was used to include all papers that have the keyword “self-regulation”; “self-regulating”; “self-regulated”; “self-regulate”; “self-regulates”; and so forth. The term “self regulat*” was used to include variants of the previous terms written without a hyphen. For learning analytics, we decided to include “educational data mining” and “learning analytics” due to the close relationship between the two often overlapping concepts (see Romero & Ventura, 2020).
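As an illustration (not from the original paper), the two clusters from Table 1 can be combined into the kind of boolean query string accepted by bibliographic databases such as Scopus and Web of Science; the helper function below is hypothetical:

```python
# Illustrative sketch: building the boolean search string from the
# Table 1 clusters. The function name is an assumption, not from the paper.
srl_terms = ['"self-regulat*"', '"self regulat*"']  # hyphenated and unhyphenated variants
la_terms = ['"educational data mining"', '"learning analytics"']

def build_query(*clusters):
    """OR the terms within each cluster, then AND the clusters together."""
    return " AND ".join("(" + " OR ".join(cluster) + ")" for cluster in clusters)

print(build_query(srl_terms, la_terms))
# ("self-regulat*" OR "self regulat*") AND ("educational data mining" OR "learning analytics")
```

Grouping each cluster in parentheses keeps the OR-expansion of term variants from leaking across the AND between topics.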
The inclusion criteria for an eligible record were the following:
1. The manuscript reported a learning analytics analysis or an intervention hypothesised to support or enhance learners’ self-regulated learning.
2. The full text was available.
3. The record was written in English.
4. The record was a peer-reviewed empirical paper.
5. The record was published prior to the end of 2021.
We conducted comprehensive searches on Scopus (390 documents) and Web of Science (164 documents) in May 2022. Additionally, we used Google Scholar to search for records published by the Journal of Learning Analytics (142), the Journal of Educational Data Mining (35), and the International Conference on Educational Data Mining (22). These publication channels are not fully indexed in the Scopus or Web of Science databases. Therefore, we checked them manually: they have been identified as having the most specific focus on LA and EDM (Romero & Ventura, 2020), and we expected to find studies focusing on the topic of the review.
We carried out the selection strategy and the literature review process (Fig. 1) following the inclusion criteria. The records obtained from the search were reviewed in three phases. In every phase, the authors held meetings to reach unanimous decisions on whether to include or exclude the records. The first phase of examination focused on the title and the abstract of the studies, and the first author of the current review excluded records not meeting the inclusion criteria based on these elements. In the second phase, the first author read the full texts. If a record did not include the three elements of the research question, it was excluded. There were 47 records left (see Appendix Table 7) after phase two. For the last phase, the first author examined each record in detail and recorded the information from the studies. The first and second authors met repeatedly to agree on a coding scheme. They then coded the papers independently and met to resolve disagreements; after consensus was reached, the first author completed the coding. Coding was used to describe the details of the studies and to enable mapping of the LA methods, the interventions applied, the phases of SRL that were focused on, the evaluation methods used to assess the outcome of LA interventions, and the impact of a study’s intervention methods on the learning process or learning outcome. Table 2 shows the codification summary of the research questions.
Table 2
Codification by research question (1–4)

RQ1: What learning analytics methods have been applied to identify the needs for SRL support?
Coding: LA methods
Categories: Causal mining, clustering, discovery with models, distillation of data for human judgement, knowledge tracing, non-negative matrix factorisation, outlier detection, prediction, process mining, recommendation, relationship mining, statistics, social network analysis, text mining, visualisation

RQ2: What channels have been applied in LA interventions to foster SRL?
Coding: Channels used for intervention
Categories: Learning analytics dashboard (LAD), embedded, prompts, phone call, text message, e-mail

RQ3: Which phases of the SRL cycle were targeted by LA interventions?
Coding: SRL phase
Categories: Planning, performance, reflection

RQ4: How do studies evaluate the SRL support efficiency and impact, and what kind of results were achieved?
Coding: Form of impact
Categories: Improved self-regulation, better learning outcomes, engagement with course materials, improved course completion, improved student retention
We synthesised the results of previous studies to answer the research questions and reported the findings. Diverse LA methods have been used to measure, collect, analyse, and report the data about learners to understand and optimise learning (Siemens, 2013). The categorisation of each record’s methods in this literature review followed the categories presented by Romero and Ventura (2020). They have suggested dividing the methods into the following categories: causal mining (CM), clustering (C), discovery with models (DM), distillation of data for human judgement (DD), knowledge tracing (KT), non-negative matrix factorisation (NMF), outlier detection (OD), prediction (P), process mining (PM), recommendation (R), relationship mining (RM), statistics (S), social network analysis (SNA), text mining (TM), and visualisation (V). Almost all the methods of the records could be identified and categorised using the definitions of Romero and Ventura. Two studies (Wise et al., 2014, 2016) applied methods that did not fit into these categories: integrated analytics used to convey to learners the expected quantity, quality, and timing of discussions (Wise et al., 2014) and a conceptualisation that includes LA as part of the SRL process (Wise et al., 2016). Therefore, we allocated these records to the category of other methods (O).
We coded the SRL subprocesses into the three phases of planning, performance, and reflection, following the division presented by Panadero (2017). This was done by examining the full texts using “subprocesses” as a search term to identify which subprocesses had been investigated in the studies and then organising those subprocesses under the three phases of SRL.
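As a minimal sketch of this coding step, a lookup table can map subprocess labels (drawn from the SRL models discussed in the introduction) to Panadero's (2017) three phases; the dictionary and function names here are illustrative, not from the paper:

```python
# Hypothetical coding scheme: subprocess labels from the reviewed SRL models
# mapped to Panadero's (2017) three phases of SRL.
PHASE_OF = {
    "goal setting": "planning",
    "task definition": "planning",
    "forethought": "planning",
    "monitoring": "performance",
    "control": "performance",
    "applying tactics and strategies": "performance",
    "performance feedback": "reflection",
    "adapting metacognition": "reflection",
    "self-reflection": "reflection",
}

def code_subprocesses(subprocesses):
    """Return the set of SRL phases covered by a study's reported subprocesses."""
    return {PHASE_OF[s] for s in subprocesses if s in PHASE_OF}

# A study reporting these three subprocesses would cover the entire SRL cycle:
print(code_subprocesses(["goal setting", "monitoring", "self-reflection"]))
```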
The impact of LA interventions can be measured in several ways, including retention, course completion, and dropout rates. LA interventions that promote a better retention rate improve the learner’s continued study in a course (e.g., Freitas et al., 2015), whereas LA interventions that create greater completion rates reflect learners’ successful completion of individual courses and thesis processes (e.g., Nouri et al., 2019). A reduced dropout rate from an LA intervention results in fewer learners failing to finish coursework (e.g., Cambruzzi et al., 2015). Other positive effects can include better engagement with a course (Chen et al., 2020), usage of learning management system (LMS) resources (e.g., Jovanović et al., 2021), and eventually improved course grades (e.g., Wong, 2017). Finally, the outcome of an LA intervention can be seen in the form of enhanced SRL processes (e.g., Roll & Winne, 2015). For the analysis, we considered these aspects as measurable impacts of LA interventions.
The reviewed studies used different effect sizes to measure the impact of their LA interventions. Due to the insufficient number of papers with similar research settings, we were not able to execute a proper meta-analysis. Instead, we decided to use both qualitative and quantitative studies to illustrate what kinds of outcomes were achieved in the reviewed studies. To prevent bias due to differences in how the reviewed records analysed different tasks, procedures, methods, and effect sizes, we did not convert the effect sizes to a common scale (e.g., Cohen’s d; Cohen, 1992). Instead, we reported the effect sizes as presented in the original records. To enable comparison between different effect sizes, we used the interpretation guidelines given by Maher et al. (2013) to synthesise the results. The results of this approach are presented in Table 3. Other effect sizes (hazard ratio, β, F, and χ2) used in the studies were not comparable with this interpretation approach; these effect sizes are displayed in the results without interpretations.
Table 3
Effect sizes and their interpretation

Effect size    Small    Medium    Large    Very large
Cohen’s d      0.20     0.50      0.80     1.30
Correlation    0.10     0.30      0.50     0.70
η2             0.01     0.06      0.14     -

5 Results

There were 56 studies included in the final analysis. The first studies about designing LA interventions were conducted by Bouchet et al. (2013) and Inventado et al. (2013). In their work, the researchers formed clusters of students and characterised the students belonging to different clusters (Bouchet et al., 2013), and identified effective long-term learning behaviours (Inventado et al., 2013). Interest in the topic has increased since 2013, and nine studies from 2021 focusing on the topic were found (Fig. 2). These more recent studies focus on implementing and evaluating the tools designed for LA interventions. The reviewed studies primarily examined higher education settings (46 studies; 82%) with undergraduate or graduate students. Less frequently, the studies concerned doctoral students (two studies; 4%), workplaces (three studies; 5%), or public schools (one study; 2%). Jivet et al. (2021) included both higher education degree students and students in a professional development process. Four studies did not report the population that they researched. The sizes of the samples involved in the studies varied. Some studies had fewer than 10 participants (Cha & Park, 2019; Inventado et al., 2013; Wise et al., 2014, 2016), whereas the largest population comprised 33,726 learners (Davis et al., 2017). The mean population of the reviewed studies was 1,398 participants and the median was 102 participants. Most studies used single courses as their learning context (27 studies; 48%), but some studies conducted their LA intervention across nine (Hardebolle et al., 2020) or ten (Haynes, 2020; Kia et al., 2020) courses. This trend of aiming to generalise LA interventions is relatively recent.
There were different phases and perspectives in the development of the LA interventions. Some studies explained the early phases of their LA intervention design from a technological point of view (Frey et al., 2016; Manso-Vázquez & Llamas-Nistal, 2015; Siadaty et al., 2016b; Winne et al., 2019). Other studies examined the subsequent phases from the perspective of the user’s experience of the intended LA intervention’s interface (Cha & Park, 2019; Rohloff et al., 2019; Sonnenberg & Bannert, 2016), and the rest of the studies tested the effectiveness of the LA intervention in action.

5.1 RQ1: What learning analytics methods have been applied to identify the needs for SRL support?

While some of the studies (n = 14; 25%) focused on supporting SRL with one LA method, it was more common to see a study use two (n = 16; 29%) or three (n = 18; 32%) methods. Some scholars even used four (Jivet et al., 2020; Matcha et al., 2019; Russell et al., 2020) or five methods (Afzaal et al., 2021; Sonnenberg & Bannert, 2015, 2016, 2019; Yu et al., 2018) to achieve their goals. The methods used in the reviewed studies are summarised in Table 4. The first column shows the learning analytics methods. As multiple methods can be applied, the overall number of learning analytics methods surpassed the number of reviewed studies. The number of studies using a method is displayed in the second column.
Table 4
Summary of methods used in reviewed studies

Learning analytics methods                  Number of studies
Visualisation                               32 (57%)
Statistics                                  32 (57%)
Recommendation                              16 (29%)
Prediction                                  12 (21%)
Discovery with models                        8 (14%)
Knowledge tracing                            7 (13%)
Clustering                                   7 (13%)
Process mining                               6 (11%)
Relationship mining                          6 (11%)
Non-negative matrix factorisation            2 (4%)
Outlier detection                            2 (4%)
Other methods                                2 (4%)
Causal mining                                1 (2%)
Distillation of data for human judgement     1 (2%)
Social network analysis                      1 (2%)
Text mining                                  1 (2%)
A statistical approach was often used to provide content for visualisations (n = 19; 34%). Four studies used a combination of the three most used methods (Balavijendran & Burnie, 2018; Jivet et al., 2020; Siadaty et al., 2016a; Yu et al., 2018). In these studies, a learner’s trace data was analysed using statistical methods and compared to that of their peers. The analysis result was then visualised for the learner and accompanied by recommendations on how to improve their learning processes.

5.2 RQ2: What channels have been applied in LA interventions to foster SRL?

Table 5 presents the different channels used for LA interventions. The first column displays the different channels used for LA interventions and the second column shows the number of studies applying these channels.
Table 5
Summary of channels used in reviewed studies

Channels                        Number of studies
Learning analytics dashboard    27 (48%)
Embedded in LMS                 12 (21%)
Prompt                          11 (20%)
E-mail                           5 (9%)
Foldout                          2 (4%)
Phone call                       1 (2%)
Text message                     1 (2%)
An LAD displays an LA intervention in a visual form. Embedded systems may include elements directly connected to the learning materials or assignments. For example, Bouchet et al. (2013) used a virtual pedagogical assistant to give hints, and Frey et al. (2016) implemented a system that helped learners rehearse question-asking skills for question-based dialogue. Prompting can provide directions to a learner at different parts of the learning process and help them to improve their learning.
The LA interventions mentioned above relied on automation. However, there were also LA interventions with a high degree of individual human touch. These LA interventions include e-mails (Lim et al., 2021; Lim et al., 2020; Matcha et al., 2019; Menchaca et al., 2018; Nikolayeva et al., 2020), foldouts (Ott et al., 2015; Sedraz Silva et al., 2018), phone calls, and text messages (Herodotou et al., 2020). E-mail is an excellent tool for providing individual guidance and helping a student develop their learning process (Matcha et al., 2019). Foldouts (Ott et al., 2015; Sedraz Silva et al., 2018) also provide learners with information about their learning process.

5.3 RQ3: Which phases of the SRL cycle were targeted by LA interventions?

It is most common to foster the SRL subprocesses of the performance phase (e.g., giving hints on how to approach assignments). In total, 44 studies (79%) focused on the performance subprocesses. 25 studies (45%) targeted the subprocesses of the planning phase (e.g., setting goals, time management), and 17 studies (30%) targeted subprocesses of the reflection phase (e.g., evaluating learning outcomes and identifying how to improve performance). Five studies (9%) did not specify the subprocesses they focused on, and therefore those studies could not be categorised. Figure 3 presents the division of the studies into the different phases.
Notably, there were only nine studies (16%; Bouchet et al., 2013; Hardebolle et al., 2020; Jivet et al., 2021; Lallé et al., 2017; Lim et al., 2021; Nikolayeva et al., 2020; Siadaty et al., 2016b; Wise et al., 2014; Wong et al., 2019b) that focused on the entire SRL cycle. Manganello et al. (2021) applied all the phases of the 4C model, which is used in professional contexts. Connecting only two phases was more common. For example, studies typically connected the performance phase with either the planning phase (12 studies; 21%) or the reflection phase (four studies; 7%). Connecting the planning and reflection phases was only done by Lu et al. (2017).

5.4 RQ4: How do studies evaluate the SRL support efficiency and impact, and what kind of results were achieved?

The studies evaluated the efficiency of an LA intervention in various ways. Studies focused on their LA intervention design tended to use qualitative methods, such as interviews, the think-aloud method, and focus groups, to verify the usability, usefulness, and learners' perception of an LA intervention. Interviews (Cha & Park, 2019; Corrin & de Barba, 2014; Wise et al., 2014, 2016) were used to discover how learners perceived the functionalities and features of LA interventions and to find approaches that learners would be willing to use in their learning processes. Sonnenberg and Bannert (2016, 2019) used the think-aloud method to gain insight into these learner perceptions and to identify how useful a design was for a learner. In other cases, some researchers used focus groups (Balavijendran & Burnie, 2018; Lim et al., 2021; Lim et al., 2020), observations of facial expressions (Bouchet et al., 2013), eye-tracking (Bouchet et al., 2013; Lallé et al., 2017; Munshi & Biswas, 2019), and physiological sensors (Lallé et al., 2017) to evaluate the effects of an LA intervention during its early developmental phase.
Quantitative methods were used to evaluate the efficiency of an LA intervention during the execution phase. In total, 34 studies (61%) used trace data for this purpose, making it the most used method. Pre- and post-intervention surveys were also popular: 23 studies (41%) used surveys, either alone or together with trace data, to evaluate the efficiency of an LA intervention. The actual outcomes of participants' learning processes were evaluated in terms of grade distribution (Afzaal et al., 2021; Cody et al., 2020; Kia et al., 2020; Lim et al., 2021; Lu et al., 2017; Manganello et al., 2021; Ott et al., 2015; Sedrakyan & Snoeck, 2017; van Horne et al., 2018) and course completion (Davis et al., 2017; Herodotou et al., 2020).
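Pre/post survey comparisons of the kind described above are typically summarised with a standardised mean difference. As a minimal, hedged sketch (the questionnaire scores below are invented for illustration and are not taken from any reviewed study), Cohen's d can be computed from two sets of scores as follows:

```python
from statistics import mean, stdev

def cohens_d(pre, post):
    """Cohen's d for two samples, using the pooled standard deviation."""
    n1, n2 = len(pre), len(post)
    pooled_sd = (((n1 - 1) * stdev(pre) ** 2 + (n2 - 1) * stdev(post) ** 2)
                 / (n1 + n2 - 2)) ** 0.5
    return (mean(post) - mean(pre)) / pooled_sd

# Hypothetical SRL questionnaire scores before and after an LA intervention
pre_scores = [3.1, 2.8, 3.4, 3.0, 2.9, 3.2]
post_scores = [3.6, 3.3, 3.9, 3.4, 3.5, 3.7]
print(round(cohens_d(pre_scores, post_scores), 2))
```

Reporting such a standardised measure alongside raw survey means is what makes an intervention's effect comparable across studies.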
In total, 26 studies (46%) reported a positive, measurable impact on learning from their LA intervention. Table 6 displays the impacts of the LA interventions and the measurements of those impacts. The first column denotes the type of impact, the second column identifies the studies, and the third column shows the impacted variable. The fourth column presents the effect sizes measured in the studies, and the last column interprets the effect sizes following the principles introduced in Table 2.
Table 6
LA intervention impacts and the measurements of impact

| The type of impact | Study | Impacted variable | Effect size | Interpretation |
| --- | --- | --- | --- | --- |
| Improved self-regulation | Aguilar et al., 2021 | Improved SRL | NA | |
| | Balavijendran & Burnie, 2018 | Perceived improvement in SRL | NA | |
| | Chen et al., 2020 | Regulation of cognition | β = 0.393 | |
| | Corrin & de Barba, 2014 | Perceived better progress during the performance phase | NA | |
| | Domínguez et al., 2021 | Self-assessment | NA | |
| | Lim et al., 2020 | Goal setting and procrastination | NA | |
| | Lim et al., 2021 | Improved SRL | NA | |
| | Lim et al., 2021 | Improves SRL | ηp2 = 0.22 | Large |
| | Matcha et al., 2019 | High engagement strategy | NA | |
| | Siadaty et al., 2016c | Task analysis | r = 0.667 | Large |
| | | Goal setting | r = 0.778 | Very large |
| | | Personal planning | r = 0.740 | Very large |
| | | Working on task | r = 0.781 | Very large |
| | | Applying strategy changes | r = 0.745 | Very large |
| | | Evaluation | r = 0.685 | Large |
| | | Reflection | r = 0.682 | Large |
| | Sonnenberg & Bannert, 2015 | Metacognitive learning activities | d = 0.44 | Small |
| | Sonnenberg & Bannert, 2016 | Metacognitive learning activities | d = 2.00 | Very large |
| | Sonnenberg & Bannert, 2019 | Task analysis | ηp2 = 0.082 | Medium |
| | | Evaluation | ηp2 = 0.063 | Medium |
| Better learning outcomes | Afzaal et al., 2021 | Improvement in grades | r = 0.86 | Very large |
| | Cody et al., 2020 | Faster performance | F = 5.24 | |
| | Davis et al., 2016 | Improvement in grades | NA | |
| | Lim et al., 2021 | Improvement in grades | d = 0.47 | Small |
| | Lu et al., 2017 | Improvement in grades | F = 19.71 | |
| | Manganello et al., 2021 | Improvement in grades | d = 1.10 | Large |
| | Menchaca et al., 2018 | Defining targets | d = 0.90 | Large |
| | | Task definition | d = 0.78 | Medium |
| | | Communication and collaboration | d = 0.35 | Small |
| | Schumacher & Ifenthaler, 2021 | Declarative knowledge | NA | |
| | van Horne et al., 2018 | Improvement in grades | NA | |
| Better engagement with course materials | Davis et al., 2016 | Timely submissions and number of submissions | NA | |
| | Nikolayeva et al., 2020 | More completed quizzes | d = 0.31 | Small |
| | Siadaty et al., 2016a | Perceived learning paths useful | NA | |
| | Wong et al., 2019b | Increased interaction with activities | NA | |
| Improved course completion | Davis et al., 2017 | Improved completion rate | χ2 = 5.87 | |
| | Russell et al., 2020 | Lower dropout rate | HR = 0.74 | |
| Improved student retention | Herodotou et al., 2020 | Improved course completion | NA | |
| Not promising | Bouchet et al., 2013 | Effect on SRL | NA | |
| | Haynes, 2020 | Perceived usefulness of the LA intervention | NA | |
| | Kia et al., 2020 | The connection between SRL and academic achievement | χ2 = 2.43 | |
| | Li et al., 2017a | Improvement on dropout rates | NA | |
| | Li et al., 2017b | Learning effectiveness | NA | |
| | Ott et al., 2015 | Attendance; submission rates; exam performance | NA | |
| | Tabuenca et al., 2021 | Time management | NA | |
In 13 studies (23%), the impact of an LA intervention was seen in the form of improved self-regulation. Nine studies (16%) achieved better learning outcomes. LA interventions resulted in better engagement with course materials in four studies (7%). Davis et al. (2017) and Russell et al. (2020) found improved course completion (4%), and Herodotou et al. (2020; 2%) reported an improvement in student retention in response to their respective LA interventions. Seven studies (13%; Bouchet et al., 2013; Haynes, 2020; Kia et al., 2020; Li et al., 2017a, b; Ott et al., 2015; Tabuenca et al., 2021) claimed the outcomes they observed were not promising or stated there were significant limitations for practical implications. Four studies (7%; Frey et al., 2016; Manso-Vázquez & Llamas-Nistal, 2015; Munshi & Biswas, 2019; Siadaty et al., 2016b; Winne et al., 2019) were not able to evaluate the impact of their LA intervention (e.g., due to the early stage of development of the LA intervention).
The reporting of effect sizes varied among studies. In total, 16 studies (29%) reported effect sizes. Very large effect sizes were found by Afzaal et al. (2021) for grades (r = 0.86), by Sonnenberg and Bannert (2016) for the number of metacognitive learning activities (d = 2.00), and by Siadaty et al. (2016b) for the subprocesses of goal setting (r = 0.778), personal planning (r = 0.740), working on task (r = 0.781), and applying strategy changes (r = 0.745). Large effect sizes were found by Manganello et al. (2021) for improvement in grades (d = 1.10), by Lim et al. (2021) for SRL improvement (ηp2 = 0.22), by Siadaty et al. (2016b) for task analysis (r = 0.667), evaluation (r = 0.685), and reflection (r = 0.682), and by Menchaca et al. (2018) for defining targets (d = 0.90). The LA intervention of Sonnenberg and Bannert (2019) had a medium effect on evaluation (ηp2 = 0.063) and task analysis (ηp2 = 0.082). Small effect sizes were reported for grades (d = 0.47; L. A. Lim et al., 2021), prompts (d = 0.44; Sonnenberg & Bannert, 2015), communication and collaboration (d = 0.35; Menchaca et al., 2018), and the number of quizzes completed (d = 0.31; Nikolayeva et al., 2020).
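The magnitude labels used above can be reproduced programmatically. The sketch below maps reported effect sizes to labels using widely cited conventions for d, r, and partial eta squared; the exact thresholds are an assumption here, since the paper's Table 2 is not reproduced in this excerpt, so the bands may differ slightly from those the authors used:

```python
def interpret_effect(metric, value):
    """Map a reported effect size to a magnitude label.

    Thresholds follow widely used conventions (assumed, not taken from
    the paper's Table 2): Cohen-style bands for d, and common benchmarks
    for r and partial eta squared ("eta2").
    """
    bands = {
        "d":    [(0.2, "small"), (0.5, "medium"), (0.8, "large"), (1.3, "very large")],
        "r":    [(0.1, "small"), (0.3, "medium"), (0.5, "large"), (0.7, "very large")],
        "eta2": [(0.01, "small"), (0.06, "medium"), (0.14, "large")],
    }
    label = "negligible"
    for threshold, name in bands[metric]:
        if abs(value) >= threshold:
            label = name  # keep upgrading while the value clears each band
    return label

# A few effect sizes reported in Table 6
print(interpret_effect("d", 2.00))     # Sonnenberg & Bannert (2016)
print(interpret_effect("r", 0.667))    # Siadaty et al. (2016c), task analysis
print(interpret_effect("eta2", 0.082)) # Sonnenberg & Bannert (2019), task analysis
```

With these assumed bands, the three examples reproduce the interpretations listed in Table 6 (very large, large, and medium, respectively).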

6 Discussion

The goal of this paper was to systematically review how LA has been applied to enhance SRL. Many of the reviewed studies achieved promising results. Many others, however, were not able to show an impact on learning. Despite the promising results, it is still unclear whether there are features that could predict the success of an LA intervention. Some studies report how learners perceive the impact of an intervention, whereas others report changes in the students' learning process or learning outcomes. Previous studies have shown that even when students' self-evaluations suggest an improvement in SRL, the impact seen in trace data does not necessarily show a similar improvement, due to, for example, response styles or confidence measures (Tempelaar et al., 2020).
Almost two-thirds of the reviewed studies applied two or three LA methods. Nevertheless, an increase in the number of applied LA methods does not seem to improve the effectiveness of an LA intervention. For instance, five LA methods were applied in several studies (Afzaal et al., 2021; Sonnenberg & Bannert, 2015, 2016, 2019; Yu et al., 2018), and the effectiveness of their LA interventions ranged from small to very large. If the number of LA methods is not the key to a successful LA intervention, then perhaps the combination of methods could be. Four studies used a combination of the three common LA methods: statistics, visualisation, and recommendation (Balavijendran & Burnie, 2018; Jivet et al., 2020; Siadaty et al., 2016a; Yu et al., 2018). A positive impact was reported only by Balavijendran and Burnie (2018), whose students reported that LA interventions improved their learning. In contrast, the studies by Jivet et al. (2020), Siadaty et al. (2016a), and Yu et al. (2018) were not able to find an impact on learning, even though they included the four stages of the LA cycle (Khalil & Ebner, 2015; Siemens, 2013) and used a combination of methods.
LAD was the most used LA intervention among the reviewed studies, though only 43% of the LAD interventions positively impacted learning. This success rate suggests that implementing a LAD is not in itself sufficient to support SRL; instead, how a LAD should be used seems to be an unknown element affecting SRL. This can be due to the absence of support for dashboard design from the learning sciences with regard to core concepts of learning processes and feedback mechanisms (Sedrakyan et al., 2020). The features and characteristics of LADs differ (Bodily & Verbert, 2017), and the effective elements should be considered as principles of LAD design. A closer look should also be taken at aspects such as the demographics of the sample, the duration of the study, and the subjects studied, in order to find underlying variables affecting the unknown elements of LAD success.
Only nine studies focused on the entire SRL cycle. These findings are similar to those of Viberg et al. (2020), who found that the studies they reviewed focused less on the reflection phase. Araka et al. (2020) identified LA as one way to measure the effectiveness of LA interventions but found that the methods used for this purpose remained unexplored. Furthermore, Viberg et al. (2020) have argued that there is little evidence of LA improving learning outcomes or learning support. In the current study, only four out of the nine studies reported positive impacts on SRL (Lim et al., 2021; Nikolayeva et al., 2020; Siadaty et al., 2016b; Wong et al., 2019b). The channels used for the LA interventions and the LA methods differed across these studies. Nikolayeva et al. (2020) and Lim et al. (2021) performed their LA interventions using e-mail. Siadaty et al. (2016b) used a LAD to give recommendations to learners. Nikolayeva et al., Siadaty et al., and Lim et al. applied statistical methods to support their LA interventions. Wong et al. (2019b) used discovery with models, accompanied by relationship mining and visualisation, to prompt learners. A common feature of all four studies can be seen in the learning outcomes: all interventions increased learning actions, e.g., working on task (Siadaty et al., 2016b), completed quizzes (Nikolayeva et al., 2020), interaction with activities (Wong et al., 2019b), and improved grades (L. A. Lim et al., 2021). Lim et al. (2021) was the only study that showed an impact in the form of improved SRL. This is a promising result, but further studies are required to verify the results in different study contexts and with larger sample sizes. The results suggest that the effects of interventions focusing on planning and reflection can be seen in the outcomes of the performance phase.
However, the limited number of studies focusing on the entire SRL cycle creates a need to design more interventions covering the entire SRL cycle and to conduct comparative research to find the most effective interventions as a basis for intervention design principles.
The reviewed studies reported the impact of their LA interventions on learning in various ways. Both qualitative and quantitative methods were used at different stages of an LA intervention's development. While 26 studies (46%) reported a positive impact on learning, only 14 studies reported enough details to make the results interpretable and enable proper comparison. A similar problem was found by Sønderlund et al. (2019): in their review, only three studies critically assessed the effectiveness of LA interventions for student success and retention. Sønderlund et al. (2019) explored the efficacy of LA interventions in higher education and evaluated LA interventions from the perspectives of student retention and academic success, which is a rather narrow perspective for evaluating the impact of an LA intervention on learning. The current study broadened the framework of impact by adding the aspects of improved SRL processes and engagement with course materials. This will help future studies to assess the impact of LA interventions.
Different designs of LA interventions have different impacts on learning. For example, the LA interventions by Siadaty et al. (2016b) and Sonnenberg and Bannert (2019) were both aimed at supporting the SRL subprocess of task analysis. The former study found their LA intervention had a large impact on learning, whereas the latter found it had a medium impact. It is also interesting to note that an initial LA intervention can be developed further, thus leading to an improved impact. This kind of development can be seen in the consecutive studies by Sonnenberg and Bannert (2015, 2016), where the impact improved from small to very large across their studies.
We argue that a successful LA intervention design requires the LA cycle to be supplemented with a sound pedagogical approach to provide feedback on both cognitive and behavioural processes (Sedrakyan et al., 2020). An initial LA intervention must be developed iteratively, as was done by Sonnenberg and Bannert (2015, 2016), to test different approaches and functionalities in order to find the most effective ways to enhance SRL. In SRL, regulation can focus on the cognitive, emotional, and motivational aspects of the learning process. In addition, regulation can focus on different phases, such as planning, performance, or reflection. This is to say that it would not be reasonable to expect an LA intervention to support all phases simultaneously unless the subprocesses included in the different phases are considered during the design of the intervention. Also, LA interventions are typically conducted for a short period, despite the fact that only long-term LA interventions can potentially contribute to the development of SRL skills. The longevity of training and support must be considered during the design of an LA intervention to enable students to master SRL skills (Zimmerman, 2000). Furthermore, research should explore personalized ways of studying and supporting self-regulation, i.e., idiographic methods. Idiographic methods rely on several data points collected from individual students to create precise insights based on a student's own data (e.g., Saqr & López-Pernas, 2021). With the rise of multimodal learning analytics, research can tap into fine-grained within-person processes and emotions and, therefore, deliver highly personalized just-in-time insights and support (Malmberg et al., 2022; Törmänen et al., 2022).

6.1 Validity threats

The validity assessment includes the aspects of internal, external, construct, and conclusion validity (Zhou et al., 2016). In this review, we reduced the risk of validity threats by following the mitigation actions recommended by Ampatzoglou et al. (2019). Inappropriate or incomplete search terms can be a threat to construct validity and internal validity. LA interventions can be implemented in various ways, and the concept of an LA intervention might not be used in some records. In this study, LA intervention was not used as a keyword; instead, we identified LA interventions qualitatively. The same applies to SRL: some screened records focused on interventions and learning but did not use SRL as a concept. These records may have been excluded even though the interventions could have been beneficial for learning. Therefore, there might be records relevant to this review that were not included because they used keywords not identified within this research. Incomplete research information in primary studies poses an external threat to validity; we mitigated this risk by identifying the missing information in Table 6. To reduce the risk of threats to conclusion validity, we removed duplicate studies. Discussions among the authors, aiming for unanimous consensus, were used to reduce subjective interpretations, thus mitigating threats to internal validity and conclusion validity.

7 Conclusion

We have reviewed how studies use LA methods to create LA interventions supporting SRL. Based on the review, neither the number, popularity, nor combination of LA methods implemented in a learning environment guarantees improved SRL. Moreover, none of the LA intervention channels has shown proof of success in supporting SRL. Only seven studies covered all phases of SRL, and only 12 studies assessed the effect sizes of the LA interventions. Of these studies, only three focused on the entire SRL cycle and showed that their LA interventions positively impacted learning. Furthermore, the LA methods and the LA intervention channels of these three studies varied. Therefore, we cannot say which combination of LA methods and LA intervention channels would ensure the successful support of SRL.
Interpretable and comparable LA intervention results would help scholars and practitioners to find the LA interventions that are most beneficial for SRL. First, future research needs to focus on the pedagogical approach and how it may be used to create impactful LA interventions. Second, LA interventions should focus on the different phases and processes of regulated learning, and future research should find the most impactful LA interventions capable of supporting different SRL phases and subprocesses. Third, there are a limited number of LA interventions covering all three SRL phases; future research and LA intervention design should address all SRL phases to support learning holistically. Finally, there are limitations related to how the reviewed studies reported their findings. To improve the interpretability and comparability of LA intervention designs, future research needs to focus on providing comparable data with larger sample sizes to enable the comparison of different kinds and types of LA interventions.

Acknowledgements

The research reported in this paper was supported by the LAB University of Applied Sciences’ LAB-D program.
Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendices

Appendix

Table 7
Summary of the reviewed studies

| Publication | RQ1: LA methods used | RQ2: Type of intervention | RQ3: SRL phases | RQ4: Impact |
| --- | --- | --- | --- | --- |
| Afzaal et al., 2021 | P, DM, S, R, V | LAD | PER | Better learning outcomes |
| Aguilar et al., 2021 | P, S, V | LAD | PER | Improved self-regulation |
| Arnold et al., 2017 | R, V | LAD | PER | No impact |
| Balavijendran & Burnie, 2018 | R, S, V | LAD | PER + REF | Improved self-regulation |
| Bouchet et al., 2013 | C, KT, S | EMB, PRO | PLA + PER + REF | No impact |
| Cha & Park, 2019 | V | LAD | PER | No impact |
| Chen et al., 2020 | KT, S, V | LAD | PER | Improved self-regulation |
| Cody et al., 2020 | P, R | EMB | PER | Better learning outcomes |
| Corrin & de Barba, 2014 | V | LAD | PER | Improved self-regulation |
| Davis et al., 2016 | OD, S, V | LAD | PLA + PER | Better learning outcomes, engagement with course materials |
| Davis et al., 2017 | OD, S, V | LAD | PER | Improved course completion |
| Domínguez, 2021 | C | EMB | REF | Improved self-assessment |
| Frey et al., 2016 | V | EMB | REF | No impact |
| Hardebolle et al., 2020 | R, V | LAD | PLA + PER + REF | No impact |
| Haynes, 2020 | S, V | LAD | PLA + PER | No impact |
| Herodotou et al., 2020 | P, S | PHONE, TEXT | PLA + PER | Improved student retention |
| Howell et al., 2018 | NMF, S | PRO | PER | No impact |
| Inventado et al., 2013 | DM | N/A | N/A | No impact |
| Jivet et al., 2020 | P, R, S, V | LAD | PLA + PER | No impact |
| Jivet et al., 2021 | P, S, V | LAD | PLA + PER + REF | No impact |
| Kia et al., 2020 | C, RM, V | LAD | PLA + PER | No impact |
| Lallé et al., 2017 | KT, S | PRO | PLA + PER + REF | No impact |
| Li et al., 2017a | DD, S, V | LAD | PER | No impact |
| Li et al., 2017b | RM | EMB | N/A | No impact |
| Lim et al., 2020 | R, SNA, V | E-MAIL | N/A | Improved self-regulation |
| Lim et al., 2021 | C | EMB | PER | Improved self-regulation |
| Lim et al., 2021 | P, R, S | E-MAIL | PLA + PER + REF | Better learning outcomes, improved self-regulation |
| Lu et al., 2017 | S, V | PRO | PLA + REF | Better learning outcomes, improved self-regulation |
| Manganello et al., 2021 | S, V | LAD | N/A | Better learning outcomes |
| Manso-Vázquez & Llamas-Nistal, 2015 | KT | EMB | PLA + PER | No impact |
| Matcha et al., 2019 | C, PM, S, V | LAD, E-MAIL | PER | Improved self-regulation |
| McKenna et al., 2019 | R, V | LAD | PER | No impact |
| Menchaca et al., 2018 | S | E-MAIL | PER | Better learning outcomes |
| Munshi & Biswas, 2019 | DM, PM, RM | EMB | N/A | No impact |
| Naranjo et al., 2019 | S, V | LAD | PER | No impact |
| Nikolayeva et al., 2020 | S | E-MAIL | PLA + PER + REF | Engagement with course materials |
| Ott et al., 2015 | P, S, V | FOLDOUT | PLA + PER | No impact |
| Rohloff et al., 2019 | R, V | LAD | PER + REF | No impact |
| Russell et al., 2020 | DM, P, S, V | LAD | PER | Improved course completion |
| Schumacher & Ifenthaler, 2021 | P, S | PRO | PLA + PER | Improved learning outcomes |
| Sedrakyan & Snoeck, 2017 | CM, PM, S | EMB | PER + REF | No impact |
| Sedraz Silva et al., 2018 | R, S | FOLDOUT | PER | No impact |
| Siadaty et al., 2016a | R, S, V | LAD | PLA + PER | Engagement with course materials |
| Siadaty et al., 2016b | R | LAD | PLA + PER | No impact |
| Siadaty et al., 2016c | R, S | LAD | PLA + PER + REF | Improved self-regulation |
| Sonnenberg & Bannert, 2015 | C, DM, KT, PM, RM | PRO | PLA | Improved self-regulation |
| Sonnenberg & Bannert, 2016 | DM, NMF, PM, S, V | PRO | PLA | Improved self-regulation |
| Sonnenberg & Bannert, 2019 | DM, KT, PM, RM, S | PRO | PLA | Improved self-regulation |
| Tabuenca et al., 2015 | C, S, V | LAD | PER | No impact |
| Tabuenca et al., 2021 | S | PRO | REF | No impact |
| van Horne et al., 2018 | P, S, V | LAD, EMB | PER + REF | Better learning outcomes |
| Winne et al., 2019 | KT, R | EMB | PLA + PER | No impact |
| Wise et al., 2016 | O | LAD, EMB | PER | No impact |
| Wise et al., 2014 | O | PRO | PLA + PER + REF | No impact |
| Wong et al., 2019b | DM, RM, V | PRO | PLA + PER + REF | Engagement with course materials |
| Yu et al., 2018 | P, R, S, TM, V | N/A | PLA + PER | No impact |
References
Afzaal, M., Nouri, J., Zia, A., Papapetrou, P., Fors, U., Wu, Y., Li, X., & Weegar, R. (2021). Explainable AI for Data-Driven Feedback and Intelligent Action Recommendations to Support Students Self-Regulation. Frontiers in Artificial Intelligence, 4. https://doi.org/10.3389/frai.2021.723447
Akyildiz, S. T., & Kaya, V. D. (2021). Examining prospective teachers’ metacognitive learning strategies and self-regulated online learning levels during Covid-19 pandemic. International Journal of Contemporary Educational Research, 8(4), 144–157. https://doi.org/10.33200/IJCER.912897
Alamri, H., Lowell, V., Watson, W., & Watson, S. L. (2020). Using personalized learning as an instructional approach to motivate learners in online higher education: Learner self-determination and intrinsic motivation. Journal of Research on Technology in Education, 52(3). https://doi.org/10.1080/15391523.2020.1728449
Araka, E., Maina, E., Gitonga, R., & Oboko, R. (2020). Research trends in measurement and intervention tools for self-regulated learning for e-learning environments—systematic review (2008–2018). Research and Practice in Technology Enhanced Learning, 15(1). https://doi.org/10.1186/s41039-020-00129-5
Balavijendran, L., & Burnie, M. (2018). Ready to study: An online tool to measure learning and align university and student expectations via reflection and personalisation. ASCILITE 2018 - Conference Proceedings - 35th International Conference of Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education: Open Oceans: Learning Without Borders, 35–44.
Bouchet, F., Harley, J. M., Trevors, G. J., & Azevedo, R. (2013). Clustering and profiling students according to their interactions with an intelligent tutoring system fostering self-regulated learning. JEDM - Journal of Educational Data Mining, 5(1), 104–146. https://doi.org/10.5281/zenodo.3554613
Chen, L., Lu, M., Goda, Y., Shimada, A., & Yamada, M. (2020). Factors of the use of learning analytics dashboard that affect metacognition. 17th International Conference on Cognition and Exploratory Learning in Digital Age, CELDA 2020, November, 295–302. https://doi.org/10.33965/celda2020_202014l038
Cody, C., Warren, D., Chi, M., & Barnes, T. (2020). Does autonomy help Help? The impact of unsolicited hints on help avoidance and performance. Proceedings of the 13th International Conference on Educational Data Mining, EDM 2020.
Corrin, L., & de Barba, P. (2014). Exploring students’ interpretation of feedback delivered through learning analytics dashboards. Proceedings of ASCILITE 2014 - Annual Conference of the Australian Society for Computers in Tertiary Education, 629–633.
Davis, D., Chen, G., Jivet, I., Hauff, C., & Houben, G. J. (2016). Encouraging metacognition and self-regulation in MOOCs through increased learner feedback. CEUR Workshop Proceedings, 1596, 17–22.
Davis, D., Chen, G., Jivet, I., Hauff, C., Kizilcec, R. F., & Houben, G. J. (2017). Follow the successful crowd: Raising MOOC completion rates through social comparison at scale? ACM International Conference Proceeding Series, 454–463. https://doi.org/10.1145/3027385.3027411
de Freitas, S., Gibson, D. C., du Plessis, C., Halloran, P., Williams, E., Ambrose, M., Dunwell, I., & Arnab, S. (2015). Foundations of dynamic learning analytics: Using university student data to increase retention. British Journal of Educational Technology, 46(6), 1175–1188.
Domínguez, C., García-Izquierdo, F. J., Jaime, A., Pérez, B., Rubio, Á. L., & Zapata, M. A. (2021). Using process mining to analyze time distribution of self-assessment and formative assessment exercises on an online learning tool. IEEE Transactions on Learning Technologies, 14(5), 709–722. https://doi.org/10.13039/501100011033
Frey, T. F., Gkotsis, G., & Mikroyannidis, A. (2016). Are you thinking what I’m thinking? Representing metacognition with question-based dialogue. CEUR Workshop Proceedings, 51–58.
Hadwin, A. F., Järvelä, S., & Miller, M. (2011). Self-regulated, co-regulated, and socially shared regulation of learning. In B. J. Zimmerman & D. H. Schunk (Eds.), Handbook of self-regulation of learning and performance (pp. 65–84). Routledge.
Hardebolle, C., Jermann, P., Pinto, F., & Tormey, R. (2020). Impact of a learning analytics dashboard on the practice of students and teachers. SEFI 47th Annual Conference: Varietas Delectat... Complexity Is the New Normality, Proceedings, 1622–1632.
Herodotou, C., Naydenova, G., Boroowa, A., Gilmour, A., & Rienties, B. (2020). How can predictive learning analytics and motivational interventions increase student retention and enhance administrative support in distance education? Journal of Learning Analytics, 7(2), 72–83. https://doi.org/10.18608/JLA.2020.72.4
Inventado, P. S., Legaspi, R., & Numao, M. (2013). Helping students manage personalized learning scenarios. Proceedings of the 6th International Conference on Educational Data Mining, EDM 2013, 2–4.
Järvelä, S., Järvenoja, H., & Malmberg, J. (2019). Capturing the dynamic and cyclical nature of regulation: Methodological progress in understanding socially shared regulation in learning. International Journal of Computer-Supported Collaborative Learning, 14(4). https://doi.org/10.1007/s11412-019-09313-2
Jivet, I., Wong, J., Scheffel, M., Valle Torre, M., Specht, M., & Drachsler, H. (2021). Quantum of choice: How learners’ feedback monitoring decisions, goals and self-regulated learning skills are related. ACM International Conference Proceeding Series, 416–427. https://doi.org/10.1145/3448139.3448179
Jivet, I., Specht, M., Scheffel, M., & Drachsler, H. (2018). License to evaluate: Preparing learning analytics dashboards for educational practice. Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK ’18), 32–40. https://doi.org/10.1145/3170358.3170421
Zurück zum Zitat Kia, F. S., Teasley, S. D., Hatala, M., Karabenick, S. A., & Kay, M. (2020). How patterns of students dashboard use are related to their achievement and self-regulatory engagement. ACM International Conference Proceeding Series, 340–349. https://doi.org/10.1145/3375462.3375472 Kia, F. S., Teasley, S. D., Hatala, M., Karabenick, S. A., & Kay, M. (2020). How patterns of students dashboard use are related to their achievement and self-regulatory engagement. ACM International Conference Proceeding Series, 340–349. https://​doi.​org/​10.​1145/​3375462.​3375472
Zurück zum Zitat Lallé, S., Conati, C., Azevedo, R., Mudrick, N., & Taub, M. (2017). On the influence on learning of student compliance with prompts fostering self-regulated learning. Proceedings of the 10th International Conference on Educational Data Mining, EDM 2017, 120–127. Lallé, S., Conati, C., Azevedo, R., Mudrick, N., & Taub, M. (2017). On the influence on learning of student compliance with prompts fostering self-regulated learning. Proceedings of the 10th International Conference on Educational Data Mining, EDM 2017, 120–127.
Zurück zum Zitat Li, H., Ogata, H., Tsuchiya, T., Suzuki, Y., Uchida, S., Ohashi, H., & Konomi, S. (2017a). Using learning analytics to support computer-assisted language learning. Proceedings of the 25th International Conference on Computers in Education, ICCE 2017a - Main Conference Proceedings, December, 908–913. Li, H., Ogata, H., Tsuchiya, T., Suzuki, Y., Uchida, S., Ohashi, H., & Konomi, S. (2017a). Using learning analytics to support computer-assisted language learning. Proceedings of the 25th International Conference on Computers in Education, ICCE 2017a - Main Conference Proceedings, December, 908–913.
Zurück zum Zitat Li, I. H., Hwang, G. H., & Lin, Y. X. (2017b). Data mining on the prior knowledge and the effectiveness of the self-regulated learning. 2017b 4th International Conference on Industrial Engineering and Applications, ICIEA 2017b, 275–278. https://doi.org/10.1109/IEA.2017.7939221 Li, I. H., Hwang, G. H., & Lin, Y. X. (2017b). Data mining on the prior knowledge and the effectiveness of the self-regulated learning. 2017b 4th International Conference on Industrial Engineering and Applications, ICIEA 2017b, 275–278. https://​doi.​org/​10.​1109/​IEA.​2017.​7939221
Zurück zum Zitat Lim, L.-A., Dawson, S., Gašević, D., Joksimović, S., Fudge, A., Pardo, A., & Gentili, S. (2020). Students’ sense-making of personalised feedback based on learning analytics. Australasian Journal of Educational Technology, 36(6), 15–33. https://doi.org/10.14742/ajet.6370 Lim, L.-A., Dawson, S., Gašević, D., Joksimović, S., Fudge, A., Pardo, A., & Gentili, S. (2020). Students’ sense-making of personalised feedback based on learning analytics. Australasian Journal of Educational Technology, 36(6), 15–33. https://​doi.​org/​10.​14742/​ajet.​6370
Zurück zum Zitat Lodge, J. M., Panadero, E., Broadbent, J., & de Barba, P. G. (2019). Supporting self-regulated learning with learning analytics. In Lodge, Horvath, & Corrin (Eds.), Learning analytics in the classroom (Issue October, pp. 45–55). Routledge. Lodge, J. M., Panadero, E., Broadbent, J., & de Barba, P. G. (2019). Supporting self-regulated learning with learning analytics. In Lodge, Horvath, & Corrin (Eds.), Learning analytics in the classroom (Issue October, pp. 45–55). Routledge.
Zurück zum Zitat Matcha, W., Gašević, D., Uzir, N. A., Jovanović, J., & Pardo, A. (2019). Analytics of learning strategies: Associations with academic performance and feedback. ACM International Conference Proceeding Series, March, 461–470. https://doi.org/10.1145/3303772.3303787 Matcha, W., Gašević, D., Uzir, N. A., Jovanović, J., & Pardo, A. (2019). Analytics of learning strategies: Associations with academic performance and feedback. ACM International Conference Proceeding Series, March, 461–470. https://​doi.​org/​10.​1145/​3303772.​3303787
Zurück zum Zitat Matcha, W., Uzir, N. A., Gasevic, D., & Pardo, A. (2020). A Systematic Review of Empirical Studies on Learning Analytics Dashboards: A Self-Regulated Learning Perspective. In IEEE Transactions on Learning Technologies (Vol. 13, Issue 2, pp. 226–245). Institute of Electrical and Electronics Engineers. https://doi.org/10.1109/TLT.2019.2916802 Matcha, W., Uzir, N. A., Gasevic, D., & Pardo, A. (2020). A Systematic Review of Empirical Studies on Learning Analytics Dashboards: A Self-Regulated Learning Perspective. In IEEE Transactions on Learning Technologies (Vol. 13, Issue 2, pp. 226–245). Institute of Electrical and Electronics Engineers. https://​doi.​org/​10.​1109/​TLT.​2019.​2916802
Zurück zum Zitat Menchaca, I., Guenaga, M., & Solabarrieta, J. (2018). Learning analytics for formative assessment in engineering education. International Journal of Engineering Education, 34(3), 953–967. Menchaca, I., Guenaga, M., & Solabarrieta, J. (2018). Learning analytics for formative assessment in engineering education. International Journal of Engineering Education, 34(3), 953–967.
Zurück zum Zitat Munshi, A., & Biswas, G. (2019). Personalization in OELEs: Developing a data-driven framework to model and scaffold SRL processes. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11626 LNAI, 354–358. https://doi.org/10.1007/978-3-030-23207-8_65 Munshi, A., & Biswas, G. (2019). Personalization in OELEs: Developing a data-driven framework to model and scaffold SRL processes. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11626 LNAI, 354–358. https://​doi.​org/​10.​1007/​978-3-030-23207-8_​65
Zurück zum Zitat Nikolayeva, I., Yessad, A., Laforge, B., & Luengo, V. (2020). Does an e-mail reminder intervention with learning analytics reduce procrastination in a blended university course? Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). LNCS, 12315, 60–73. https://doi.org/10.1007/978-3-030-57717-9_5CrossRef Nikolayeva, I., Yessad, A., Laforge, B., & Luengo, V. (2020). Does an e-mail reminder intervention with learning analytics reduce procrastination in a blended university course? Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). LNCS, 12315, 60–73. https://​doi.​org/​10.​1007/​978-3-030-57717-9_​5CrossRef
Zurück zum Zitat Page, M. J., Moher, D., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., Mcdonald, S., …, Mckenzie, J. E. (2021). PRISMA 2020 explanation and elaboration: updated guidance and exemplars for reporting systematic reviews. BMJ, 372. https://doi.org/10.1136/BMJ.N160 Page, M. J., Moher, D., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., Mcdonald, S., …, Mckenzie, J. E. (2021). PRISMA 2020 explanation and elaboration: updated guidance and exemplars for reporting systematic reviews. BMJ, 372. https://​doi.​org/​10.​1136/​BMJ.​N160
Zurück zum Zitat Pedrotti, M., & Nistor, N. (2019). How Students Fail to Self-regulate Their Online Learning Experience. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11722 LNCS, 377–385. https://doi.org/10.1007/978-3-030-29736-7_28 Pedrotti, M., & Nistor, N. (2019). How Students Fail to Self-regulate Their Online Learning Experience. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11722 LNCS, 377–385. https://​doi.​org/​10.​1007/​978-3-030-29736-7_​28
Zurück zum Zitat Pérez-Álvarez, R., Maldonado-Mahauad, J., & Pérez-Sanagustín, M. (2018). Tools to Support Self-Regulated Learning in Online Environments: Literature Review. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11082 LNCS(January), 16–30. https://doi.org/10.1007/978-3-319-98572-5_2 Pérez-Álvarez, R., Maldonado-Mahauad, J., & Pérez-Sanagustín, M. (2018). Tools to Support Self-Regulated Learning in Online Environments: Literature Review. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11082 LNCS(January), 16–30. https://​doi.​org/​10.​1007/​978-3-319-98572-5_​2
Zurück zum Zitat Pintrich, P. R. (2000). The role of goal orientation in self-regulated learning. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of Self-Regulation (pp. 452–502). Academic Press. Pintrich, P. R. (2000). The role of goal orientation in self-regulated learning. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of Self-Regulation (pp. 452–502). Academic Press.
Zurück zum Zitat Rienties, B., Cross, S., & Zdrahal, Z. (2017). Implementing a learning analytics intervention and evaluation framework: What works? In B. K. Daniel (Ed.), Big data and learning analytics in higher education (pp. 147–166). Springer.CrossRef Rienties, B., Cross, S., & Zdrahal, Z. (2017). Implementing a learning analytics intervention and evaluation framework: What works? In B. K. Daniel (Ed.), Big data and learning analytics in higher education (pp. 147–166). Springer.CrossRef
Zurück zum Zitat Saqr, M., & López-Pernas, S. (2021). Idiographic learning analytics: A single student (N=1) approach using psychological networks. CEUR Workshop Proceedings, 2868. Saqr, M., & López-Pernas, S. (2021). Idiographic learning analytics: A single student (N=1) approach using psychological networks. CEUR Workshop Proceedings, 2868.
Zurück zum Zitat Sedraz Silva, J. C., Zambom, E., Rodrigues, R. L., Ramos, J. L. C., & da Fonseca De Souza, F. (2018). Effects of learning analytics on students’ self-regulated learning in flipped classroom. International Journal of Information and Communication Technology Education, 14(3), 91–107.https://doi.org/10.4018/IJICTE.2018070108 Sedraz Silva, J. C., Zambom, E., Rodrigues, R. L., Ramos, J. L. C., & da Fonseca De Souza, F. (2018). Effects of learning analytics on students’ self-regulated learning in flipped classroom. International Journal of Information and Communication Technology Education, 14(3), 91–107.https://​doi.​org/​10.​4018/​IJICTE.​2018070108
Zurück zum Zitat Törmänen, T., Järvenoja, H., Saqr, M., Malmberg, J., & Järvelä, S. (2022). Affective states and regulation of learning during socio-emotional interactions in secondary school collaborative groups. British Journal of Educational Psychology. https://doi.org/10.1111/BJEP.12525CrossRef Törmänen, T., Järvenoja, H., Saqr, M., Malmberg, J., & Järvelä, S. (2022). Affective states and regulation of learning during socio-emotional interactions in secondary school collaborative groups. British Journal of Educational Psychology. https://​doi.​org/​10.​1111/​BJEP.​12525CrossRef
Zurück zum Zitat Winne, P. H., & Hadwin, A. F. (1998). Studying as self-regulated engagement in learning. In D. Hacker, J. Dunlosky, & A. Graesser (Eds.), Metacognition in Educational Theory and Practice (pp. 277–304). Erlbaum. Winne, P. H., & Hadwin, A. F. (1998). Studying as self-regulated engagement in learning. In D. Hacker, J. Dunlosky, & A. Graesser (Eds.), Metacognition in Educational Theory and Practice (pp. 277–304). Erlbaum.
Zurück zum Zitat Winne, P. H., Teng, K., Chang, D., Lin, M. P. C., Marzouk, Z., Nesbit, J. C., Patzak, A., Rakovic, M., Samadi, D., & Vytasek, J. (2019). NStudy: Software for learning analytics about learning processes and self-regulated learning. Journal of Learning Analytics, 6(2), 95–106. https://doi.org/10.18608/jla.2019.62.7 Winne, P. H., Teng, K., Chang, D., Lin, M. P. C., Marzouk, Z., Nesbit, J. C., Patzak, A., Rakovic, M., Samadi, D., & Vytasek, J. (2019). NStudy: Software for learning analytics about learning processes and self-regulated learning. Journal of Learning Analytics, 6(2), 95–106. https://​doi.​org/​10.​18608/​jla.​2019.​62.​7
Zurück zum Zitat Wise, A. F., Vytasek, J., Hausknecht, S., & Zhao, Y. (2016). Developing learning analytics design knowledge in the “middle space”: The student tuning model and align design framework for learning analytics use. Online Learning Journal, 20(2), 155–182. https://doi.org/10.24059/olj.v20i2.783 Wise, A. F., Vytasek, J., Hausknecht, S., & Zhao, Y. (2016). Developing learning analytics design knowledge in the “middle space”: The student tuning model and align design framework for learning analytics use. Online Learning Journal, 20(2), 155–182. https://​doi.​org/​10.​24059/​olj.​v20i2.​783
Zurück zum Zitat Yu, L. C., Lee, C. W., Pan, H. I., Chou, C. Y., Chao, P. Y., Chen, Z. H., Tseng, S. F., Chan, C. L., & Lai, K. R. (2018). Improving early prediction of academic failure using sentiment analysis on self-evaluated comments. Journal of Computer Assisted Learning, 34(4), 358–365. https://doi.org/10.1111/jcal.12247CrossRef Yu, L. C., Lee, C. W., Pan, H. I., Chou, C. Y., Chao, P. Y., Chen, Z. H., Tseng, S. F., Chan, C. L., & Lai, K. R. (2018). Improving early prediction of academic failure using sentiment analysis on self-evaluated comments. Journal of Computer Assisted Learning, 34(4), 358–365. https://​doi.​org/​10.​1111/​jcal.​12247CrossRef
Zurück zum Zitat Zheng, J., Jiang, N., & Dou, J. (2020). Autonomy support and academic stress: A relationship mediated by self-regulated learning and mastery goal orientation. New Waves Educational Research & Development, 23(summer). Zheng, J., Jiang, N., & Dou, J. (2020). Autonomy support and academic stress: A relationship mediated by self-regulated learning and mastery goal orientation. New Waves Educational Research & Development, 23(summer).
Metadata
Title
Supporting self-regulated learning with learning analytics interventions – a systematic literature review
Authors
Sami Heikkinen
Mohammed Saqr
Jonna Malmberg
Matti Tedre
Publication date
08.09.2022
Publisher
Springer US
Published in
Education and Information Technologies / Issue 3/2023
Print ISSN: 1360-2357
Electronic ISSN: 1573-7608
DOI
https://doi.org/10.1007/s10639-022-11281-4