Introduction
Related work
Learning analytics intervention
LA intervention methods and evaluation
Evidence-based interventions (EBI)
Experimental approaches | Strengths | Limitations
---|---|---
Randomized controlled trial (RCT) | The ideal experimental design, providing the most reliable evidence | Requires substantial effort; risks raising ethical issues
A/B testing | Useful for comparing the impact of different interventions on the same objective | Requires some design and implementation effort; may raise ethical concerns
Comparison with previous implementation | A natural way to compare the impact of interventions, as most courses repeat with similar instructional settings; minimal ethical concern | Delayed results; test subjects (students) differ; difficult to preserve the same settings and environment across implementations
Quasi-experiment | Easy to implement and evaluate by comparing pre- and post-intervention results; no ethical concern | Results are less reliable than those of an RCT
Switching replications design | An adaptation of the quasi-experiment: two groups receive the same intervention while switching the roles of experimental and control group | Students' attitudes may already have changed by the time the intervention is switched
Challenges of LA intervention
LA Intervention systems and tools
Research | Context | ARS identification methods proposed/used | Platform implementation | Intervention/effectiveness evaluation
---|---|---|---|---
Tempelaar et al. (2020) | Statistics course, 1027 students | Analyze students' temporal engagement (#Attempts, #Examples, #Hints, #Views, time spent) across three phases after each topic is delivered: 1) preparing for the tutorial, 2) preparing for the quiz, and 3) preparing for the exam. Students exhibiting lower engagement in the early phases can be identified as ARS | Not available | Not available
Foster and Siddle (2019) | Large scale, Open University | Send alert messages every 14 days to students who have been inactive on the platform. The number of alert messages generated per student serves as an effective indicator for identifying at-risk students | Student dashboard | Alert message to student and personal tutor/The number of alert messages strongly predicts failing students
Cobos and Ruiz-Garcia (2020) | Web application course, 84 students | Cluster students weekly based on engagement (#Questions, #Videos, #Attendance, time on course) to identify ARS. An additional feature lets instructors record student interventions for tracking | LMS integration (edX) | Email/Student interviews with positive feedback
Majumdar et al. (2019) | Reading courses, 3 universities | Calculate a score from student engagement (#Events, total time, completion rate, unique weeks/days) and classify students into 3 groups; the weakest group is identified as at-risk students | Student and instructor dashboards, xAPI | Email/No evaluation reported
Herodotou et al. (2019) | Large scale, Open University | Apply machine learning to demographic and engagement data (#accesses to forum, content, resources, wiki, glossary) to identify ARS who are likely not to submit the next assessment. Results show that students whose instructors used the system gained higher scores | Instructor dashboard | Upon instructor decision/No evaluation reported
Şahin and Yurdugül (2019) | Computer network course, 79 students | Visualize students' learning engagement and rank on a dashboard to enhance self-awareness and motivation. No ARS identification. Student interviews indicated the system is useful | Student dashboard, LMS plugin (Moodle) | Not available
Azcona et al. (2019) | CS1 & CS2 courses, 266 students | Predict ARS weekly using machine learning. Prediction features include demographics (age, location, school GPA, previous course grade) and weekly engagement (%SolvedQuestions, #Lab work, #Attendance, time spent, #Material accesses, weekday access) | Customized VLE | Weekly message based on student engagement status/Evaluation based on score comparison from summative assessments
Froissard et al. (2015) | Large scale, public university | Instructors can customize engagement parameters to identify ARS, including assessment activity, forum activity, gradebook, and login. The system calculates a risk score and displays it on the instructor dashboard | Instructor dashboard, LMS plugin (Moodle) | Email/No evaluation reported
i-Ntervene (this research) | CS2 course, 253 students | Iteratively identify ARS in two aspects, learning engagement (#Attendance, #In-class activities, #Assignments, #Supplementary material accesses) and subject understanding (assignment scores), by analyzing the temporal gap between what students performed and what instructors expected. The system visualizes temporal gaps at both the individual student and class level | Instructor dashboard | Upon instructor decision/Intervention tracking/Systematic evaluation based on improvement of temporal engagement and assessment scores
i-Ntervene: a platform for instructional intervention in programming courses
Course instruction
Learning deficiency analysis
Activity gaps evaluation
Learning activities | Interaction on LMS | Observing timeframe
---|---|---
Class Attendance | A student logs in or joins the video conference session | Classroom start time ± grace periods for arriving early or late
Class Engagement | A student interacts with a learning material (e.g., pop-up question, coding question, or video) that the instructor introduced in the classroom | Within the classroom period
Assignment Practice | A student submits code for questions that the instructor assigned as homework | Outside the classroom period
Self-Study | A student watches a video or submits code for questions that the instructor did not assign | Outside the classroom period
Activity gaps | Definition and calculation
---|---
Class Attendance Gap (Atdn Gap) \(\in [-1,0]\) | Ratio of a student's class attendance in a cycle, ranging from -1 for no attendance to 0 for full attendance. \(Atdn\ Gap = \frac{\text{Number of classes a student attended}}{\text{Number of class attendances the instructor expected}} - 1\)
Class Engagement Gap (Engm Gap) \(\in [-1,0]\) | Ratio of a student's in-class engagement in a cycle, ranging from -1 for no interaction with any material to 0 if the student interacted with all materials in the classes they attended. If a student attended no class, the Engm Gap is defined as 'not available' (na). \(Engm\ Gap = \frac{\text{Number of materials a student worked on in class}}{\text{Total materials introduced in the classes the student attended}} - 1\)
Assignment Practice Gap (Asgm Gap) \(\in [-1,0]\) | Ratio of assignments a student practiced in a cycle, ranging from -1 for no practice to 0 for practicing all assigned material. \(Asgm\ Gap = \frac{\text{Number of assignments a student practiced}}{\text{Number of assignments the instructor expected}} - 1\)
Self-Study Rate \(\in [0,\infty)\) | Number of non-assignment materials a student interacted with in a cycle, ranging from 0 for none to any positive value
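The three gap formulas above can be sketched directly in code. This is an illustrative sketch, not the i-Ntervene implementation; the function and parameter names are assumptions for clarity.

```python
# Illustrative sketch of the Activity Gap formulas (names are hypothetical).

def attendance_gap(attended: int, expected: int) -> float:
    """Atdn Gap in [-1, 0]: -1 = no attendance, 0 = fully attended."""
    return attended / expected - 1

def engagement_gap(worked_on: int, introduced_in_attended: int):
    """Engm Gap in [-1, 0]; returns None ('na') when no class was attended."""
    if introduced_in_attended == 0:
        return None  # not available: student attended no class
    return worked_on / introduced_in_attended - 1

def assignment_gap(practiced: int, expected: int) -> float:
    """Asgm Gap in [-1, 0]: -1 = no practice, 0 = all assignments practiced."""
    return practiced / expected - 1

# Example: a student attended 4 of the 5 classes expected in a cycle
print(attendance_gap(4, 5))  # ≈ -0.2
```

All three gaps share the same shape (observed/expected − 1), so a single helper would also do; separate functions are kept here to mirror the table rows.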
Subject understanding level (UL) evaluation
Assessment aspects | Criteria |
---|---|
Syntactic correctness | All commands in code must be correct. Automatic code graders can simply verify this using a compiler |
Semantic/logical correctness | Code’s outputs must agree with the expected solution described by specifications. Most automatic code graders can validate the code outputs using defined test cases |
Efficiency/performance | Code must provide outputs within a specific duration and computer resource. Most automatic code graders can specify CPU time and memory limits to validate this aspect |
Quality/maintainability | This aspect involves coding style, guidelines, and best practices, which focus on readability and reusability. It is a consideration for advanced programming courses and cannot be judged by automatic code graders
Plagiarism | Code should not replicate or closely resemble another student's code. Some automatic code graders can check code authenticity, but this can yield many false positives for simple coding questions
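The first three assessment aspects can be checked mechanically, as the table notes. Below is a minimal sketch of such an automatic grader for Python submissions; it is not the grader used by i-Ntervene, and the function name, verdict strings, and test-case format are assumptions.

```python
# Minimal auto-grader sketch: syntax via compile(), semantics via test cases,
# efficiency via a wall-clock time limit on each run. Illustrative only.
import subprocess
import sys

def grade(source: str, test_cases: list[tuple[str, str]],
          time_limit: float = 2.0) -> str:
    # Syntactic correctness: does the submission parse at all?
    try:
        compile(source, "<submission>", "exec")
    except SyntaxError:
        return "syntax error"
    # Semantic correctness + efficiency: run each (stdin, expected) pair
    for stdin_data, expected in test_cases:
        try:
            result = subprocess.run(
                [sys.executable, "-c", source],
                input=stdin_data, capture_output=True,
                text=True, timeout=time_limit,
            )
        except subprocess.TimeoutExpired:
            return "time limit exceeded"
        if result.stdout.strip() != expected.strip():
            return "wrong answer"
    return "pass"
```

For example, `grade("print(int(input()) * 2)", [("21", "42")])` yields `"pass"`, while a submission with an unbalanced parenthesis is rejected before any test case runs. Quality and plagiarism checks, as the table points out, need separate tooling or human judgment.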
- Pass Score: the score at which a student is considered to have successfully solved the question.
- Proper Effort: the expected number of attempts and the duration a student should spend to solve the question.
- Syntax Error Rate: the ratio of syntax-error attempts to total attempts.
- Syntax Error Limit: the upper limit on the syntax error rate above which a student is considered to be struggling with syntax.
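Combining these parameters, a student's work on a question can be classified as passed, syntax-struggled, or logic-struggled. The sketch below mirrors the definitions above; the function name and default values (taken from the pass-score table later in this paper) are illustrative.

```python
# Hypothetical per-question classifier built from the parameters above.

def question_status(best_score: float, attempts: int, syntax_errors: int,
                    pass_score: float = 0.7,
                    syntax_error_limit: float = 0.8) -> str:
    """Classify a student's work on one question."""
    if best_score >= pass_score:
        return "passed"
    # Syntax Error Rate: syntax-error attempts over total attempts
    rate = syntax_errors / attempts if attempts else 0.0
    # Above the limit -> syntax struggling; otherwise a logic problem
    return "syntax struggled" if rate > syntax_error_limit else "logic struggled"

# Example: 9 of 10 attempts failed to compile and the pass score was not met
print(question_status(best_score=0.3, attempts=10, syntax_errors=9))
```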
At-risk student (ARS) identification
Intervention support
Intervention decision
Effectiveness evaluation
Experimentation
Course setting and challenges
challenge to developing their OO programming competency within only one semester.

i-Ntervene implementation
Learning topics | Pass score | Proper effort (attempts) | Proper effort (duration, min) | Syntax error limit
---|---|---|---|---
Array1D, string | 0.7 | 5 | 5 | 0.8 |
Array2D | 0.7 | 10 | 5 | 0.8 |
Method | 1 | 5 | 5 | 0.8 |
Class programming, class inheritance | 0.7 | 5 | 5 | 0.5 |
Recursion | 1 | 10 | 5 | 0.8 |
Activity gap | Value | Explanation |
---|---|---|
Drop-out threshold | 4 Cycles | Students with no activity on the LMS for 4 consecutive weeks were assumed to have dropped out and were excluded from the intervention process
Moving average | 2 Cycles | Students whose average Activity Gap over the 2 most recent weeks fell below the threshold are listed as ARS in that aspect
Attendance threshold | − 0.2 | |
Assignment threshold | − 0.2 | |
Engagement threshold | − 0.4 | |
Self-study rate | na | No intervention on self-study |
Expected effect size | 0.01 | Students who gain more than a 1% improvement in the Activity Gap moving average during the evaluation period count toward the intervention effectiveness
Expected UL | 0.5 | Students who achieved the pass score on more than half of their hands-on questions in that cycle are not listed for intervention
Syntax UL threshold | 0.66 | For students who do not meet the Expected Understanding Level in a cycle: if at least 2/3 of their practiced questions are classified as Syntax Struggled, they are listed for syntax intervention; otherwise, they are listed for logic intervention
Logic UL threshold | 0.33 |
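The ARS listing rule in the table (a 2-cycle moving average of each Activity Gap compared against its threshold) can be sketched as follows. The data layout and function names are assumptions for illustration, not the platform's actual code.

```python
# Illustrative ARS check: flag each aspect whose 2-cycle moving average
# of the Activity Gap falls below its threshold (values from the table).

THRESHOLDS = {"attendance": -0.2, "assignment": -0.2, "engagement": -0.4}

def at_risk_aspects(gap_history: dict[str, list[float]],
                    window: int = 2) -> list[str]:
    """Return the aspects in which a student is currently at risk."""
    flagged = []
    for aspect, gaps in gap_history.items():
        recent = gaps[-window:]  # moving-average window of recent cycles
        if recent and sum(recent) / len(recent) < THRESHOLDS[aspect]:
            flagged.append(aspect)
    return flagged

# Example: one student's gaps over the last two cycles
student = {"attendance": [-0.5, -0.25],
           "assignment": [0.0, -0.1],
           "engagement": [-0.5, -0.5]}
print(at_risk_aspects(student))  # ['attendance', 'engagement']
```

The drop-out exclusion (no activity for 4 consecutive cycles) would be applied before this check, so that already-departed students are not repeatedly listed for intervention.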
Instructional intervention execution and evaluation results
Applied Week#/evaluation Week# | Class attendance intervention methods | % Effectiveness (improved gap ARS/total intervened ARS) |
---|---|---|
5/6 | Method 1: Sending email to ARSs addressing their absence and encouraging them to attend class | 45% (19/42)
7/9 | Method 2: Sending email to ARSs addressing their absence, with statistics from the previous semester evidencing the importance of class attendance | 71% (20/28)
11/12 | Method 3: Reminding students of the minimum attendance rate required to pass the course via social media and classroom announcements | 83% (15/18)
Applied Week#/evaluation Week# | Assignment practice intervention methods | % Effectiveness (improved gap ARS/total intervened ARS) |
---|---|---|
6/7 | Method 4: Sending email to ARSs addressing their missing assignments and encouraging them to practice more | 23% (14/62)
7/9 | Method 5: Sending email to ARSs addressing their missing assignments, with statistics from the previous semester emphasizing the importance of assignment practice | 46% (13/28)
10/11 | Method 6: Sending individual ARSs customized emails addressing their assignment and in-class engagement patterns | 36% (55/152)
12/13 | Method 7: Reminding students of the minimum assignment practice rate required to pass the course via the social media space and classroom announcements | 55% (36/65)
Learning topic (tutorial date, exam date, duration) | Attended vs. unattended | Statistical test
---|---|---
Array1D (7/1/2022, 29/1/2022, 22 days) | Number of students = 14: 106 Average scores = 2.93: 2.80 | p = 0.45
Array2D (28/1/2022, 29/1/2022, 1 day) | Number of students = 37: 28 Average scores = 1.51: 1.43 | p = 0.22
String (4/2/2022, 12/2/2022, 8 days) | Number of students = 14: 74 Average scores = 7.86: 6.18 | p = 0.47
Class basic (11/2/2022, 12/2/2022, 1 day) | Number of students = 22: 37 Average scores = 8.24: 3.81 | p = 0.015*
Array of objects (25/02/2022, 23/3/2022, 25 days) | Number of students = 9: 62 Average scores = 6.66: 2.74 | p = 0.020*
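The table reports p-values without naming the statistical test used. As one hedged illustration of how such a comparison could be made, the sketch below runs a one-sided permutation test on the difference of mean scores between attended and unattended groups; this is a stand-in technique, not necessarily the paper's actual method.

```python
# One-sided permutation test on the difference of group means (illustrative).
import random

def permutation_p_value(attended, unattended, n_perm=10000, seed=0):
    """P(shuffled mean difference >= observed) under random relabeling."""
    rng = random.Random(seed)
    observed = sum(attended) / len(attended) - sum(unattended) / len(unattended)
    pooled = list(attended) + list(unattended)
    n = len(attended)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # randomly reassign scores to the two groups
        diff = sum(pooled[:n]) / n - sum(pooled[n:]) / (len(pooled) - n)
        if diff >= observed:
            hits += 1
    return hits / n_perm
```

A permutation test makes no normality assumption, which suits the small and unbalanced group sizes in the table (e.g. 9 vs. 62 students); a t-test or Mann-Whitney U test would be common alternatives.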