This article investigates the transformative potential of AI-based adaptive programming education in addressing the digital divide among socially disadvantaged students. It begins by highlighting the critical importance of programming skills in today's digital economy and the disparities in access to quality programming education. The study defines adaptive learning and AI-based adaptive learning, explaining how these technologies can tailor educational content to individual student needs, thereby enhancing learning outcomes and engagement. The research focuses on the implementation of an AI-driven adaptive learning system in a 13-week study involving 122 students from four universities. The experimental group, which received adaptive instruction, demonstrated significantly higher learning gains and engagement levels compared to the control group, which followed a traditional curriculum. The findings underscore the effectiveness of adaptive learning in providing personalized, real-time feedback and customized learning pathways, which are particularly beneficial for students facing socio-economic challenges. The article also discusses the broader implications for educational equity, suggesting that AI-driven adaptive learning can help bridge educational gaps and promote inclusive learning environments. The study concludes with a call for further research to explore the long-term effects and broader applicability of adaptive learning technologies in various educational settings.
AI Generated: This summary of the content was generated with the help of AI.
Abstract
In the context of the digital economy, programming proficiency is an essential competency that promotes upward socio-economic mobility and expands career opportunities. However, students from socially disadvantaged backgrounds often face significant barriers to acquiring these skills, such as limited access to technology and educational resources. This study explores the impact of AI-based adaptive programming education on socially disadvantaged students' learning outcomes and engagement levels. The 122 participants in the research were divided into an experimental group (EG) that received AI-driven adaptive instruction, and a control group (CG) taught through the traditional curriculum during the 13-week experimental period. In the research, the combined pre-test/post-test assessments and self-report engagement questionnaires were used to focus on behavioural, emotional, and cognitive engagement. The findings showed that the EG demonstrated higher levels in programming knowledge and full-scale engagement across behavioural, emotional, and cognitive dimensions compared to the CG. This difference was confirmed through ANOVA, while ANCOVA, which controlled for students' socio-economic factors, further confirmed that the AI system had a positive impact on students’ results regardless of their socio-economic background. Indeed, current research findings suggest that AI-based adaptive learning environments can hold a promising opportunity to narrow educational gaps, boost engagement, and contribute toward better learning outcomes for socially disadvantaged students.
Introduction
In today's digital world, proficiency in programming has become a critical competency, providing individuals with a competitive edge in the job market, a foundation for technological innovation, and a means of participating in the global digital economy. As a result, programming is now one of the priority skills for digital literacy and technical competence, with demand escalating across the diverse sectors that rely on technology-driven solutions. However, access to this field is not evenly distributed. Many students, in particular those from socially deprived backgrounds, face numerous challenges that hinder their chances of accessing quality education, including limited computer access, unstable Internet connectivity, and insufficient educational support. These barriers can leave students at a disadvantage, widening the digital divide.
This study aims to address this challenge by exploring how AI-based adaptive programming instruction can fill the educational deficits of socially disadvantaged students. AI adaptive learning environments can adapt the learning process through real-time feedback and content tailored to each student's unique learning needs.
To provide clear context to this inquiry, the following sections define adaptive learning and AI-based adaptive learning, explaining how they apply to programming instruction and how they can contribute to educational equity.
Defining Adaptive Learning
Adaptive learning is an instructional approach that automatically modifies educational content, pace, and feedback to accommodate individual students’ needs, preferences, and performance levels (Taylor et al., 2021). Adaptive learning environments modify instruction through learner data to fill skill gaps, ensure engagement, and enhance learning outcomes.
Traditional adaptive learning systems predominantly employed static rules and pre-set pathways, most often informed by historical measures of performance. Such systems detect trends in learner activity and adjust difficulty levels accordingly (Jing et al., 2023). For instance, when students demonstrate competence on a particular subject, follow-on activities are adjusted to challenge them with topics of greater difficulty, while students who struggle with fundamentals receive targeted practice activities and remediation to reinforce understanding.
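The static, rule-based adjustment described above can be sketched as follows. The accuracy thresholds and the 1–5 difficulty scale are illustrative assumptions, not values taken from any cited system.

```python
# Hypothetical sketch of a static, rule-based adaptive pathway:
# difficulty moves up or down when recent accuracy crosses fixed thresholds.

def next_difficulty(current: int, recent_scores: list[float]) -> int:
    """Adjust a difficulty level (1-5) from the mean of recent scores (0.0-1.0)."""
    if not recent_scores:
        return current
    accuracy = sum(recent_scores) / len(recent_scores)
    if accuracy >= 0.85:          # mastery: advance to harder material
        return min(current + 1, 5)
    if accuracy < 0.60:           # struggling: drop back for remediation
        return max(current - 1, 1)
    return current                # otherwise keep the current level
```

A pre-set decision table like this cannot learn new patterns from data, which is precisely the limitation the AI-based systems discussed below are meant to overcome.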
Research has consistently shown that adaptive learning raises students' motivation and performance levels across various fields of study (Xiaoyu & Tobias, 2023). Adaptive learning software that provides students with scaffolded practice and feedback has been shown to contribute significantly to skill acquisition and retention in programming courses (Taylor et al., 2021).
Defining AI-based Adaptive Learning
AI-based adaptive learning goes well beyond classical adaptive systems. Whereas classical systems rely on pre-set rules and static decision trees, AI-based adaptive environments use predictive analytics combined with machine learning algorithms that automatically modify the learning process based on real-time learner activity (Luckin & Holmes, 2016). These algorithms analyse task completion times, error patterns, and engagement metrics to continuously refine and update instructional content.
In programming instruction, AI-driven adaptive learning systems can provide instruction tailored to learners' cognitive profiles, context-based hints, or customized code challenges (Heffernan et al., 2016; Lee & Yeo, 2022). For example, if a student is struggling with recursion, the AI system can introduce visual representations, simpler tasks, or additional practice to consolidate understanding. In addition, these systems provide instant feedback, which is crucial for facilitating self-regulated learning and maintaining interest (Baker & Siemens, 2014; Fernandes et al., 2023; Troussas et al., 2019).
Recent advancements in adaptive learning technology have significantly influenced educational practice by leveraging AI and machine learning algorithms to personalize instruction. Adaptive learning systems adjust content, pace, and feedback automatically based on students' performance and learning needs, increasing engagement and learning outcomes (Strielkowski et al., 2024). The 2023 EDUCAUSE Horizon Report lists adaptive learning as an emerging postsecondary trend, including in STEM fields such as programming instruction. AI-based adaptive learning software creates learner-driven pathways, automates routine tasks, and generates predictive feedback to steer early interventions for students, including disadvantaged students (Pelletier et al., 2023). Newer adaptive systems break with older practice by combining real-time data analytics with natural language processing to predict learning pathways and adjust educational content accordingly (Sari et al., 2024). This adaptive personalization ensures that varied learning needs are attended to, facilitating inclusive access to programming competency and educational inclusivity.
This personalized, AI-driven process is most valuable to socially disadvantaged students, who may have no means of accessing ancillary educational resources or private tutors. By customizing content to meet every learner's diversified needs, AI-based adaptive learning programs can bridge learning deficits and provide equal access to programming instruction.
The Role of AI-based Adaptive Learning in Programming Education
AI-based adaptive learning can also redefine programming teaching by adapting to individual learning needs through instruction tailored to each learner. Such software applies machine learning-based algorithms to adapt the curriculum and feedback accordingly based on each learner's performance, skill levels, and learning habits (Colliot et al., 2024; Demartini et al., 2024; Seo et al., 2021). By providing instant, tailored feedback for each learner, the program can foster a more inclusive learning environment, particularly for disadvantaged students where mainstream educational models fall short (Kuzenkov, 2024; Holmes et al., 2019; Duraes et al., 2024).
Research Focus and Objectives
This study aims to determine how AI-based adaptive instruction can enhance socially disadvantaged students' achievement and engagement levels. The objective is to determine whether AI-based instruction can provide equal access to basic digital competence and, consequently, to social mobility.
By measuring students' levels of achievement through AI-based adaptive learning environments compared to those students who enrolled in non-adaptive, static courses, it is possible to assess the extent to which adaptive learning systems can correct educational inequities. The results offer valuable insights into how educational technology can promote inclusivity, innovation, and social equity in today's digital age.
Education Background for the Socially Disadvantaged
Modern educational inequalities have become an urgent concern in the context of digital transformation for students around the world. In practice, students from disadvantaged backgrounds face numerous barriers, for example, a lack of technology at home, limited educational resources, and a general lack of support, which hinder their ability to acquire essential skills such as programming. It is therefore important to address these inequalities by introducing innovative technologies that meet these challenges and provide balanced learning environments.
AI can be one of the most promising solutions to this problem, as it provides adaptive learning experiences tailored to individual needs. Research suggests that these technologies have the potential to close the educational divide by offering content and support directly aligned with the needs of disadvantaged students. With this goal in mind, the relevant literature was reviewed. The Scopus database was used to map the relevant studies included in the literature background. The query was as follows:
(TITLE-ABS-KEY(education) AND TITLE-ABS-KEY (socially AND disadvantaged)) AND (LIMIT-TO (SUBJAREA, "COMP")) AND (LIMIT-TO (DOCTYPE, "ar") OR LIMIT-TO (DOCTYPE, "cp"))
The retrieved articles were reviewed. The most relevant articles with the highest impact (ranked by number of citations) were selected, and their main findings are highlighted below.
Technology and Educational Equity
Educational inequality remains a worldwide challenge, particularly among students from disadvantaged socio-economic backgrounds, and the availability of facilities, technology, and educational resources is critical to bridging the gaps. Tandika and Ndijuye (2019) studied the preparedness of pre-primary schoolteachers to employ information and communication technologies (ICTs) in rural Tanzania. They found that while instructors perceived educational benefits in ICTs, poor facilities, overcrowded classrooms, and limited resources restricted effective use. Despite these challenges, the authors concluded that ICTs had the potential to significantly improve learning outcomes, particularly where traditional resources were limited. The finding reveals that investing in facilities and educator preparation is critical to unlocking ICT's full potential for enhancing educational equity (Tandika & Ndijuye, 2019).
Similarly, interactive musical technology can advance learning in disadvantaged areas of Mexico. With socially underprivileged students in Tumbisca, musical software on mobile phones promoted creativity and the social integration of students, while real-time audio modification engaged students and encouraged critical thinking. These findings highlight how affordable interactive technology can improve the educational outcomes of disadvantaged groups, providing evidence that technology-based, culture-specific interventions can address educational inequities on a systematic scale (Duarte-García & Sigal-Sefchovich, 2019).
These studies complement broader global evidence on the digital divide. Lamberti et al. (2021) analysed digital inequality across the EU27 and the UK, concluding that educational level has a sizeable impact on internet use, skill acquisition, and overall digital use. The researchers found that people with higher educational levels reap most of the digital benefits of innovation, while those with lower educational achievement attain better digital access only within digitally advanced countries. This suggests that educational policies ensuring fair educational access alongside digital competency are important to counter the digital divide and promote inclusive educational practice (Lamberti et al., 2021).
The Role of IT Skills in Social Mobility
Programming and ICT skills have also become critical for participation in the digital economy. The literature is unequivocal that digital competence is linked to better job opportunities and wage increases. Atasoy et al. (2021) explored how ICT competence affects labour market outcomes in an emerging market country, discovering that ICT competence had a considerable positive effect on the probability of employment. Moreover, the study indicated a strong correlation between advanced IT competence and improved employment prospects, including higher salaries and better job opportunities. Notably, the authors identified a need for targeted training schemes that can equip disadvantaged groups, including women and older employees, with fundamental IT competence (Atasoy et al., 2021).
These findings reflect that fair availability of IT instruction can enhance social mobility by offering students marketable workforce skills. This is particularly relevant in programming instruction, where unequal availability to educational resources can result in long-term career disparities. Programs that ensure an adaptive, custom-tailored learning experience are critical to ensuring that students of every background have the chance to acquire these skills.
Culturally Relevant and Inclusive Education
The integration of culturally responsive pedagogy has been recognized as essential for effectively engaging disadvantaged students. Ogunniyi (2011) examined the intersection of Indigenous Knowledge (IK) and western science in educational institutions across the African continent, revealing how colonial legacies have contributed to the systematic exclusion of native epistemologies. The Science and Indigenous Knowledge Systems Project (SIKSP) attempted to address this imbalance by preparing pre-service teachers to integrate western scientific principles with native perspectives in their teaching practice. The study concluded that such combined strategies promoted learner engagement, critical reasoning, and achievement (Ogunniyi, 2011).
Duarte-García and Sigal-Sefchovich (2019) likewise stressed the importance of prioritizing content that is contextually appropriate and responsive to learners' environments. Their evidence indicated that introducing interactive musical systems with contextually suitable sounds significantly enhanced students' motivation and cognitive interest. The findings support the view that students' cultural contexts must be integrated into educational practice, especially when new technologies are introduced (Duarte-García & Sigal-Sefchovich, 2019).
AI-based Adaptive Learning and Engagement
AI can provide adaptive learning environments that fulfil specialized learning needs. Adaptive systems monitor students' performance through algorithms that adjust content, pace, and feedback to personalize the learning experience. Baker and Siemens (2014) identify educational data mining and learning analytics as domains that can contribute to building adaptive systems, predictive modelling of learning pathways, and early interventions.
AI can personalize learning and make it inclusive. It can also supplement instructional duties by automating routine tasks, freeing time for other pedagogical activities. Adaptive learning software can therefore provide students with varied needs, particularly disadvantaged students, with tailor-made support suited to each of their distinct situations (Luckin & Holmes, 2016).
The 2023 EDUCAUSE Horizon Report also identified adaptive learning as a significant method in education, including in STEM disciplines such as programming. By using real-time analysis and natural language processing, such programs make instruction adaptive to every learner, enhancing engagement and learning outcomes (Pelletier et al., 2023). The capability of AI-based programs to personalize the learning experience makes them well suited to addressing socially disadvantaged students' learning issues.
The Importance of Programming Skills for Upward Mobility and Job Opportunities
In the twenty-first century, programming has become an integral component of digital competence, emerging as one of the most prized skills for individuals aiming for upward social mobility and economic advancement. Business digitization has made programming one of the most important requisites in most business areas, ranging from information technology to health care and financial institutions (Di Battista et al., 2023). Not only do programming skills make individuals competent users of emerging technology, but they also open access to professions with better career mobility, job security, and pay scales (Gomes et al., 2022).
The demand for programming professionals has expanded exponentially, driven by increasing automation, AI integration, and the digitalization of business processes across various disciplines. The World Economic Forum's Future of Jobs Report 2023 recognized programming and software development as essential skills for the future. The report estimates that individuals skilled in programming have better job placements and, on average, 20–30% higher salaries than those without programming skills. It also recognises that programming skills contribute significantly to social and economic mobility, particularly for individuals from disadvantaged groups (Di Battista et al., 2023).
Beyond the immediate monetary incentives of the job market, programming instruction also develops important 21st-century skills such as logical reasoning, computational thinking, and problem-solving (Tariq, 2025). Such cognitive attributes extend to many areas of life, enabling people to adapt to technological change and make meaningful contributions to various activities. For children in poverty, programming can break poverty traps inherited through generations by providing a path to well-paid jobs and greater socio-economic mobility (UNESCO, 2021).
However, access is unequal: disadvantaged students face systematic barriers, including limited resources, poor digital infrastructure, and scarce educational support (Inegbedion, 2021). This inequity calls for innovative, technology-enabled educational interventions, including AI-based adaptive learning, to bridge the gap and make programming skills universally accessible.
In response to this demand, this research examines how AI-based adaptive programming instruction can enhance socially disadvantaged students' learning outcomes and engagement. With adaptive learning pathways and instant feedback, AI-based programs have the potential to democratize programming instruction, thereby fostering inclusion in the digital economy and enhancing social mobility.
Research Gap and Objectives
The reviewed literature indicates that well-designed technology can enhance educational results and motivate disadvantaged students. Previous research has demonstrated that digital resources, culturally relevant learning materials, and adaptive learning approaches can remedy educational inequities. However, there is a lack of systematic evidence examining how AI-based adaptive learning environments enhance programming instruction for socially disadvantaged students.
The objective of this research is to examine how AI-powered adaptive learning can influence programming skills among disadvantaged students, with a focus on their levels of engagement and how AI technologies contribute to educational equity in the digital age. The learning results of students exposed to adaptive instruction are compared with the results of students following a non-adaptive curriculum in the study.
Research Questions
Based on the research directions identified in the reviewed literature, a research objective was formulated to explore whether a correlation exists between socially disadvantaged students who engage with AI-supported, adaptively designed study materials and syllabi and those who follow a uniformly applied, non-adaptive curriculum. The main general research questions were formed as follows:
RQ1 Does AI-based adaptive programming education influence the learning outcomes for socially disadvantaged students compared to traditional, non-adaptive educational methods?
RQ2 To what extent does the delivery of adaptive programming education using AI technology affect student engagement, compared to traditional methods, in terms of behavioural, emotional, and cognitive dimensions?
Materials and Methods
This study employs a mixed-methods approach to examine the effects of AI-based adaptive programming education on socially disadvantaged students. It specifically examines whether an AI-driven adaptive learning environment can help bridge the educational gap in programming skills for underprivileged students, compared to traditional, non-adaptive teaching methods. The experiment was conducted over 13 weeks, during which the quantitative impact on student performance was measured while participants' qualitative experiences were concurrently assessed.
The participants were divided into two groups: an experimental group (EG) that received AI-driven personalized instruction and a control group (CG) taught through a standardized programming curriculum. Both groups were taught the same key programming concepts. Data were collected via a mixture of pre-test/post-test assessments, weekly exercises, and student feedback. The objective of the analysis was to examine whether adaptive programming education results in improved learning and engagement levels.
Social Condition Index (SCI) and its Application
The Social Condition Index (\(SCI\)) is used here to identify socially disadvantaged students in socio-economic terms. The \(SCI\) is a numerical indicator of social disadvantage that captures key attributes of a learner that can affect educational success. The use of this approach is informed by evidence demonstrating how socio-economic status can impact educational success, particularly in technology-based learning contexts (Labudova & Fodranova, 2024; Njeri & Taym, 2024; Volchik et al., 2018).
The SCI Framework
The \(SCI\) was calculated from responses to a 15-item Likert-scale questionnaire (Appendix 1) assessing family background, access to educational resources, and learning difficulties:
\(T\) is the total questionnaire score (the sum of the points of all answers; each response was scored from 1 to 5, with 1 indicating the most difficult situation and 5 the most favourable),
\(N\) is the number of questions (in this case, 15), and
\(SCI\) is the average score: \(SCI=\frac{T}{N}.\)
An \(SCI\) of less than 3.00 implies that the learner is socially disadvantaged. The use of a 3.00 cut point is supported by previous evidence that students within this range of values have difficulty accessing digital learning resources (Munir et al., 2023; Rayevnyeva et al., 2024).
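The calculation can be expressed directly in code. The function names below are illustrative, but the formula and the 3.00 cut-off follow the definitions given in this section.

```python
# Sketch of the SCI calculation: the mean of 15 Likert-scale responses
# (1 = most difficult situation, 5 = most favourable), with scores
# below 3.00 flagging social disadvantage.

def compute_sci(responses: list[int]) -> float:
    """SCI = T / N, where T is the total score and N is the item count."""
    if len(responses) != 15:
        raise ValueError("expected 15 questionnaire items")
    if any(r < 1 or r > 5 for r in responses):
        raise ValueError("responses must be on a 1-5 scale")
    return sum(responses) / len(responses)

def is_disadvantaged(sci: float, cutoff: float = 3.00) -> bool:
    """Apply the study's 3.00 cut point for social disadvantage."""
    return sci < cutoff
```

For example, a student answering 2 on every item receives an SCI of 2.0 and is classified as socially disadvantaged.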
Participants
A total of 122 students from four universities in Hungary participated in the study. Participation criteria included digital literacy and basic programming skills, so all participants had a minimum knowledge of programming concepts before the experiment. Figure 1 shows the gender distribution of participants.
Students aged 18 to 21 (M = 18.67, SD = 0.93) were selected according to their socio-economic background, as measured by the \(SCI\). Of the 289 completed questionnaires, 122 students (42.21%) were identified as socially disadvantaged. These participants were randomly assigned to two groups of 61 each: an EG receiving AI-based adaptive programming instruction and a CG following the traditional curriculum.
Experimental Design
This research employed a controlled design in which subjects were assigned to an experimental group (EG, N = 61) or a control group (CG, N = 61). Students in the EG received an AI-based adaptive programming course in which the curriculum and learning materials dynamically adjusted to each student. Applying machine learning algorithms, the AI system gave students real-time feedback on their coding exercises and recommended learning paths. By contrast, the teaching material and curriculum were uniform for the CG, and learning was more teacher-directed. Both groups attended two programming sessions per week for 13 consecutive weeks, covering the same basic programming concepts: flow-control structures, functions, loops, and algorithms. The only distinction was that the EG's learning environment was supported by adaptive features, facilitated through the use of ChatGPT.
Controlled Variables and Conditions
Several factors were systematically controlled and monitored in the experiment to suppress possible sources of bias that might have affected learning outcomes. Key among these was participants' baseline programming experience. To ensure group comparability, each participant took a pre-test on programming fundamentals, and only individuals with comparable scores were accepted into the experiment, suppressing differences that might have arisen from previous experience.
Another controlled variable was students' socio-economic status, evaluated using the SCI. The SCI allowed objective classification of students as socially disadvantaged or non-disadvantaged, and the SCI scores of socially disadvantaged students were matched between the EG and the CG.
Access to technology was also standardized to ensure equivalence between the groups. Every participant used the same software, the same hardware, and the same Internet access throughout the 13-week intervention. Sessions for both groups were conducted in the same environment to control for external factors, such as noise or lighting differences, that could potentially affect the outcomes, and both groups used the same computer laboratories so that the learning conditions were identical.
The instructor remained the same for both groups throughout the study. The instructional content, activities, and curriculum were also identical, except that an AI-based adaptive mode of instruction was implemented in the EG. Because teaching style and instructional content can subtly affect outcomes, keeping the instructor constant helped isolate the effect of the adaptive learning environment.
Finally, the testing process was also controlled in each group. The pre-test and post-test were administered simultaneously under the same conditions; everything from timing to instructions and duration was identical for each subject. By controlling for these variables, the study aimed to eliminate potential confounding effects on programming learning results that could be mistakenly attributed to the AI-based adaptive learning environment. It is presumed that these measures enhanced the internal validity of the study, allowing observed differences between groups to be attributed primarily to the adaptive learning environment.
Learning Materials
The learning materials were designed to accommodate the distinct instructional approaches of the EGs and CGs. Both groups were introduced to the same programming concepts, including syntax, control structures, and algorithms, but the mode of content delivery and the feedback mechanisms varied between the groups.
AI-based Adaptive Learning (EG)
The AI-based adaptive learning system applied in this study was developed to personalize the learning process by automatically adapting content, task difficulty, and feedback based on students' performance. It was built using ChatGPT and adaptive learning software that monitored students' interactions, task durations, and performance trends throughout the 13-week session.
The system followed a cycle of continuous data gathering, performance analysis, and adaptive response. Data gathering entailed monitoring task completion times, correctness, error trends, and engagement measures. Student performance was analysed using machine learning techniques, e.g., Bayesian Knowledge Tracing to predict mastery levels and clustering algorithms to identify frequent areas of difficulty. Content, task difficulty, and feedback were then adapted based on these analyses.
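The study names Bayesian Knowledge Tracing but does not publish its parameters. A minimal sketch of the standard BKT posterior update, with assumed slip, guess, and learning probabilities, is:

```python
# Illustrative Bayesian Knowledge Tracing update. The slip, guess, and
# learn probabilities below are assumptions for demonstration only,
# not the study's actual parameters.

def bkt_update(p_know: float, correct: bool,
               slip: float = 0.1, guess: float = 0.2,
               learn: float = 0.3) -> float:
    """Return the updated mastery estimate after one observed answer."""
    if correct:
        # P(known | correct) via Bayes' rule: known students answer
        # correctly unless they slip; unknown students may guess.
        num = p_know * (1 - slip)
        den = num + (1 - p_know) * guess
    else:
        # P(known | incorrect): known students only err by slipping.
        num = p_know * slip
        den = num + (1 - p_know) * (1 - guess)
    posterior = num / den
    # Account for the chance of learning on this practice opportunity.
    return posterior + (1 - posterior) * learn
```

Each exercise submission moves the mastery estimate up or down, and the system can use a threshold on this estimate (e.g., 0.95) to decide when a concept is mastered and harder content should be served.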
For instance, after detecting frequent comprehension mistakes with loops, the system responded by simplifying the material, providing interactive explanations, and offering practice tasks in a step-by-step format. Better performers were given tasks of increasing difficulty that demanded critical reasoning and analytical skill, to sustain engagement. Feedback upon task submission was instant and included explanations of mistakes along with targeted suggestions for improvement. Instant feedback was applied to reinforce programming concepts and to ensure that learning tasks remained cognitively engaging.
The system also monitored engagement by tracking inactivity, task abandonment, and task completion trends. When signs of disengagement appeared, the system sent or displayed motivational prompts or nudges to re-engage the learner. Learning pathways were also dynamically adjusted so that consistently underperforming students were offered extra practice, while advanced students were led through increasingly complex content.
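The disengagement signals just listed could be combined as in the following sketch. The thresholds, field names, and function are hypothetical illustrations, not the study's implementation.

```python
# Hypothetical disengagement check combining the three monitored signals:
# inactivity, task abandonment, and task completion trends.
from dataclasses import dataclass

@dataclass
class SessionStats:
    idle_seconds: float     # time since the last interaction
    abandoned_tasks: int    # tasks opened but never submitted
    completion_rate: float  # share of assigned tasks completed (0.0-1.0)

def needs_nudge(s: SessionStats,
                max_idle: float = 600.0,
                max_abandoned: int = 3,
                min_completion: float = 0.5) -> bool:
    """Return True when any disengagement signal crosses its threshold."""
    return (s.idle_seconds > max_idle
            or s.abandoned_tasks >= max_abandoned
            or s.completion_rate < min_completion)
```

A True result would trigger a motivational prompt or a pathway adjustment; in practice such thresholds would be tuned per course and per learner.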
The adaptive process, supported by natural language processing (NLP) and educational data mining (EDM), enabled the system to anticipate learning difficulties and adjust instruction without immediate instructor input. The result was a well-tuned, responsive, and effective learning experience that addressed students' diverse learning needs, particularly benefiting disadvantaged learners through the targeted instruction the system provided.
Traditional Learning (CG)
To provide a clear basis for comparison, the CG followed a traditional programming curriculum without adaptive elements. The instructional content was identical to that of the EG, covering programming fundamentals such as variables, loops, functions, and algorithms.
The CG used static digital and printed resources that did not adapt to students' learning routines. Teaching was delivered through lectures with pre-set exercises and diagrams. Students completed the same weekly tasks as the EG but received handwritten feedback from the instructor in follow-up lessons, typically one week after task submission. This feedback was non-adaptive and included neither the specific recommendations nor the task modifications provided to the adaptive group.
The learning environment itself remained the same for both groups, using identical laboratories, hardware, and software configurations to remove external influences. The same pre- and post-tests were administered to both groups to determine learning achievement.
Data Collection Methods
To analyse the effect of AI-based adaptive programming education, both quantitative and qualitative data were collected.
Quantitative Data
Pre-test and post-test assessments of programming knowledge were administered to all students before and after the 13-week experimental period. In addition, weekly assignments and exercises were used to monitor student progress over time.
Qualitative Data
The engagement questionnaire used throughout this study was adapted to measure socially disadvantaged students' levels of engagement in programming education. Its content and format were derived from well-established student engagement models (Fredricks et al., 2004; Henrie et al., 2015), which emphasize the multi-dimensionality of engagement across cognitive, emotional, and behavioural dimensions.
The questionnaire consisted of 12 items belonging to three engagement domains: cognitive, emotional, and behavioural. All items were scored on a 5-point Likert scale ranging from "Strongly Disagree" to "Strongly Agree." The items were tailored to the context of AI-based programming, focusing on how students use adaptive systems, how they participate in learning activities, and how they respond emotionally to learning.
The development of the questionnaire followed a multi-step validation process. Content validity was first confirmed by consulting three experts in educational psychology and programming instruction, who screened items for clarity and relevance. Construct validity was then evaluated using exploratory factor analysis (EFA), which confirmed the three-factor structure, with factor loadings ranging from 0.64 to 0.87. The 74% of variance explained by the EFA supported a good correspondence between the items and the engagement constructs of interest.
Reliability was validated using Cronbach's alpha, which showed good internal consistency for every subscale: behavioural engagement (α = 0.85), emotional engagement (α = 0.88), and cognitive engagement (α = 0.86), with overall reliability of α = 0.89. All values were well above the generally accepted minimum of 0.70 for educational measures (Taber, 2018).
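Cronbach's alpha for a subscale can be computed directly from the item responses. The following is a generic sketch of the standard formula, not the study's analysis code:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: items is a list of columns, each column holding
    all respondents' scores on one questionnaire item."""
    k = len(items)                     # number of items in the subscale
    n = len(items[0])                  # number of respondents

    def var(xs):                       # population variance, used for both
        m = sum(xs) / len(xs)          # item scores and total scores
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(var(item) for item in items)
    return k / (k - 1) * (1 - item_var_sum / var(totals))
```

Two perfectly correlated items yield α = 1, and α shrinks as items covary less.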
This questionnaire, grounded in well-established engagement models and supported by good psychometric properties, is thus a valid and reliable instrument for assessing engagement levels in AI-based adaptive programming instruction among socially disadvantaged students.
Data Storage and Management
All data collected in this study, including performance metrics, system logs, and questionnaire responses, were securely stored in a password-protected database. Only authorized research personnel had access to this data, ensuring compliance with ethical standards regarding privacy and confidentiality.
Ethical Considerations
Before data collection began, participants were informed about the purpose of the study, the types of data that would be collected, and how the information would be used. Written informed consent was obtained from all participants, and students were assured that participation was voluntary and could be withdrawn at any time without penalty. The collected data were anonymized to protect students' identities. Both quantitative and qualitative data were collected to investigate not only the measurable outcomes of AI-based adaptive programming education but also the nuanced experiences and perspectives of socially disadvantaged students. All participants consented to take part; their only request was that the names of the universities where they were studying not be mentioned, so that information is not included in the study.
Data Analysis
A combination of quantitative and qualitative methods was employed to assess the impact of AI-based adaptive programming on socially disadvantaged students. The analysis examined whether learning outcomes and engagement levels differed between the EG, which received AI-driven adaptive instruction, and the CG, which followed the traditional curriculum. Shapiro–Wilk tests confirmed that pre-test, post-test, and engagement scores were normally distributed, justifying the use of parametric tests such as t-tests and ANOVA. Within-group learning gains were evaluated using paired t-tests, while independent t-tests compared post-test scores and engagement levels between the groups. In addition, ANOVA was applied to assess overall differences between the groups, and ANCOVA was used to control for the potential confounding effect of socio-economic background by including it as a covariate.
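The core statistics used in this analysis follow standard formulas. The sketch below (paired t statistic and pooled-SD Cohen's d) is illustrative and is not the study's actual analysis script:

```python
from math import sqrt

def mean(xs):
    return sum(xs) / len(xs)

def svar(xs):
    """Sample variance with the n - 1 denominator."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def paired_t(pre, post):
    """t statistic for a within-group pre/post (paired) comparison."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / sqrt(svar(diffs) / len(diffs))

def cohens_d_independent(a, b):
    """Cohen's d with pooled standard deviation for a between-groups comparison."""
    na, nb = len(a), len(b)
    pooled_sd = sqrt(((na - 1) * svar(a) + (nb - 1) * svar(b)) / (na + nb - 2))
    return (mean(a) - mean(b)) / pooled_sd
```

The sign of d depends on the order of the two groups in the comparison, which is why a higher-scoring group can still appear with a negative d in a results table.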
Results
This section presents the findings of the research, examining how AI-based adaptive instruction affected the achievement outcomes and engagement levels of socially disadvantaged students. The findings contrast the achievement of the EG, instructed through AI-based adaptive instruction, with that of the CG, taught through non-adaptive instruction. Results are presented through streamlined statistical tests, combined tables, and consistent interpretation of effect sizes to keep the narrative readable.
Pre-Test and Post-Test Performance
Pre-test and post-test scores of each group were compared to establish learning gains. The pre-test evaluated the participants' programming background, while the post-test, administered after 13 weeks, evaluated learning outcomes.
The descriptive pre-test and post-test measures with estimated improvement are shown in Table 1.
Table 1
Pre-test and post-test results for EG and CG
Group | Pre-test: Mean | SD | Min | Max | Post-test: Mean | SD | Min | Max
EG*   | 5.10 | 1.45 | 2.50 | 8.00 | 7.10 | 1.40 | 4.00 | 9.50
CG*   | 5.00 | 1.50 | 2.00 | 8.50 | 6.00 | 1.55 | 3.00 | 9.00
*N = 61
The results show that both groups began with approximately the same level of programming skill, suggesting that random assignment successfully balanced baseline skill levels. The post-test results show that while both groups demonstrated learning gains, the EG achieved significantly higher improvement.
Figure 2a displays the mean scores of the CG and EG on the pre-test and post-test. For the CG, the mean score rose by 1 point, from 5.0 on the pre-test to 6.0 on the post-test. The CG's standard deviations were 1.5 on the pre-test and 1.55 on the post-test, indicating moderate variance that remained relatively stable across both assessments. The EG showed a larger increase of 2 points, from 5.1 on the pre-test to 7.1 on the post-test. The EG's pre-test standard deviation of 1.45 indicated relatively low variance among scores, and it decreased slightly to 1.40 on the post-test, so variance remained essentially constant even as scores rose substantially. The markers on each line indicate the midpoint of each trend and thus the degree of change: +1.00 for the CG versus the larger +2.00 for the EG. The error bars further show that within-group variance in the CG stayed constant, while that of the EG shrank only marginally as scores increased. Overall, the chart reveals that both groups improved over the 13 weeks of instruction, but the EG, which received AI-based adaptive programming instruction, showed greater change than the CG.
Fig. 2
a The mean scores of the CG and EG in the pre-test and post-test phases. b A visualization of score distributions for both the CG and EG in the pre-test and post-test
Figure 2b shows the distribution of pre-test and post-test scores for the CG and EG. The CG's pre-test scores range from 2.0 to 8.5, with a mean of 5.0; the narrow interquartile range implies moderate variability in pre-test performance. Its post-test scores range from 3.0 to 9.0, with the mean rising to about 6.0, and the interquartile range remains similar to that of the pre-test, indicating a slightly wider but comparably variable score range. The EG's pre-test scores range from 2.5 to 8.0, with a mean of 5.1; the narrow interquartile range implies relatively low variability, meaning most of its students had very similar programming background knowledge. After the intervention, the EG's post-test scores range from 4.0 to 9.5, with the mean rising to 7.1; the broader interquartile range indicates greater variability, suggesting that many EG students improved substantially after receiving adaptive instruction. Overall, the boxplot indicates that both groups achieved learning gains from pre-test to post-test, but the EG improved more: its median is higher and its score range broader, showing that AI-based adaptive instruction produced larger performance increases than the non-adaptive instruction of the CG.
Table 2 assesses whether the data followed a normal distribution, ensuring the appropriateness of parametric tests.
Table 2
Shapiro–Wilk test of normality for pre-test and post-test scores
Group | Pre-test W-value | p-value | Post-test W-value | p-value
EG*   | 0.980 | 0.579** | 0.971 | 0.119**
CG*   | 0.992 | 0.788** | 0.982 | 0.208**
*N = 61
**For all groups, the p-values from the Shapiro–Wilk test are greater than 0.05, indicating that the data do not significantly deviate from normality
Table 3 compares pre-test and post-test scores for each group. Along with the t-values and p-values, Cohen’s d was calculated to determine the effect size of the intervention in both groups, providing insights into the magnitude of the observed improvements.
Table 3
Results of paired t-tests within groups with effect size (Cohen's d)
Group | Pre-test Mean | SD | Post-test Mean | SD | t | p-value | d
EG*   | 5.10 | 1.45 | 7.10 | 1.40 | 6.21 | < 0.001** | 1.40***
CG*   | 5.00 | 1.50 | 6.00 | 1.55 | 3.97 | < 0.001** | 0.74****
*N = 61
**2-tailed
***A large effect
****A medium effect
Students in the EG improved substantially (p < 0.001) with a large effect size (d = 1.40), while the CG also improved significantly (p < 0.001) but with only a medium effect size (d = 0.74). This suggests that the AI-based adaptive learning environment had a statistically significant influence on knowledge acquisition.
Table 4 compares EG and CG post-test scores as a measure of relative instructional effectiveness.
Table 4
Results of independent t-tests between groups with effect size (Cohen's d) in case of post-test scores
Comparison | EG* Mean | CG* Mean | t | p-value | d
Post-test  | 7.10 | 6.00 | 4.57 | < 0.001** | −0.88***
*N = 61
**2-tailed
***A large effect
Post-test scores were significantly higher in the EG than in the CG (p < 0.001), with a large effect size (d = −0.88). This finding implies that adaptive learning had a measurable advantage over traditional teaching practice.
Engagement Levels
Engagement levels were assessed using a self-report questionnaire. The normality test results are displayed in Table 5.
Table 5
Shapiro–Wilk test of normality
Engagement dimension | Group | W-value | p-value
Behavioural | EG* | 0.971 | 0.167**
Behavioural | CG* | 0.982 | 0.214**
Emotional   | EG* | 0.970 | 0.125**
Emotional   | CG* | 0.983 | 0.221**
Cognitive   | EG* | 0.969 | 0.110**
Cognitive   | CG* | 0.980 | 0.190**
*N = 61
**For all groups, the p-values from the Shapiro–Wilk test are greater than 0.05, indicating that the data do not significantly deviate from normality
Data for all engagement domains were normally distributed, permitting the use of parametric statistical tests. Table 6 displays the engagement levels of the two groups. For behavioural engagement, the EG scored a mean of 16.2 (SD = 1.4) versus 13.0 (SD = 2.0) for the CG; the t-test (t = 5.34, p < 0.001) indicated a statistically significant difference, and the effect size of 1.10 represents a large effect. For emotional engagement, the EG's mean of 15.8 (SD = 1.6) exceeded the CG's 12.5 (SD = 1.9); the t-test (t = 4.87, p < 0.001) again indicated a significant difference, with a large effect size of 1.00. For cognitive engagement, the EG's mean of 17.0 (SD = 1.2) exceeded the CG's 13.9 (SD = 2.1); the t-test (t = 6.10, p < 0.001) indicated a significant difference, with the largest and most meaningful effect size of 1.40.
Table 6
Results of independent t-tests engagement dimensions with effect size (Cohen's d)
Engagement dimension | Group | Mean | SD | t | p-value | d
Behavioural | EG* | 16.2 | 1.4 | 5.34 | < 0.001** | 1.10***
Behavioural | CG* | 13.0 | 2.0 |      |           |
Emotional   | EG* | 15.8 | 1.6 | 4.87 | < 0.001** | 1.00***
Emotional   | CG* | 12.5 | 1.9 |      |           |
Cognitive   | EG* | 17.0 | 1.2 | 6.10 | < 0.001** | 1.40***
Cognitive   | CG* | 13.9 | 2.1 |      |           |
*N = 61
**2-tailed
***A large effect
Figure 3 plots the mean engagement levels of the CG and EG on the cognitive, emotional, and behavioural scales. The EG shows higher engagement on every scale, indicating significant positive effects of adaptive learning on students' engagement levels.
Overall learning achievement was compared using ANOVA and ANCOVA to adjust for potential socio-economic effects. Table 7 demonstrates that the EG outscored the CG (p < 0.001), indicating a statistically significant effect of the AI-based adaptive learning environment on programming skill acquisition.
Table 7
ANOVA results for learning performance
Source of variation | SS | df | MS | F-value | p-value
Between groups | 65.10  | 1   | 65.10 | 21.75 | < 0.001
Within groups  | 360.50 | 120 | 3.00  |       |
Total          | 425.60 | 121 |       |       |
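The one-way ANOVA underlying Table 7 partitions the total sum of squares into between-group and within-group components. A minimal sketch of the F statistic, illustrative rather than the study's analysis code:

```python
def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA across two or more groups."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)

    # Between-group sum of squares: spread of group means around the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of scores around their own group mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)

    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)
```

With only two groups, as here, the ANOVA F equals the square of the independent-samples t statistic.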
Table 8 shows significant differences between the EG and CG in all engagement dimensions (behavioural, emotional, and cognitive; p < 0.001). That is, subjects in the EG showed much higher levels of learning engagement, supporting the goal of adaptive learning environments to keep students interested and motivated in their studies.
Table 8
ANOVA results for engagement levels
Engagement dimension | F-value | p-value
Behavioural | 15.32 | < 0.001
Emotional   | 12.74 | < 0.001
Cognitive   | 20.45 | < 0.001
All engagement metrics were substantially higher in the EG, demonstrating how well adaptive, targeted content can sustain students' engagement.
To assess the possible influence of socio-economic conditions, ANCOVA with socio-economic background as a covariate was applied.
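In essence, this ANCOVA adjustment amounts to regressing post-test scores on a group indicator plus the socio-economic covariate and examining the group coefficient. The sketch below is a generic ordinary-least-squares illustration with synthetic data, not the study's analysis code:

```python
def ols(y, X):
    """Solve the ordinary-least-squares normal equations (X'X) b = X'y
    by Gauss-Jordan elimination (no external libraries)."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):
        pivot = A[i][i]
        for j in range(k):
            if j != i:
                f = A[j][i] / pivot
                for c in range(k):
                    A[j][c] -= f * A[i][c]
                b[j] -= f * b[i]
    return [b[i] / A[i][i] for i in range(k)]

def ancova_adjusted_effect(post, group, ses):
    """Group effect on post-test scores after adjusting for a
    socio-economic covariate: the coefficient on the group dummy."""
    X = [[1.0, g, s] for g, s in zip(group, ses)]
    intercept, group_effect, ses_slope = ols(post, X)
    return group_effect
```

A non-significant group-by-covariate interaction, as reported in Table 9, is what licenses reading this single adjusted group effect.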
Table 9 confirms a strong positive effect of AI-based adaptive learning on programming performance (p < 0.001), independent of socio-economic background.
Table 9
ANCOVA results for learning performance
Source of variation | SS | df | MS | F-value | p-value
Group (EG vs CG)                     | 60.50  | 1   | 60.50 | 19.80 | < 0.001
Socio-economic background            | 20.10  | 1   | 20.10 | 6.55  | 0.012
Interaction (group * socio-economic) | 1.20   | 1   | 1.20  | 0.40  | 0.526
Error                                | 350.30 | 118 | 2.97  |       |
Total                                | 432.10 | 121 |       |       |
Table 10 verifies that engagement levels were substantially higher in the EG, although socio-economic background also contributed, most significantly to cognitive and behavioural engagement.
Table 10
ANCOVA results for engagement levels
Engagement dimension | Group F-value | Group p-value | Socio-economic F-value | Socio-economic p-value
Behavioural | 13.21 | < 0.001 | 5.10 | 0.025
Emotional   | 10.43 | < 0.001 | 4.32 | 0.039
Cognitive   | 18.76 | < 0.001 | 7.08 | 0.009
The results indicate that AI-based adaptive instruction produces greater learning performance and engagement than non-adaptive instruction. The system's adaptive features, including custom feedback, real-time performance monitoring, and task revision, drove these outcomes. The positive effects were evident among students across socio-economic levels, pointing to the promise of adaptive learning environments for enhancing educational equity.
Discussion
This study demonstrates the considerable promise of AI-based adaptive programming instruction for enhancing the learning outcomes and motivation of socially disadvantaged students. The findings indicate that adaptive learning pathways, combined with real-time adaptive feedback, can effectively address educational inequities, above all among students constrained by limited resources and support. Below, the findings are contrasted with earlier literature, and practical implications, answers to the research questions, and directions for future inquiry are examined.
Contextualizing the Findings Within Existing Literature
The outcomes of this research accord with other studies on the benefits of adaptive learning systems. The capacity of adaptive learning environments to individualize instruction by tailoring content to individual requirements is directly visible in the EG's performance (Graf, 2023; Peng et al., 2019). The EG learners' performance improvement can be attributed to the system's ability to identify learners' shortcomings and offer targeted practice, mirroring the precepts of proven adaptive learning frameworks (Cavanagh et al., 2020).
In addition, adaptive systems can improve cognitive engagement by offering learners challenges of appropriate difficulty with immediate feedback (Dumitru, 2024; Sottilare, 2024). The heightened cognitive engagement demonstrated by the EG serves as a clear indicator of the efficacy of the implemented interventions. Moreover, adaptive learning's positive effects on educational equity are supported by Chari (2024), who argued that technological interventions can redress socio-economically disadvantaged conditions by ensuring equitable availability of educational resources.
This research contributes to the emerging evidence emphasizing the value of artificial intelligence systems in enhancing programming education, with particular benefits for underprivileged learners. The results align with the research of Becker et al. (2023) and Oyedokun (2025), reinforcing the significance of adaptive learning systems in delivering personalized and equitable learning experiences across varying schooling environments.
Implications for Educational Practice
The findings have important implications for practice, most notably concerning the use of AI-based adaptive systems to enhance socially disadvantaged students' learning outcomes. The gains in learning outcomes and engagement demonstrate that adaptive learning can serve as an effective method for closing educational deficits. The adaptive system's ability to personalize instruction to each learner's progress demonstrates its promise for fulfilling differentiated learning needs, as also espoused by Bernacki et al. (2021) concerning personalized learning technology.
Moreover, the evidence confirms the need for adequate teacher preparation to implement such systems successfully. Educators must understand the adaptive algorithms and know how to use performance measures to make targeted interventions. This necessity is confirmed by Kim (2024), whose primary emphasis was on ensuring that teachers are adequately prepared to integrate AI into their pedagogical practices.
The success of this AI-based system also underscores how critical digital infrastructure is to educational equity. Low-income areas often encounter significant challenges, including limited access to technological resources, which constrains the effective application and benefits of adaptive learning systems. The results support Cruz (2021), who observed that disparities in access to digital educational resources disproportionately affect minority and low-income students.
Addressing the Research Questions
The study examined two primary research questions concerning the effects of AI-driven adaptive programming instruction on student outcomes and engagement.
RQ1: Does AI-based adaptive programming education influence the learning outcomes for socially disadvantaged students compared to traditional, non-adaptive educational methods?
The results indicate that AI-based adaptive programming instruction increases socially disadvantaged students' achievement considerably compared to non-adaptive instruction. The post-test measures showed statistically significant programming skill enhancement in the EG, with a large effect size. This enhancement likely results from the system's capability to adapt content based on real-time performance measures, offering students targeted assistance to address specific weaknesses.
The AI system's ability to provide immediate feedback played a critical role in this improvement. Feedback delivered during the learning process helped students correct misconceptions and reinforce their understanding of programming concepts. This finding aligns with Niyibizi and Mutarutinya (2024), who highlighted the importance of immediate, actionable feedback in enhancing learning outcomes. The results further confirm the hypothesis that adaptive learning environments, by responding to individual learning patterns, can significantly improve academic performance in programming education for disadvantaged students.
RQ2: To what extent does the delivery of adaptive programming education using AI technology affect student engagement, compared to traditional methods, in terms of behavioural, emotional, and cognitive dimensions?
The findings reflect that AI-based adaptive programming had a strong positive effect on students' engagement at the behavioural, emotional, and cognitive levels. The EG surpassed the CG on each measure of engagement, with considerable effect sizes in each dimension.
The increase in behavioural engagement demonstrates how well the system facilitated active participation and task completion. The adaptive, custom-designed feedback most likely encouraged students to engage meaningfully with content, as also evidenced by Ellikkal and Rajamohan (2024). Emotional engagement also showed significant enhancement, suggesting that the system's supportive, interactive features reduced frustration and sustained motivation. The most pronounced enhancement occurred in cognitive engagement, where the EG showed higher levels of critical reasoning and problem-solving skill. These findings echo Halkiopoulos and Gkintoni (2024), who demonstrated that adaptive task difficulty and custom-designed feedback can activate higher-order cognitive processes.
Broader Implications for Educational Equity
The research findings provide compelling support for the adoption of AI-driven adaptive learning to address educational equity. Socially disadvantaged learners commonly face systemic barriers to academic achievement, including limited access to resources, lower levels of parental involvement, and less experience with computer and communication technologies. The research demonstrates that adaptive systems can partly overcome such gaps through personalized teaching matched to individual learners' needs.
The broader applicability of these findings extends beyond programming instruction. With the continuous advancement of adaptive learning technologies, their applicability spans a wide range of subjects and instructional levels. The success of the AI-based system here suggests that its approach can also be extended to other STEM areas, where abstract, complex ideas are most likely to perplex disadvantaged students.
Adaptive learning also has considerable power to bridge the digital divide. By providing every learner, regardless of background, with high-quality, adaptive learning experiences, educational institutions can foster more equitable, inclusive, and fair learning environments. This is corroborated by Oye et al. (2024), who highlight the pivotal role of adaptive learning technology in supporting the diverse needs of learners.
The findings of this study provide strong evidence that AI-based adaptive programming instruction can substantially improve the learning outcomes and engagement of socially disadvantaged students. The system's adaptive, tailored features allowed students to receive targeted interventions, to have learning content adjusted to their skill levels, and to engage actively with educational content.
The findings establish how adaptive learning environments can advance educational equity, especially among students who face systemic barriers to academic success. By offering students adaptive learning activities and targeted feedback, such environments help bridge the digital divide and equip disadvantaged students with valuable digital competencies.
Limitations and Future Research
This research provides significant insights into how AI-supported adaptive learning can enhance programming learning outcomes for underprivileged learners. Nevertheless, several limitations have emerged that suggest directions for future research in this field.
One limitation of this research lies in the relatively short duration of the experimental period of 13 weeks. Although this period allowed for the observation of improvements in programming knowledge and student engagement, it provides limited insight into the long-term sustainability of these gains. Extended studies spanning multiple semesters or academic years are necessary to determine whether the benefits of adaptive learning persist over time. Such long-term research would provide deeper insight into the effectiveness of adaptive learning for disadvantaged students and clarify how adaptive learning methods influence students' attitudes and learning performance over time.
An additional limitation pertains to the potential for selection bias. Students were recruited based on their responses to a survey and their SCI. While using the SCI permitted systematic classification of the students, the reliance on self-reported data may also have introduced inherent biases. Furthermore, although the pre-test aimed to equate baseline programming knowledge between the two groups, unreported prior experience with programming outside of the classroom, as well as varying levels of intrinsic motivation, may have affected the data. A future study could circumvent this limitation with more objective means of participant selection, such as the use of standardized tests or academic transcripts for confirmation of pre-existing knowledge and classification of participants.
The generalizability of the research may also be limited. This study was conducted in the context of computer programming in higher education. The results suggest that, in this context, AI-supported adaptive learning can improve student performance and engagement; however, it remains unclear whether these results generalize to other academic disciplines or educational levels. Further research exploring the application of AI-based adaptive learning systems in other subjects, such as mathematics, foreign languages, and natural sciences, could provide more generalizable insights into their effectiveness. Moreover, examining their use in primary and secondary education could help determine whether the observed benefits extend to different educational levels and learning environments.
Throughout the study, a variety of key factors were carefully monitored to decrease the likelihood of confounding. The pre-tests quantified participants' prior knowledge of programming, and according to the results, students in the EG and CG had similar pre-test knowledge when beginning the course. The socio-economic status of the students was also quantified via the SCI, which showed similar distributions between the EG and the CG. Participants' access to technical means was equalized by providing identical computer hardware, software, and Internet access during the sessions. Furthermore, the sessions were conducted in the same physical environment to decrease extrinsic factors such as noise levels and lighting conditions. The instructors did not vary either, as the same instructor delivered both the adaptive and traditional teaching methods, following identical lesson plans and instructional guides. These precautions were taken to minimize the influence of extrinsic factors on the adaptive learning intervention, thereby isolating its effect as accurately as possible. Despite these precautions, unmeasured factors, such as individual differences in study approaches and cognitive abilities, may still have influenced the outcomes.
The adaptive mechanisms also leave several questions unanswered. Using machine learning to adapt the experience to student performance in real time raises important questions about which specific personalization mechanisms contributed to increased engagement and performance: whether the observed gains resulted from personalized feedback, dynamically adjusted task difficulty, or motivational cues following periods of inactivity. To address these questions, future research should explore the relative contributions of these factors to learning outcomes, potentially by systematically disabling individual components to assess their individual effects on student performance.
Another concern pertains to the potential for algorithmic bias within the AI system. The historical performance data on which the personalized learning algorithms were built may carry inherent biases related to socio-economic status and other demographic factors. If not addressed, these biases could disadvantage some student populations, undermining the system's purpose of promoting greater educational equity. Future research must investigate ways of identifying and limiting such biases, for instance via fairness-aware algorithms or regular audits of system performance across demographic populations.
Similarly, the measurement of student engagement relied on self-reported data, which is subject to bias. Although the engagement survey used in this study was grounded in established theory and piloted for clarity, responses may still be affected by recall errors and social desirability. To improve the reliability of engagement measurement, future research could use objective indicators such as clickstream data, time-on-task, and physiological signals such as heart rate variability and eye-tracking. These complementary measures would provide a more comprehensive picture of student behaviour within adaptive learning environments, as well as deeper insight into the relationship between engagement and academic performance.
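Of the objective indicators listed above, time-on-task is commonly estimated from clickstream timestamps by discarding long idle gaps. A minimal sketch under that assumption (the cutoff value and the sample session are hypothetical):

```python
def time_on_task(events, idle_cutoff=300):
    """Estimate time-on-task in seconds from timestamped click
    events: gaps longer than idle_cutoff seconds are treated as
    idle time and excluded, a common clickstream heuristic."""
    events = sorted(events)
    total = 0
    for prev, cur in zip(events, events[1:]):
        gap = cur - prev
        if gap <= idle_cutoff:
            total += gap
    return total

# Hypothetical session: click timestamps in seconds since start.
clicks = [0, 40, 95, 130, 1000, 1030, 1090]
print(time_on_task(clicks))  # the 870 s gap is dropped as idle
```

The cutoff choice materially affects the estimate, so it should be reported and, ideally, validated against observed behaviour.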
The findings of this study also point to the need for further research into the pedagogical value of widespread adoption of AI-driven adaptive learning systems. The motivational and performance gains observed here suggest that data-driven, personalized teaching supports can be of significant value for socially disadvantaged learners. Their adoption in everyday teaching, however, depends not only on technical complexity but also on teachers' ability to incorporate them into their educational practice. Research on teacher training, instructional design methods, and best practices for integrating adaptive learning systems into varied teaching environments could bridge the gap between technical advancement and classroom practice. This study provides empirical confirmation that AI-based adaptive learning improves programming skills and student engagement for socially disadvantaged learners.
However, the relatively short study period, the possibility of selection bias, and the reliance on self-reported engagement data limit the generalizability of these findings. Future research should extend the study period, explore additional educational settings, and incorporate objective measures of student engagement to provide a more comprehensive assessment. Such extensions would deepen understanding of how adaptive learning systems function across diverse learner populations, thereby enhancing the generalizability and applicability of the findings.
Conclusion
The results confirm that AI-driven adaptive programming education significantly improves learning outcomes and increases engagement among students from socially disadvantaged backgrounds. Many of the observed benefits, including higher student engagement and better academic results, can be attributed to the personalized feedback and real-time individualized learning pathways provided by the AI-driven adaptive system. The results indicate that AI-supported learning environments can reduce educational inequities while supporting subject-specific learning, such as programming, to a high degree of proficiency.
AI-powered adaptive learning platforms offer a promising solution for educational systems that struggle to ensure equitable access to quality education for all students. By leveraging this technology, educational institutions can narrow achievement gaps, improving both equity and efficiency in learning. Such systems enable personalized instruction at scale, particularly benefiting students who require additional support to achieve academic success.
Declarations
Ethical Approval
This study was conducted in compliance with all applicable ethical standards.
Consent to Participate
All participants provided informed consent to participate in the study.
Consent to Publish
No identifiable information was gathered, so publication consent is not applicable.
Informed Consent
Informed consent was obtained from all participants involved in the study before they completed the questionnaire.
Human and Animal Rights
All procedures involving human participants adhered to ethical guidelines.
Competing interests
The authors declare there are no competing or financial interests related to this work.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix 1. Questionnaire to Assess Social Disadvantage
This questionnaire assesses the socio-economic background of participants to calculate the Social Condition Index (SCI). Each item is rated on a 5-point Likert scale:
1. Behavioural Engagement
I actively participated in the programming exercises and tasks.
I dedicated sufficient time to completing tasks and studying materials.
I applied the feedback from the AI system to improve my assignments.
I consistently completed assigned tasks on time.
2. Emotional Engagement
I felt motivated while working on the programming tasks.
I was interested in learning more about programming concepts.
I rarely felt frustrated while using the AI system.
I enjoyed interacting with the AI-based adaptive learning system.
3. Cognitive Engagement
I made an effort to understand programming concepts beyond the tasks.
I applied critical thinking to solve programming problems.
I reflected on what I learned after completing assignments.
I found solving programming problems enjoyable and stimulating.
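Likert-scale questionnaires of this kind are typically scored by averaging the responses within each subscale. A minimal sketch (the response values below are hypothetical, not participant data):

```python
from statistics import mean

def subscale_scores(responses):
    """Average 1-5 Likert responses within each engagement subscale."""
    return {scale: round(mean(items), 2) for scale, items in responses.items()}

# Hypothetical responses, four items per subscale as in the questionnaire.
responses = {
    "behavioural": [4, 5, 4, 4],
    "emotional":   [3, 4, 3, 4],
    "cognitive":   [5, 4, 4, 5],
}
print(subscale_scores(responses))
```

Reverse-keyed items, if any, would need to be recoded (6 minus the response on a 5-point scale) before averaging.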