Abstract
This study used network meta-analysis to investigate the impact of interactive learning environment (ILE) tools (augmented reality (AR), virtual reality (VR), mixed reality (MIX), and interactive digital games (GAME)) in science education on learning outcomes. A total of 53 primary studies were retrieved from the literature according to the inclusion/exclusion criteria. Among the direct comparisons, MIX demonstrated the largest effect size relative to conventional teaching (CON) for both cognitive and affective outcomes. AR exhibited a larger effect size than VR for affective outcomes but did not differ significantly from VR or GAME for cognitive outcomes. GAME outperformed CON for cognitive outcomes but did not differ significantly from AR or VR for either outcome. VR’s effect size on cognitive outcomes was not significantly different from CON but was significantly higher for affective outcomes. Indirect comparisons revealed no significant differences between MIX and the other ILE formats for either outcome. Network analysis ranked AR as the most effective format for both cognitive and affective outcomes. These findings highlight the potential of ILEs, particularly AR, for enhancing learning outcomes. Limitations of the study include the lack of direct comparisons for MIX, high heterogeneity, and publication bias in some binary comparisons. More primary studies are needed to address these limitations and increase the generalizability of the findings.
Introduction
With the advancement of technology, conventional teaching methods have begun to lose their effectiveness for students, gradually giving way to interactive learning environments that incorporate technology. An interactive learning environment (ILE) refers to a setting created using various specialized software or hardware to support learning and teaching in educational institutions (Psotka, 2012). Unlike conventional teaching methods, interactive learning environments require greater student participation and appeal to multiple senses. The increasing presence of digital technologies in our lives has also led to the widespread use of these tools in teaching processes (Kordaki, 2010; Pivec, 2007; Prensky, 2012). In learning environments that integrate technology, students actively engage with stimuli from technological applications and become involved in the learning process. In the increasingly common interactive learning applications (Baker et al., 2008), the teacher is responsible for planning and delivering technological content, while students are responsible for generating and sharing knowledge (Sessoms, 2008). These new learning environments, which have emerged as an alternative to conventional teaching methods, also require educators to explore and apply current digital educational tools and applications in their teaching (Al-Mashhadani & Al-Rawe, 2018). Examples of applications used in interactive learning environments include interactive digital games (GAME) (Efe & Umdu-Topsakal, 2022), as well as augmented reality (AR), virtual reality (VR), and mixed reality (MIX) applications.
Technological advancements in science education have gained increasing prominence, aiming to enrich learning experiences and facilitate students’ deeper understanding of concepts. Computers, mobile phones, interactive whiteboards, videos, multimedia applications, interactive digital games and learning platforms, simulations, VR, the Internet, and Web 2.0 applications are examples of technologies effectively utilized by teachers and students in educational settings (Dror, 2008). In recent years, the potential of advanced technologies like AR, VR, and MIX in education has emerged as an exciting area for transforming students’ learning processes (Sala, 2021). Over time, due to advancements in these technologies and the various definitions proposed for the concept of mixed reality, the need to update the virtual continuum concept proposed by Milgram & Kishino (1994) has emerged. Milgram & Kishino (1994) defined mixed reality as “an environment in which real-world and virtual-world objects are presented together on a single display.” However, Skarbez et al. (2021) argued that this definition needs to be updated and proposed a new definition: “an environment in which real-world and virtual-world objects and stimuli are presented together within a single percept.” In other words, for an environment to be considered mixed reality, it is stated that the user must perceive real and virtual content simultaneously, including through different senses. In this context, the new taxonomy proposed by the authors for Milgram & Kishino’s (1994) virtual continuum is presented in Fig. 1. Accordingly, it is suggested that interactive digital games can be situated at different levels of this taxonomy, depending on both the extent to which they use real-world knowledge and the degree to which they immerse the user in the virtual environment.
Fig. 1
Three dimensional virtual continuum taxonomy (Skarbez et al., 2021)
In the context of science education, it is crucial to gain a comprehensive understanding of the impact of ILE on learning outcomes. Focusing solely on one of the ILE technologies limits the opportunity to develop a comprehensive pedagogical framework that encompasses the consumption, creation, and transformation of information across different realities and dimensions (Maas & Hughes, 2020). This study aims to examine the role of conventional teaching methods and ILE like VR, AR, MIX, and GAME in science education and evaluate their effects on learning outcomes through network meta-analysis. This comprehensive analysis aims to provide insights into the relative ranking of the interventions implemented in the interactive learning environments, helping researchers, educators, and policymakers better understand their effectiveness in science instruction.
Augmented Reality
AR is a technology that overlays virtual objects onto the real world, merging the two environments seamlessly (Azuma, 1997). It allows for the simultaneous integration of the virtual and real worlds in two- or three-dimensional forms within the actual environment (Costa et al., 2019; Gybas et al., 2019; Ibáñez & Delgado-Kloos, 2018). By adding virtual features to real-world objects (Kerawalla et al., 2006), AR enhances their dynamism (Cheng & Tsai, 2020). This ability to perceive more information compared to a normal environment necessitates certain fundamental requirements for AR (Azuma, 1997). These include the following: (i) incorporating a combination of real and virtual environments, (ii) enabling real-time interaction, and (iii) involving three-dimensional objects. Pioneered by cinematographer Morton Heilig in 1957–1962 with the Sensorama console, AR has found applications in diverse sectors, including tourism, military, healthcare, advertising, and education, driven by technological advancements (Javornik, 2016; Johnson et al., 2011; Martin et al., 2011; Yen et al., 2013). The ability to utilize AR without specialized equipment and instead employ various mobile devices has facilitated its widespread adoption in educational settings (Dalim et al., 2017; Elfitra Mansyur & Taufik, 2021; Sirakaya & Alsancak-Sirakaya, 2018; Wu et al., 2013).
AR in science education can help make abstract concepts more concrete and provide students with a deeper understanding of the subjects. AR applications that offer visually rich and interactive content, in particular, enable students to engage more closely with scientific topics and enhance their motivation (Akcayir & Akcayir, 2017; Dunleavy & Dede, 2014). A review of studies examining the use of AR in science education reveals its positive impact on both cognitive (e.g., academic achievement, problem-solving, and decision-making skills) (Arici & Yilmaz, 2023; Chiang et al., 2014; Yildirim & Seckin-Kapucu, 2021) and affective (e.g., interest, attitude, motivation) (Akcayir et al., 2016; Chen, 2020; Chiang et al., 2014; Maulana, 2020; Sirakaya & Alsancak-Sirakaya, 2018; Sumadio & Rambli, 2010; Yildirim, 2020; Wojciechowski & Cellary, 2013) learning outcomes.
Virtual Reality
VR refers to an adapted or simulated environment that users can explore and interact with (Makransky & Lilleholt, 2018). The key characteristics that distinguish VR applications from other technologies are immersion, interaction, and imagination (Burdea & Coiffet, 2003). The concept, which emerged in the second half of the nineteenth century, has gradually become widely used in the fields of education, teaching, and entertainment (Chen & Hsu, 2020). Today, VR is utilized in various areas, including gaming, aviation, military training, simulation of surgical procedures, psychological treatment, learning, and social skills (Baceviciute et al., 2021; Chen & Hsu, 2020; Cipresso et al., 2018). VR can be broadly classified into two main types: immersive (IVR) and desktop (DVR) (Matovu et al., 2023); high-immersion and low-immersion (Lee & Wong, 2014); or immersive and non-immersive VR (Liu et al., 2020; Makransky & Lilleholt, 2018). IVR isolates users from the real environment, providing them with a complete sense of being in a virtual world, and can be utilized with head-mounted displays (HMDs). In contrast, DVR offers limited interaction and immersion, as it does not isolate users from the real environment. In a DVR setting, users view the content on a desktop screen without being abstracted from the physical environment and can interact using tools such as keyboards and mice (Lee & Wong, 2014). The interactive nature of VR environments enriches the user experience, while the immersive feature is defined as the sensation of being in the virtual environment while interacting with it (Mikropoulos & Natsis, 2011). VR has attracted the attention of many educators because it allows for the simulation of the real world for learning purposes (Parong & Mayer, 2018). Based on its widespread use, it is predicted that VR will soon be integrated into conventional classrooms (Calvert & Abadia, 2020). 
Moreover, VR is seen as offering significant opportunities in various disciplines in the future, making it worth investigating (Shin, 2017).
VR in science education provides students with the opportunity to explore scientific phenomena and environments that are difficult to experience directly. For example, students can observe atomic-level interactions or travel into the depths of space using VR. This full immersion offered by VR can enhance learning motivation and interest in scientific topics (Merchant et al., 2014). An examination of studies investigating the use of VR in science education reveals its positive impact on both cognitive (Ferdinand et al., 2023; Liu et al., 2020) and affective (Al-Amri et al., 2020; August et al., 2016; Chen, 2020; Cheng & Tsai, 2020; Wu et al., 2021) learning outcomes.
Mixed Reality
MIX, a concept initially introduced by Milgram & Kishino (1994), describes the continuum between fully virtual and fully real environments. MIX refers to a real environment that allows for real-time interaction with virtual experiences (Cheok et al., 2009; Holz et al., 2011; Milgram & Kishino, 1994). It encompasses a broad definition that includes a range of applications, from digital layers in camera views to the use of physical objects for interaction with digital screens and the development of virtual reality interactions with haptic feedback (Lindgren & Johnson-Glenberg, 2013). MIX provides students with an augmented learning environment that enables them to manipulate non-real objects to gain understanding and knowledge (Weng et al., 2019). These environments not only offer an immersive technology experience that places the student within the system and makes them a part of the simulation but also feature an interface that can respond to students’ natural movements and physical interactions (Lindgren & Johnson-Glenberg, 2013).
MIX in science education offers the opportunity to make abstract concepts more concrete while benefiting from the immersive effect of fully virtual experiences. This integrated structure provides students with a richer and more interactive learning environment and appeals to different learning styles (Johnson-Glenberg, 2018). Research on the use of MIX in science education suggests that MIX leads to improved learning outcomes compared to conventional teaching (Johnson-Glenberg & Megowan-Romanowicz, 2017; Johnson-Glenberg et al., 2014). While MIX is generally more engaging than conventional teaching, the effects of MIX on learning outcomes remain unclear due to the limited research conducted in this area (Weng et al., 2019).
Interactive Digital Games
This study focuses on games that can be played in a digital environment for educational purposes, encompassing terms such as “digital games,” “online games,” “virtual games,” and “serious games.” In this context, the game definition adopted for this study is that of interactive digital games. Unlike stand-alone AR, VR, and MIX applications, interactive digital games supported by these technologies aim to provide students with active and enjoyable participation in the learning process through gamification elements. With features that allow students to direct the learning process themselves and that develop problem-solving and discovery skills, these games offer a more interactive and motivating learning experience than other technologies. Digital games are defined as “systems that players exert effort in to achieve variable outcomes or scores, adhering to specific rules” (Clark et al., 2016). Games supported by AR, VR, and MIX increase student participation, making the learning process more effective and motivating (Annetta et al., 2009). Interactive digital games create an engaging atmosphere by employing animation, sound effects, and advanced graphics that enable players to disregard external distractions. Players can also immerse themselves in the game, feeling like the protagonist and investing their full attention, thoughts, and even emotional connections in it (Hsu & Cheng, 2014). In the educational context, digital games become educational tools that aim for learning alongside entertainment (Haruna et al., 2019). Until the late 1990s, the general perception of games in the field of education was negative, leading to a lack of significant attention (Gros, 2007). However, the publication of the book “Digital Game-Based Learning” by Prensky (2001) introduced a new perspective on the use of digital games in education.
A review of studies examining the use of GAME in science education reveals their positive impact on both cognitive (Hussein et al., 2019; Koops et al., 2016; Kuznetcova et al., 2023) and affective (Anderson & Barnett, 2013; Hiangsa et al., 2015; Liu et al., 2023; Mystakidis et al., 2022) learning outcomes.
Previous Meta-Analyses on ILE
A summary of meta-analysis studies on the effects of ILEs in science education is presented in Table 1.
IVR is more effective than HMDs with a small effect size (ES = 0.24). HMDs can improve both knowledge and skill development and maintain the learning effect over time
GAME has a medium, significant positive effect on students’ academic achievement
*k, the total number of effect sizes obtained from the studies; ES, effect size
To the best of our knowledge, no study has comprehensively evaluated the impact of ILEs on learning outcomes in science education using network meta-analysis (NMA). While a recent study employed meta-analysis to examine the effects of AR, VR, and MIX applications on learning outcomes (Kaplan et al., 2021), researchers have typically reported the impact of different applications on learning outcomes on an application-by-application basis. This study aims to contribute to the literature by restricting its focus to the field of science education and employing NMA to report the impact of ILEs on learning outcomes, offering a comprehensive perspective on the subject.
Research Questions
1. What is the comparative effectiveness of the ILE formats on cognitive and affective learning outcomes in science education?
2. What are the relative rankings of interventions implemented in the ILE?
Methodology
Network meta-analysis (NMA) was employed to determine the comparative impact of ILE formats on learning outcomes in science education. Unlike traditional meta-analysis, NMA can directly and indirectly compare more than two interventions simultaneously (Cuijpers et al., 2019). This allows for the estimation of the relative effect of each intervention under investigation and the identification of the most suitable intervention for a given population (Harrer et al., 2021, p.332).
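The arithmetic behind NMA's indirect comparisons is simple: when two interventions have each been compared with a shared comparator, their relative effect can be estimated by differencing the two direct estimates, with the variances adding. The sketch below illustrates this Bucher-style calculation with hypothetical numbers; it is not the study's actual pipeline.

```python
import math

# Hypothetical direct estimates against a common comparator (CON):
# SMD and standard error for AR vs CON and VR vs CON.
smd_ar_con, se_ar_con = 0.88, 0.21
smd_vr_con, se_vr_con = 0.24, 0.27

# Bucher-style indirect comparison: AR vs VR via the shared CON node.
smd_ar_vr = smd_ar_con - smd_vr_con
se_ar_vr = math.sqrt(se_ar_con**2 + se_vr_con**2)

# 95% confidence interval for the indirect estimate.
ci_low = smd_ar_vr - 1.96 * se_ar_vr
ci_high = smd_ar_vr + 1.96 * se_ar_vr
print(f"AR vs VR (indirect): {smd_ar_vr:.2f} [{ci_low:.2f}, {ci_high:.2f}]")
# → AR vs VR (indirect): 0.64 [-0.03, 1.31]
```

Because the two standard errors combine, an indirect estimate is always less precise than either of the direct estimates it is built from, which is why networks with few direct comparisons (such as MIX here) yield wide intervals.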
Literature Search Procedure
Studies on the use of AR, VR, MIX, and GAME in science education were identified by searching the ERIC, Science Direct, Scopus, and Web of Science databases using binary combinations of the keywords “augmented reality,” “virtual reality,” “mixed reality,” “digital game,” “online game,” “virtual game,” and “serious game” with “science education” or “science teaching.” All identified studies were briefly reviewed. Following an initial search in October 2023 and a supplementary search in February 2024, a total of 477 scientific studies conducted between 2012 and 2023 were found.
Inclusion/Exclusion Criteria
The scientific articles identified through the literature search were selected based on the inclusion and exclusion criteria specified in Table 2.
Table 2
Inclusion/exclusion criteria

Publication year
Inclusion: Studies published between 2012 and 2023
Exclusion: Studies published after 2023

Publication type
Inclusion: Scientific research articles published in international peer-reviewed journals
Exclusion: Studies other than scientific research articles (e.g., master’s/doctoral theses, book chapters, conference papers)

Language
Inclusion: Scientific research articles published in English
Exclusion: Scientific studies published in languages other than English

Research design
Inclusion: Quasi- or true experimental studies conducted with experimental and control groups
Exclusion: Studies other than quasi- or true experimental studies conducted with experimental and control groups

Subject area
Inclusion: Studies conducted in the field of science education
Exclusion: Studies conducted in fields other than science education

Examined variable (dependent variable)
Inclusion: Studies evaluating cognitive and affective learning outcomes
Exclusion: Studies evaluating variables other than cognitive and affective variables

Application used (independent variable)
Inclusion: Studies evaluating the effects of AR, VR, MIX, and GAME on cognitive and affective variables
Exclusion: Studies evaluating the impact of applications other than AR, VR, MIX, and GAME

Accessibility
Inclusion: Scientific research articles with full-text access
Exclusion: Studies whose full text is not available

Data
Inclusion: Articles containing statistical data suitable for meta-analysis
Exclusion: Studies that do not contain statistical data suitable for meta-analysis
A total of 53 scientific research articles were included in the study based on the inclusion/exclusion criteria specified in Table 2. The process of literature review and evaluation of the inclusion and exclusion criteria for the accessed studies is illustrated in the PRISMA flow diagram shown in Fig. 2 (Moher et al., 2009).
The 53 articles included in the study were coded by two researchers using a consensus-based approach. To identify any coding errors, researchers jointly reviewed randomly selected articles at the end of the coding process. The coding form utilized in the study included information on author names, experimental and control group (mean, SD, sample size), t-value (if applicable), F-value (if applicable), experimental and control group educational resources/environments (conventional, AR-supported, VR-supported, MIX-supported, and GAME-supported), experimental and control group comparison (e.g., AR-CON, AR-VR, GAME-CON), sub-subject area (e.g., physics, chemistry, biology), examined variable (cognitive or affective), learning level/grade level, and application duration. When coding experimental and control group educational resources/environments, the conceptual definitions provided in the introduction section were considered along with the definitions provided by the respective study authors.
In addition to the coding procedures described above, researchers made the following considerations during the coding process:
i) In a study comparing different VR modalities (HMD-IVR) (Makransky & Lilleholt, 2018), the researchers determined that the IVR version could be considered within the scope of GAME. Consequently, this study was coded as “VR-GAME” in the experimental and control group comparison.
ii) Groups that utilized technology tools/applications without interactive elements (e.g., multimedia, PowerPoint presentations, 2D animations) (Maulana, 2020; Yen et al., 2013) were considered to represent conventional teaching methods. In the experimental and control group comparisons for studies involving these groups, the Maulana (2020) and Lai et al. (2019) studies were coded as “AR-CON,” while the Yen et al. (2013) study was coded as “AR-VR.”
iii) For most studies, independent results were coded for both cognitive and affective learning outcomes (e.g., Arici & Yilmaz, 2023; Kuznetcova et al., 2023).
iv) In a study comparing an AR-supported digital game to a group using only AR (Wang et al., 2022), the AR-supported digital game group was found to possess interactive digital game features. Consequently, this study was coded as “AR-GAME” in the experimental and control group comparison.
v) In cases where the same study reported multiple effect size values within the same outcome category (cognitive or affective), the effect size values for each category were combined to calculate a single “multiple effect size” value (e.g., Arici & Yilmaz, 2023; Yu et al., 2023).
vi) In identifying studies defined as mixed reality, the validity of the “mixed reality” keyword in the relevant primary research was first examined, and cases where this environment was identified by the authors as mixed reality were considered. In addition, particular attention was paid to whether the mixed reality environment possessed unique characteristics distinguishing it from AR and VR (e.g., Johnson-Glenberg et al., 2014), or whether AR and VR were used simultaneously in the experimental process (e.g., Weng et al., 2019); that is, whether the user could perceive real and virtual content simultaneously, including through different senses (Skarbez et al., 2021). In this context, when evaluating mixed reality studies within the scope of interactive learning environments, the fundamental attributes that set them apart from AR and VR, as well as the integrative experience they offer, were taken into account.
Data Analysis
Hedges’ g (Hedges, 1981) was used as the effect size index; since the purpose of this study is to determine the comparative effectiveness and relative ranking of ILE formats on cognitive and affective learning outcomes in science education, effect sizes were expressed as standardized mean differences. The average effect size was interpreted as “small” (0.15–0.35), “medium” (0.36–0.64), or “large” (0.65 and above) according to the Lovakov and Agadullina (2021) scale. To assess homogeneity among the studies, a heterogeneity test was conducted using the Q statistic; a statistically significant result (p < 0.05) rejects the null hypothesis of homogeneity and indicates heterogeneity (Lipsey & Wilson, 2001). The magnitude of heterogeneity was determined by the I2 index, interpreted as “low” (up to 25%), “moderate” (up to 50%), or “high” (up to 75%) according to the Higgins et al. (2003) scale. Since the studies were collected from the literature, the random effects model was deemed more appropriate (Borenstein et al., 2013).
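For concreteness, the indices described here can be sketched in a few lines of code. This is a minimal illustration, not the study's actual computation, and the input numbers in the usage lines are hypothetical:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                       # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)      # correction factor J
    g = j * d
    # Common approximation for the sampling variance of g.
    var = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))
    return g, var

def q_and_i2(effects, variances):
    """Cochran's Q statistic and the I2 heterogeneity index (in percent)."""
    w = [1 / v for v in variances]           # inverse-variance weights
    mean = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - mean)**2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

# Hypothetical group statistics (mean, SD, n) for experiment vs control.
g, v = hedges_g(82, 10, 30, 75, 11, 30)
# Three hypothetical effect sizes with their variances.
q, i2 = q_and_i2([0.2, 0.9, 1.5], [0.04, 0.05, 0.06])  # "high" heterogeneity
```

With widely spread effects like these, I2 lands well above 75%, which on the Higgins et al. (2003) scale is "high" heterogeneity and motivates the random effects model used in this study.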
One of the key assumptions affecting the reliability of NMA is transitivity: the assumption that the relative treatment effects estimated from direct and indirect comparisons are consistent (Rouse et al., 2017). In other words, differences between compared interventions are assumed to originate from the interventions themselves rather than from characteristics of the studies (e.g., study designs or participant characteristics). In this study, two methods were used to check for inconsistency. The first is the net-splitting technique, which calculates direct and indirect effects for each pairwise comparison and examines the consistency between the two estimates (Dias et al., 2010). The second is the netheat graph proposed by Krahn et al. (2013), which is created by temporarily removing each design one by one, thereby evaluating the contribution of each design to the inconsistency of the entire network (Freeman et al., 2019). To assess whether publication bias affected the calculated effect sizes, a comparison-adjusted funnel plot was examined; in the absence of publication bias, this plot is expected to show a symmetrical distribution. However, since the interpretation of this plot is quite subjective (Borenstein et al., 2013, p. 273), Egger’s regression test (Egger et al., 1997) and the Begg-Mazumdar test (Begg & Mazumdar, 1994) were conducted to determine more definitively whether the distribution is symmetrical. Non-significance of these tests at the 95% level indicates that the distribution is symmetrical and therefore that there is no publication bias. The analyses were conducted using the netmeta package (Balduzzi et al., 2023) in R (R Core Team, 2021).
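Conceptually, Egger's test regresses each study's standardized effect (effect/SE) on its precision (1/SE); an intercept far from zero signals funnel-plot asymmetry. The sketch below is a minimal ordinary-least-squares implementation of that idea with made-up inputs, not the routine used for the reported results:

```python
import math

def egger_test(effects, ses):
    """Egger's regression: standardized effect (effect/SE) on precision (1/SE).
    Returns the intercept and its t statistic; an intercept far from zero
    suggests funnel-plot asymmetry (possible publication bias)."""
    z = [e / s for e, s in zip(effects, ses)]
    x = [1 / s for s in ses]
    n = len(z)
    mx, mz = sum(x) / n, sum(z) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxz = sum((xi - mx) * (zi - mz) for xi, zi in zip(x, z))
    slope = sxz / sxx
    intercept = mz - slope * mx
    # Residual variance and the standard error of the intercept.
    resid = [zi - (intercept + slope * xi) for xi, zi in zip(x, z)]
    s2 = sum(r ** 2 for r in resid) / (n - 2)
    se_int = math.sqrt(s2 * (1 / n + mx ** 2 / sxx))
    return intercept, intercept / se_int

# Made-up example: four studies with similar effects -> intercept near zero,
# i.e., no evidence of small-study asymmetry.
b0, t = egger_test([2.1, 1.9, 2.05, 1.95], [0.5, 0.25, 0.2, 0.1])
```

In practice one would rely on an established implementation (e.g., the regression-test routines shipped with meta-analysis packages) rather than hand-rolled OLS, but the two-line model above is the whole idea of the test.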
Results
The Effect of ILE on Cognitive Outcomes
To determine the effect of ILEs on cognitive outcomes, 45 studies were collected from the literature and combined using a random effects model. Table 3 presents the results of all pairwise comparisons between the ILEs.
Table 3
Pairwise comparisons of the ILE formats

Format comparison | k  | SMD (95% CI)           | χ2 (df)     | p       | I2 (%) | Egger’s p value
AR vs CON         | 19 | 0.882 [0.473, 1.291]   | 149.22 (18) | < 0.001 | 87.9   | 0.006*
AR vs GAME        | 4  | 0.726 [−0.410, 1.862]  | 105.06 (3)  | < 0.001 | 97.1   | 0.016*
AR vs VR          | 2  | −0.076 [−0.498, 0.346] | 1.97 (1)    | 0.160   | 49.3   | NC
GAME vs CON       | 11 | 0.674 [0.228, 1.120]   | 91.32 (10)  | < 0.001 | 89.1   | 0.388
GAME vs VR        | 1  | 0.145 [−0.524, 0.814]  | NC          | NC      | NC     | NC
VR vs CON         | 5  | 0.238 [−0.290, 0.766]  | 20.50 (4)   | 0.0004  | 80.5   | 0.815
MIX vs CON        | 3  | 0.966 [0.094, 1.837]   | 17.59       | 0.0002  | 88.6   | 0.844
As shown in Table 3, AR is a more effective format than CON, but its effect size does not differ significantly from GAME or VR (p > 0.05). Similarly, GAME is a more effective format than CON, but there is no significant difference in effect size when compared to VR (p > 0.05). There is no significant difference between VR and CON (p > 0.05), while there is a significant difference between MIX and CON (p < 0.05). The heterogeneity test results were significant for all comparisons, and the magnitude of heterogeneity was “high” in each case. This indicates that the variance is due to the characteristics of the studies themselves rather than sampling error; in other words, the studies have different characteristics. Egger’s regression test was significant for the AR vs CON and AR vs GAME comparisons (p < 0.05), suggesting possible publication bias for these two comparisons. Figure 3 shows the network graph of the direct effects between the pairwise comparisons.
The thickness of the edge connecting two nodes in the network graph indicates the number of comparisons between those two nodes. A thicker edge indicates more comparisons. Similarly, the size of a node indicates the number of studies that have conducted comparisons involving that node. A larger node indicates more comparisons. According to the network graph, the most comparisons were made between CON and AR (k = 20). The most compared format was CON (k = 38). As shown in Fig. 3, all interventions (AR, VR, MIX, and GAME) have been directly compared with the conventional method. Additionally, direct comparisons have been made between AR and VR, AR and GAME, and GAME and VR. Due to the absence of direct comparisons between MIX and other interventions (AR, VR, and GAME) in this network, indirect comparisons have been conducted.
The indirect comparison results obtained from the NMA are presented in the upper diagonal of Fig. 4. Accordingly, the indirect comparisons of MIX with AR, VR, and GAME yielded non-significant effect sizes (p > 0.05). In other words, the effect of MIX on cognitive outcomes does not differ significantly from that of AR, VR, or GAME. Another point to note is that the results obtained from the NMA should be consistent with the results obtained from the direct comparisons. When Fig. 4 is examined, it can be seen that the direct comparison results on the lower diagonal are consistent with the NMA results on the upper diagonal.
The upper diagonal displays the standardized mean difference (SMD) for the column intervention vs the row intervention, derived from the NMA (Fig. 4). Values > 0 favor the column defining intervention. The lower diagonal displays the standardized mean differences for the row intervention vs the column intervention, derived from direct comparisons only. Values > 0 favor the row defining the intervention.
To determine the most effective format, a ranking of interventions analysis was conducted. The P-scores calculated from this analysis are presented in Table 4.
Table 4
Ranking of ILE formats by P-scores

ILE format | P-score
AR         | 0.826
MIX        | 0.789
GAME       | 0.489
VR         | 0.369
CON        | 0.027
The intervention with the highest P-score is the most effective intervention on cognitive outcomes. Accordingly, the most effective ILE format is AR, followed by MIX, GAME, VR, and CON, respectively.
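The idea behind a P-score can be illustrated with a toy re-implementation: a format's P-score is its mean probability of beating each competitor, computed from the network's pairwise estimates and standard errors (Rücker & Schwarzer). The numbers below are hypothetical, not the study's estimates:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def p_scores(diff, se):
    """diff[j][k]: estimated SMD of format j versus format k; se[j][k]: its SE.
    A format's P-score is its mean probability of beating each competitor."""
    names = list(diff)
    scores = {}
    for j in names:
        probs = [norm_cdf(diff[j][k] / se[j][k]) for k in names if k != j]
        scores[j] = sum(probs) / len(probs)
    return scores

# Hypothetical network estimates (SMD of row format vs column format).
diff = {"AR":  {"VR": 0.64, "CON": 0.88},
        "VR":  {"AR": -0.64, "CON": 0.24},
        "CON": {"AR": -0.88, "VR": -0.24}}
se = {j: {k: 0.30 for k in d} for j, d in diff.items()}
scores = p_scores(diff, se)  # AR ranks first, CON last
```

A P-score near 1 means the format almost certainly outperforms every alternative, and a score near 0 means it is almost certainly outperformed, which is how the ordering in Table 4 should be read.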
Evaluating the Validity of the Results
To assess the consistency and, therefore, the validity of the obtained results, the netheat plot was examined first, in accordance with the local approach. This graph was generated based on the random effects model.
In the netheat plot, the colors of the areas range from dark red, indicating strong inconsistency, to blue, indicating consistency among the results (Harrer et al., 2021). The general absence of red in the graph indicates that the results are largely consistent (Fig. 5), although some inconsistencies regarding the effects of AR and VR can be noted. The second method for assessing the validity of the results is the net-splitting technique, which separates the network estimates into direct and indirect effects and tests the significance of the difference between them; a non-significant difference indicates consistency. The results of the net-splitting technique are presented in Table 5.
As seen in Table 5, none of the detected differences are statistically significant at the 95% CI level (p > 0.05), so the obtained results can be considered consistent with each other. To determine the extent to which publication bias may have affected the network model, comparison-adjusted funnel plots were examined.
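The net-splitting check described above amounts to a z-test on the difference between the direct and indirect estimates of the same comparison. A minimal sketch, with hypothetical inputs rather than the study's estimates:

```python
from math import sqrt
from statistics import NormalDist

def net_split_test(direct, se_direct, indirect, se_indirect):
    """Test whether the direct and indirect estimates of one comparison
    disagree (inconsistency). A non-significant p supports consistency."""
    diff = direct - indirect
    se = sqrt(se_direct ** 2 + se_indirect ** 2)
    z = diff / se
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return diff, p

# Hypothetical AR vs VR estimates (illustration only):
diff, p = net_split_test(direct=0.63, se_direct=0.20,
                         indirect=0.40, se_indirect=0.30)
print(f"difference = {diff:.2f}, p = {p:.3f}")  # p > 0.05 -> consistent
```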
Although the comparisons in the funnel plot are asymmetrically distributed around zero effect size (Fig. 6), the bias tests (Egger and Begg–Mazumdar) conducted to assess this asymmetry showed that it was not significant (p > 0.05). These results indicate no evidence of a small-study effect in the network model.
A total of 25 studies were gathered from the literature to explore the impact of ILE on affective outcomes. The synthesis of these studies was conducted using a random-effects model. Table 6 displays the outcomes of all pairwise comparisons among the ILEs.
Table 6
Pairwise comparisons of the ILE formats

| Format comparison | k  | SMD (95% CI)           | χ² (df)     | p       | I² (%) | Egger's p value |
|-------------------|----|------------------------|-------------|---------|--------|-----------------|
| AR vs CON         | 15 | 1.104 [0.589, 1.618]   | 139.99 (14) | < 0.001 | 90     | 0.012*          |
| AR vs GAME        | 4  | 0.220 [−0.505, 1.945]  | 11.44 (2)   | 0.003   | 82.5   | 0.619           |
| AR vs VR          | 1  | 0.629 [0.244, 1.013]   | NC          | NC      | NC     | NC              |
| GAME vs CON       | 6  | 1.066 [−0.095, 2.226]  | 91 (5)      | < 0.001 | 94.5   | 0.092           |
| GAME vs VR        | 1  | 0.220 [−0.327, 0.768]  | NC          | NC      | NC     | NC              |
| VR vs CON         | 2  | 0.584 [0.217, 0.952]   | 1.17 (1)    | 0.280   | 14.4   | NC              |
| MIX vs CON        | 1  | 0.696 [0.234, 1.149]   | NC          | NC      | NC     | NC              |

*p < 0.05; NC: not calculable (too few studies)
As seen in Table 6, AR is a more effective format than CON, whereas its effect size does not differ significantly from GAME (p > 0.05). Only one study in the literature compares the effects of AR and VR on affective outcomes; in that study, AR had a higher effect size than VR. When GAME was compared with CON and VR on affective outcomes, no significant difference was found (p > 0.05).
The heterogeneity tests for the AR vs CON, AR vs GAME, and GAME vs CON comparisons were high and significant (p < 0.05), whereas the test for VR vs CON was not (p > 0.05). This indicates that the variance in the three significant comparisons arises from characteristics of the studies rather than sampling error, while the variance in the VR vs CON comparison is attributable to sampling error. The Egger test, conducted to determine whether the calculated mean effect sizes are a product of publication bias, revealed publication bias in the AR vs CON comparison, but none in the AR vs GAME and GAME vs CON comparisons. The network graph depicting direct effects between pairwise comparisons is provided in Fig. 7.
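The random-effects pooling and heterogeneity statistics reported in Table 6 (Cochran's χ², here written Q, and I²) can be sketched with a DerSimonian–Laird estimator. The per-study values below are hypothetical illustrations, not data from the included studies.

```python
from math import sqrt

def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian-Laird) with Cochran's Q and I^2.

    I^2 above roughly 75% is conventionally read as high heterogeneity:
    the spread of effects exceeds what sampling error alone would produce."""
    w = [1 / v for v in variances]
    fixed = sum(wi * di for wi, di in zip(w, effects)) / sum(w)
    q = sum(wi * (di - fixed) ** 2 for wi, di in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = [1 / (v + tau2) for v in variances]    # random-effects weights
    pooled = sum(wi * di for wi, di in zip(w_re, effects)) / sum(w_re)
    se = sqrt(1 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, se, q, i2

# Hypothetical per-study SMDs and sampling variances (illustration only):
pooled, se, q, i2 = dersimonian_laird([1.3, 0.4, 1.8, 0.9],
                                      [0.05, 0.04, 0.06, 0.05])
print(f"pooled SMD = {pooled:.2f} (SE {se:.2f}), Q = {q:.1f}, I2 = {i2:.0f}%")
```

When Q exceeds its degrees of freedom substantially, the between-study variance τ² inflates the standard error, which is one reason highly heterogeneous comparisons such as GAME vs CON carry wide confidence intervals.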
According to the network graph, the most comparisons were made between CON and AR (k = 15), and the most frequently compared format was CON (k = 24). As shown in Fig. 7, all interventions (AR, VR, MIX, and GAME) have been directly compared with the conventional method, and direct comparisons have also been made between AR and VR, AR and GAME, and GAME and VR. Because this network contains no direct comparisons between MIX and the other interventions (AR, VR, and GAME), indirect comparisons were conducted.
The indirect comparison results obtained from the NMA are presented in the upper diagonal of Fig. 8. Accordingly, the indirect comparisons of MIX with AR, VR, and GAME have yielded non-significant effect sizes (p > 0.05). In other words, the effect of MIX on affective outcomes does not show a significant difference when compared to AR, VR, and GAME. When Fig. 8 is examined, it can be seen that the direct comparison results on the lower diagonal are consistent with the NMA results in the upper diagonal.
As in Fig. 4, the upper diagonal displays the standardized mean difference (SMD) for the column intervention vs the row intervention, derived from the NMA; values > 0 favor the column intervention. The lower diagonal displays the SMDs for the row intervention vs the column intervention, derived from direct comparisons only; values > 0 favor the row intervention.
To determine the most effective format, a ranking of interventions analysis was conducted. The P-scores calculated from this analysis are presented in Table 7.
Table 7
Ranking of ILE formats by P-scores

| ILE format | P-score |
|------------|---------|
| AR         | 0.819   |
| GAME       | 0.651   |
| MIX        | 0.496   |
| VR         | 0.469   |
| CON        | 0.065   |
The intervention with the highest P-score is the most effective for affective outcomes. Accordingly, the most effective ILE format is AR, followed by GAME, MIX, VR, and CON, respectively.
Evaluating the Validity of the Results
To assess the consistency and, therefore, the validity of the obtained results, the netheat plot was examined first, in accordance with the local approach. This graph was generated based on the random effects model.
The general absence of red in the graph indicates that the results are largely consistent (Fig. 9), although some inconsistencies regarding the effects of AR and GAME can be noted. The second method for assessing the validity of the results is the net-splitting technique, which separates the network estimates into direct and indirect effects and tests the significance of the difference between them; a non-significant difference indicates consistency. The results of the net-splitting technique are presented in Table 8.
As seen in Table 8, none of the detected differences are statistically significant at the 95% CI level (p > 0.05), so the obtained results can be considered consistent with each other. To determine the extent to which publication bias may have affected the network model, comparison-adjusted funnel plots were examined.
Although the comparisons in the funnel plot are asymmetrically distributed around zero effect size (Fig. 10), the bias tests (Egger and Begg–Mazumdar) conducted to assess this asymmetry showed that it was not significant (p > 0.05). These results indicate no evidence of a small-study effect in the network model.
The objective of this study was to identify the most effective interactive learning format (AR, VR, MIX, and GAME) by comparing their average effect sizes against both conventional learning (CON) and each other. To achieve this, a single-variable meta-analysis was initially conducted using effect sizes collected from the literature.
The results of this meta-analysis revealed that AR exhibits a larger effect size on both cognitive (d = 0.882) and affective (d = 1.104) outcomes compared to CON. This finding aligns with the results of previous meta-analyses investigating the impact of AR use in science education on student achievement (Kalemkus & Kalemkus, 2023; Xu et al., 2022). However, these studies solely report on the effect of AR use in science education on academic achievement compared to conventional methods. Consequently, no findings have been reported on its impact on other cognitive and affective variables. Additionally, the moderator analysis conducted by Xu et al. (2022) indicated that the results varied significantly based on the subject areas within science education and the type of AR used in the studies (marker-based, markerless-based, and location-based). It was found that markerless AR, which does not require physical markers to trigger AR content, provides more flexibility and hence better integration into learning environments, making it more effective at enhancing learning outcomes. Furthermore, Xu et al. (2022) indicated that integrating AR with instructional strategies like problem-based learning significantly improved student engagement and learning outcomes. For example, combining AR with self-directed learning or game-based learning was shown to improve both motivation and achievement. In this context, it is plausible that the calculated difference in effect size between cognitive and affective variables in our study could stem from the potential of the respective moderators to differentially influence cognitive and affective variables.
AR was found to exhibit a larger effect size on affective outcomes compared to VR. No meta-analysis comparing the two applications in this domain was identified. However, a primary study by Huang et al. (2019) concluded that VR was generally more effective than AR in terms of learning outcomes. While conflicting results exist in the literature, the overall effect size favoring AR in this study could be attributed to the user’s retained dominance of the real world and interaction with virtual objects in AR (Azuma, 1997), as opposed to the user’s complete immersion in a virtual environment and lack of control over the physical environment in VR. Additionally, the overall effect size could align with the technology acceptance model (Davis, 1989), suggesting that AR’s perceived ease of use compared to VR may contribute to the observed results (Dalim et al., 2017; Elfitra Mansyur & Taufik, 2021). Furthermore, the nature of the learning content appears to influence the two applications differently, with the more effective format depending on the degree of visual emphasis the content requires (Huang et al., 2019). However, no significant difference in effect sizes was observed between AR and VR for cognitive outcomes (p > 0.05). This finding aligns with the results of Huang et al. (2019) and Yen et al. (2013), where statistical significance was generally achieved only with small differences and no significant differences were found for certain variables (e.g., learning achievement). Similarly, no significant difference (p > 0.05) was found between AR and GAME in terms of their effects on cognitive and affective outcomes. Upon reviewing the literature, no meta-analysis was identified that evaluated the impact of these two technologies on learning outcomes in science education.
However, examination of the findings of the primary studies included in the analysis revealed that both applications significantly enhanced learning outcomes in the instructional process but did not exhibit a significant difference in learning outcomes compared to each other (Chen, 2020; Hsiao et al., 2012; Wang, 2020).
GAME exhibited a higher average effect size on cognitive outcomes (d = 0.674) compared to conventional methods, while no significant difference was observed between the two on affective outcomes (p > 0.05). The finding regarding cognitive outcomes aligns with the meta-analysis conducted by Wang et al. (2022). Moderator analyses in Lin et al. (2024) and Wang et al. (2022) indicate that the effectiveness of digital games is influenced by factors such as game type, platform, and educational level. For instance, Lin et al. (2024) found that game designs based on embodied cognition, where learning occurs through bodily interaction, are particularly effective for skill-based learning outcomes, especially when students actively participate in the game rather than passively observing. Wang et al. (2022) found that games built around active learning strategies, such as those requiring students to solve problems or work collaboratively in teams, are particularly effective. Therefore, interactive digital games should be designed to incorporate elements such as problem-solving, collaboration, and immediate feedback to foster motivation and enhance learning outcomes. Additionally, a primary study by Hussein et al. (2019) concluded that GAME significantly enhanced critical thinking skills compared to conventional teaching methods but found no significant difference in motivation or self-efficacy. This was attributed to the game used in that study focusing primarily on cognitive outcomes, and to the limited freedom afforded by the classroom environment potentially restricting the development of affective variables. Furthermore, among the studies meeting the inclusion criteria of our research, affective variables were assessed less frequently, which could introduce a bias in favor of cognitive outcomes.
Similarly, no significant difference (p > 0.05) was found between GAME and VR in terms of their effects on cognitive and affective outcomes. No meta-analysis study comparing the two conditions in the field of science education was identified in the literature. Among the primary studies included in the research, a limited number of studies (e.g., Lamb et al., 2018; Makransky & Lilleholt, 2018) compare the two conditions. Upon examining these studies, it was observed that desktop VR applications were compared to VR headset-based versions with GAME features (Lamb et al., 2018; Makransky & Lilleholt, 2018). In these studies, it was also concluded that VR and GAME did not exhibit a significant difference from each other.
VR exhibited no significant difference (p > 0.05) in effect size on cognitive outcomes compared to conventional methods but demonstrated a significant difference (d = 0.584, p < 0.05) in favor of VR on affective outcomes. This finding is consistent with previous studies suggesting that VR’s effectiveness is highly influenced by the level of immersion and the type of learning content involved. To enhance the effectiveness of VR, it is crucial to consider the immersion level and to design activities that encourage students to interact actively with the virtual environment. For example, Coban et al. (2022) highlighted that higher levels of immersion through immersive VR (IVR) contribute to better knowledge retention and skill development. Wu et al. (2020) reported that while immersive VR using head-mounted displays (HMDs) provides a sense of immersion that can benefit spatial understanding and skill development, it may not necessarily translate into improved academic achievement compared to non-immersive or conventional methods, especially on purely cognitive measures.
MIX exhibited a higher effect size than conventional methods on both cognitive and affective outcomes (for affective outcomes, d = 0.696; Table 6). A limited number of studies have examined the effect of MIX in science education. The primary studies included in the analysis indicated that MIX has a positive and significant effect on learning outcomes compared to conventional methods (Johnson-Glenberg et al., 2014; Weng et al., 2019). This expected finding aligns with the notion that when instructional technologies are appropriately integrated into the teaching process, they can positively impact learning outcomes compared to conventional teaching methods (Kordaki, 2010).
As observed in the network graphs constructed for both cognitive and affective outcomes, all ILE formats were directly compared with conventional methods, and direct comparisons were also made between AR and VR, AR and GAME, and GAME and VR. However, because these networks contain no direct comparison between MIX and the other ILE formats (AR, VR, and GAME), indirect comparisons were made using network meta-analysis. Accordingly, no significant difference (p > 0.05) was found between the effect sizes of MIX and those of the other ILE formats on either cognitive or affective outcomes. Above all, primary studies directly comparing MIX with the other ILE formats are needed to reach firmer conclusions. That said, the indirect comparisons suggest that MIX may contribute to learning outcomes on a par with the other ILE formats. This is plausible because MIX, defined as “a real environment that allows simultaneous interaction with virtual experiences” (Milgram & Kishino, 1994), can incorporate AR and VR simultaneously.
Generally, the findings from our network meta-analysis, coupled with insights from previous meta-analyses, demonstrate that the different ILE formats (AR, VR, MIX, and GAME) have distinct strengths that can significantly enhance learning outcomes in science education. Following the network analysis, a ranking of interventions was performed to determine the most effective format for cognitive and affective outcomes, using P-scores for the relative comparison. Accordingly, AR was identified as the most effective format for both cognitive and affective outcomes. One reason AR emerged as the most effective ILE could be the falling cost of AR, driven by advances in mobile phones, and its consequent widespread adoption in science education (Arici & Yilmaz, 2023); because the effect of AR on learning outcomes in science education has been tested in more studies than the other ILE formats, a bias in its favor may have occurred. A related limitation is that VR, MIX, and, depending on their type, GAME applications are mostly virtual and lack real-world dominance, whereas AR preserves it (Azuma, 1997); these formats can also be expensive because they require powerful processors (Fernandez, 2017). This situation might have favored AR, which is relatively more accessible and applicable. Moreover, AR appears to be most effective for visualizing complex concepts, VR for providing immersive experiences of abstract topics, and digital games for fostering engagement through active learning. Another factor that could have contributed to this outcome is perceived ease of use, consistent with the technology acceptance model (TAM) (Davis, 1989), according to which users exhibit positive attitudes towards new technologies when they believe their use will be easy and beneficial (Chang et al., 2011).
After AR, MIX was identified as the most effective format for cognitive outcomes, while GAME was identified as the most effective format for affective outcomes. This finding may be related to which outcomes are primarily targeted in the instructional process: cognitive outcomes may be more easily influenced in a classroom setting, whereas affective variables appear to respond positively when individuals feel freer (Hussein et al., 2019).
Limitations and Suggestions
This study has a few limitations that should be considered when interpreting the results. Firstly, no study was found in the literature that directly compares MIX with the other ILE formats in terms of its effect on cognitive and affective outcomes, so these comparisons were made through the indirect effects obtained from the network meta-analysis. The analysis found no significant difference between MIX and the other ILE formats, but the reliability of this estimate could not be established in the absence of direct comparisons. More broadly, the limited availability of direct comparisons between certain ILE formats, specifically between AR and VR on affective outcomes and between MIX and CON on cognitive outcomes, is a significant limitation: this gap in the literature restricts the reliability of conclusions about the relative effectiveness of these formats. To address it, future research should conduct more primary studies that directly compare these specific ILE formats; such studies would provide stronger evidence and a clearer understanding of which approaches are more effective for enhancing different learning outcomes. Additionally, only one comparison was found between GAME and VR for both cognitive and affective outcomes, so more primary studies are needed to determine which application is more effective. Moreover, as evidenced by previous meta-analyses (Coban et al., 2022; Kalemkus & Kalemkus, 2023; Lin et al., 2024; Wang et al., 2022; Wu et al., 2020; Xu et al., 2022), the effective implementation of these technologies requires the appropriate integration of instructional strategies tailored to specific learning objectives and student needs.
In this context, future research should continue to explore how combining these technologies with different instructional approaches can yield the best learning outcomes. Additionally, educators and researchers must consider students’ specific needs and learning goals when designing interventions with interactive learning environments (ILEs) and ensure that these technologies are utilized to their full potential.
Secondly, a high level of heterogeneity was observed in most of the comparisons made for both cognitive and affective outcomes (except for the VR vs. CON comparison on affective outcomes). This result indicates that the variance originates from the characteristics of the studies themselves rather than sampling error. Indeed, when considering the table (Appendix) where study characteristics were coded, it is evident that the studies differ in many components. These differences might have contributed to the high heterogeneity. High heterogeneity can lead to wide confidence intervals for the estimates, making the estimates less precise. To obtain more precise results, more studies need to be included in the meta-analysis, and the effect of study characteristics on the estimates needs to be examined. This could not be investigated in this network meta-analysis due to the insufficient number of common categories among studies. However, no significant inconsistency was found for the results obtained through the tests performed (e.g., heat plot, net-splitting).
Thirdly, the effect of publication bias was found to be significant for some pairwise comparisons (e.g., AR vs. CON, AR vs. GAME). These results indicate that the published studies for these comparisons may not represent all existing studies, and therefore their results may not accurately reflect the true effect; more studies are needed for these comparisons to obtain reliable results. Although publication bias was significant for some pairwise comparisons, the asymmetry of the comparison-adjusted funnel plots examined for the network meta-analysis was not significant, which suggests that the small-study effect is not significant for the network results and that those results are reliable.
Declarations
Ethical Approval
Since no data was obtained from humans or animals during the research process, ethics committee approval was not received. The authors adhered to ethical rules throughout the research.
Competing Interests
The authors declare no competing interests.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Akcayir, M., & Akcayir, G. (2017). Advantages and challenges associated with augmented reality for education: A systematic review of the literature. Educational Research Review,20, 1–11. https://doi.org/10.1016/j.edurev.2016.11.002CrossRef
Akcayir, M., Akcayir, G., Pektas, H. M., & Ocak, M. A. (2016). Augmented reality in science laboratories: The effects of augmented reality on university students’ laboratory skills and attitudes toward science laboratories. Computers in Human Behavior,57, 334–342. https://doi.org/10.1016/j.chb.2015.12.054CrossRef
Al Amri, A. Y., Osman, M. E., & Al Musawi, A. S. (2020). The effectiveness of a 3D-virtual reality learning environment (3D-VRLE) on the Omani eighth grade students’ achievement and motivation towards physics learning. International Journal of Emerging Technologies in Learning (Online),15(5), 4. https://doi.org/10.3991/ijet.v15i05.11890CrossRef
Al-Mashhadani, M. A., & Al-Rawe, M. F. (2018). The future role of mobile learning and smartphones applications in the Iraqi private universities. Smart Learning Environments,5(28), 1–11. https://doi.org/10.1186/s40561-018-0077-7CrossRef
Anderson, J. L., & Barnett, M. (2013). Learning physics with digital game simulations in middle school science. Journal of Science Education and Technology,22, 914–926. https://doi.org/10.1007/s10956-013-9438-8CrossRef
Annetta, L. A., Minogue, J., Holmes, S. Y., & Cheng, M.-T. (2009). Investigating the impact of video games on high school students’ engagement and learning about genetics. Computers & Education,53(1), 74–85. https://doi.org/10.1016/j.compedu.2008.12.020CrossRef
Arici, F., & Yilmaz, M. (2023). An examination of the effectiveness of problem-based learning method supported by augmented reality in science education. Journal of Computer Assisted Learning,39(2), 446–476. https://doi.org/10.1111/jcal.12752CrossRef
Atmojo, I. R. W., Ardiansyah, R., Saputri, D. Y., & Adi, F. P. (2021). The effectiveness of STEAM-based augmented reality media in improving the quality of natural science learning in elementary school. Al-Ishlah: Jurnal Pendidikan, 13(2), 821–828. https://doi.org/10.35445/alishlah.v13i2.643
August, S. E., Hammers, M. L., Murphy, D. B., Neyer, A., Gueye, P., & Thames, R. Q. (2016). Virtual engineering sciences learning lab: Giving STEM education a second life. IEEE Transactions on Learning Technologies, 9, 18–30. https://doi.org/10.1109/TLT.2015.2419253
Baceviciute, S., Terkildsen, T., & Makransky, G. (2021). Remediating learning from non-immersive to immersive media: Using EEG to investigate the effects of environmental embeddedness on reading in Virtual Reality. Computers & Education, 164, 104122. https://doi.org/10.1016/j.compedu.2020.104122
Baker, R., Walonoski, J., Heffernan, N., Roll, I., Corbett, A., & Koedinger, K. (2008). Why students engage in“gaming thesystem”behavior in interactive learning environments. Journal of Interactive Learning Research,19(2), 185–224.
Balduzzi, S., Rücker, G., Nikolakopoulou, A., Papakonstantinou, T., Salanti, G., Efthimiou, O., & Schwarzer, G. (2023). netmeta: An R package for network meta-analysis using frequentist methods. Journal of Statistical Software,106, 1–40. https://doi.org/10.18637/jss.v106.i02CrossRef
Begg, C. B., & Mazumdar, M. (1994). Operating characteristics of a rank correlation test for publication bias. Biometrics,50(4), 1088–1101.CrossRef
Burdea, G. C., & Coiffet, P. (2003). Virtual reality technology. John Wiley & Sons.
Calvert, J., & Abadia, R. (2020). Impact of immersing university and high school students in educational linear narratives using virtual reality technology. Computers & Education, 159. https://doi.org/10.1016/j.compedu.2020.104005
Câmara Olim, S., Nisi, V., & Romão, T. (2024). Periodic fable discovery: An augmented reality serious game to introduce and motivate young children towards chemistry. Multimedia Tools and Applications, 83(17), 52593–52619. https://doi.org/10.1007/s11042-023-17526-9
Çetin, H., & Türkan, A. (2022). The effect of augmented reality based applications on achievement and attitude towards science course in distance education process. Education and Information Technologies, 27(2), 1397–1415. https://doi.org/10.1007/s10639-021-10625-w
Chang, Y. J., Chen, C. H., Huang, W. T., & Huang, W. S. (2011, July). Investigating students’ perceived satisfaction, behavioral intention, and effectiveness of English learning using augmented reality. In 2011 IEEE international conference on multimedia and Expo (pp. 1–6). IEEE.
Chen, C. H. (2020). Impacts of augmented reality and a digital game on students’ science learning with reflection prompts in multimedia learning. Educational Technology Research and Development,68(6), 3057–3076. https://doi.org/10.1007/s11423-020-09834-wCrossRef
Chen, Y. L., & Hsu, C. C. (2020). Self-regulated mobile game-based English learning in a virtual reality environment. Computers & Education, 154. https://doi.org/10.1016/j.compedu.2020.103910
Cheng, K. H., & Tsai, C. C. (2020). Students’ motivational beliefs and strategies, perceived immersion and attitudes towards science learning with immersive virtual reality: A partial least squares analysis. British Journal of Educational Technology,51(6), 2140–2159. https://doi.org/10.1111/bjet.12956CrossRef
Cheok, A. D., Haller, M., Fernando, O. N. N., & Wijesena, J. P. (2009). Mixed reality entertainment and art. The International Journal of Virtual Reality,8, 83–90. https://doi.org/10.20870/IJVR.2009.8.2.2729CrossRef
Chiang, T. H., Yang, S. J., & Hwang, G. J. (2014). An augmented reality-based mobile learning system to improve students’ learning achievements and motivations in natural science inquiry activities. Journal of Educational Technology & Society, 17(4), 352–365. https://www.jstor.org/stable/10.2307/jeductechsoci.17.4.352
Cipresso, P., Giglioli, I. A. C., Raya, M. A., & Riva, G. (2018). The past, present, and future of virtual and augmented reality research: A network and cluster analysis of the literature. Frontiers in Psychology, 9. https://doi.org/10.3389/fpsyg.2018.02086
Clark, D. B., Tanner-Smith, E. E., & Killingsworth, S. S. (2016). Digital games, design, and learning: A systematic review and meta-analysis. Review of Educational Research,86(1), 79–122. https://doi.org/10.3102/0034654315582065CrossRef
Coban, M., Bolat, Y. I., & Goksu, I. (2022). The potential of immersive virtual reality to enhance learning: A meta-analysis. Educational Research Review,36, 100452. https://doi.org/10.1016/j.edurev.2022.100452CrossRef
Costa, A., Lima, R., & Tamayo, S. (2019). Eva: A virtual pet in augmented reality. 2019 21st Symposium on Virtual and Augmented Reality (SVR), pp. 47–51. https://doi.org/10.1109/SVR.2019.00024
Cuijpers, P., Noma, H., Karyotaki, E., Cipriani, A., & Furukawa, T. A. (2019). Effectiveness and acceptability of cognitive behavior therapy delivery formats in adults with depression: A network meta-analysis. JAMA Psychiatry,76(7), 700–707. https://doi.org/10.1001/jamapsychiatry.2019.0268CrossRef
Czok, V., Krug, M., Müller, S., Huwer, J., & Weitzel, H. (2023). Learning effects of augmented reality and game-based learning for science teaching in higher education in the context of education for sustainable development. Sustainability, 15(21), 15313. https://doi.org/10.3390/su152115313
Dalim, C. S. C., Kolivand, H., Kadhim, H., Sunar, M. S., & Billinghurst, M. (2017). Factors influencing the acceptance of augmented reality in education: A review of the literature. Journal of Computer Science, 13(11), 581–589. https://doi.org/10.3844/jcssp.2017.581.589
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.
Dias, S., Welton, N. J., Caldwell, D. M., & Ades, A. E. (2010). Checking consistency in mixed treatment comparison meta-analysis. Statistics in Medicine, 29(7–8), 932–944. https://doi.org/10.1002/sim.3767
Dilmen, I., & Atalay, N. (2021). The effect of the augmented reality applications in science class on students' 21st century skills and basic skills. Journal of Science Learning, 4(4), 337–346. https://doi.org/10.17509/jsl.v4i4.32900
Dunleavy, M., & Dede, C. (2014). Augmented reality teaching and learning. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 735–745). Springer. https://doi.org/10.1007/978-1-4614-3185-5_59
Efe, H., & Umdu Topsakal, Ü. (2022). A meta-synthesis study in interactive learning environments: Digital games in health education. Interactive Learning Environments, 1–11. https://doi.org/10.1080/10494820.2022.2120016
Egger, M., Davey Smith, G., Schneider, M., & Minder, C. (1997). Bias in meta-analysis detected by a simple, graphical test. BMJ, 315(7109), 629–634. https://doi.org/10.1136/bmj.315.7109.629
Elfitra Mansyur, A., & Taufik, M. I. (2021). Student perceptions of augmented reality (AR) media in calculus courses. Journal of Physics: Conference Series, 1819(1), 012033. https://doi.org/10.1088/1742-6596/1819/1/012033
Elford, D., Lancaster, S. J., & Jones, G. A. (2022). Exploring the effect of augmented reality on cognitive load, attitude, spatial ability, and stereochemical perception. Journal of Science Education and Technology, 31(3), 322–339. https://doi.org/10.1007/s10956-022-09957-0
Ferdinand, J., Soller, S., Hahn, J. U., Parong, J., & Göllner, R. (2023). Enhancing the effectiveness of virtual reality in science education through an experimental intervention involving students’ perceived usefulness of virtual reality. Technology, Mind, and Behavior 4(1). https://doi.org/10.1037/tmb0000084
Fernandez, M. (2017). Augmented virtual reality: How to improve education systems. Higher Learning Research Communications, 7(1), 1–15. https://doi.org/10.18870/hlrc.v7i1.373
Fidan, M., & Tuncel, M. (2019). Integrating augmented reality into problem based learning: The effects on learning achievement and attitude in physics education. Computers & Education, 142, 103635. https://doi.org/10.1016/j.compedu.2019.103635
Freeman, S. C., Fisher, D., White, I. R., Auperin, A., & Carpenter, J. R. (2019). Identifying inconsistency in network meta-analysis: Is the net heat plot a reliable method? Statistics in Medicine, 38(29), 5547–5564. https://doi.org/10.1002/sim.8383
Gopalan, V., Zulkifli, A. N., & Bakar, J. A. A. (2016). A study of students’ motivation using the augmented reality science textbook. In AIP conference proceedings (vol. 1761, no. 1). AIP Publishing. https://doi.org/10.1063/1.4960880
Guvenir, E., & Guven-Yıldırım, E. (2023). The effect of educational film supported augmented reality applications on academic achievement and motivation for science learning. Journal of Education in Science Environment and Health, 9(2), 119–130. https://doi.org/10.55549/jeseh.1279771
Gybas, V., Klubal, L., & Kostolányová, K. (2019). Using augmented reality for teaching students with mental disabilities. AIP Conference Proceedings, 2116(1), 060015. https://doi.org/10.1063/1.5114050
Harrer, M., Cuijpers, P., Furukawa, T., & Ebert, D. (2021). Doing meta-analysis with R (1st ed.). Routledge.
Haruna, H., Zainuddin, Z., Mellecker, R. R., Chu, S. K. W., & Hu, X. (2019). An iterative process for developing digital gamified-sexual health education for adolescent students in low-tech settings. Information and Learning Sciences, 120(11), 723–742. https://doi.org/10.1108/ILS-07-2019-0066
Hedges, L. V. (1981). Distribution theory for Glass’s estimator of effect size and related estimators. Journal of Educational Statistics, 6(2), 107–128. https://doi.org/10.3102/10769986006002107
Hiangsa, M., Srisawasdi, N., & Feungchen, W. (2015). The effect of pedagogy-embedded digital game in primary science education: A comparison of students’ understanding of vitamin. In The 23rd International Conference on Computers in Education (ICCE2015) (pp. 244–251).
Higgins, J. P. T., Thompson, S. G., Deeks, J. J., & Altman, D. G. (2003). Measuring inconsistency in meta-analyses. BMJ: British Medical Journal, 327(7414), 557–560. https://doi.org/10.1136/bmj.327.7414.557
Holz, T., Campbell, A. G., O’Hare, G. M. P., Stafford, J. W., Martin, A., & Dragone, M. (2011). MiRA–mixed reality agents. International Journal of Human-Computer Studies, 69(4), 251–268. https://doi.org/10.1016/j.ijhcs.2010.10.001
Hsiao, K. F., Chen, N. S., & Huang, S. Y. (2012). Learning while exercising for science education in augmented reality among adolescents. Interactive Learning Environments, 20(4), 331–349. https://doi.org/10.1080/10494820.2010.486682
Hsu, M., & Cheng, M. (2014). Bio Detective: Student science learning, immersion experience, and problem-solving patterns. In Proceedings of the 22nd International Conference on Computers in Education, Asia-Pacific Society for Computers in Education, 171–178. https://library.apsce.net/index.php/ICCE/article/view/3090/2966
Huang, K. T., Ball, C., Francis, J., Ratan, R., Boumis, J., & Fordham, J. (2019). Augmented versus virtual reality in education: An exploratory study examining science knowledge retention when using augmented reality/virtual reality mobile applications. Cyberpsychology, Behavior, and Social Networking, 22(2), 105–110. https://doi.org/10.1089/cyber.2018.0150
Hussein, M. H., Ow, S. H., Cheong, L. S., & Thong, M. K. (2019). A digital game-based learning method to improve students’ critical thinking skills in elementary science. IEEE Access, 7, 96309–96318. https://doi.org/10.1109/ACCESS.2019.2929089
Javornik, A. (2016). Augmented reality: Research agenda for studying the impact of its media characteristics on consumer behaviour. Journal of Retailing and Consumer Services, 30, 252–261. https://doi.org/10.1016/j.jretconser.2016.02.004
Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 horizon report. The New Media Consortium.
Johnson-Glenberg, M. C. (2018). Immersive VR and education: Embodied design principles that include mixed reality. Frontiers in Robotics and AI, 5, 81. https://doi.org/10.3389/frobt.2018.00081
Johnson-Glenberg, M. C., & Megowan-Romanowicz, C. (2017). Embodied science and mixed reality: How gesture and motion capture affect physics education. Cognitive Research: Principles and Implications, 2(1), 1–28. https://doi.org/10.1186/s41235-017-0060-9
Johnson-Glenberg, M. C., Birchfield, D., Koziupa, T., & Tolentino, L. (2014). Collaborative embodied learning in mixed reality motion-capture environments: Two science studies. Journal of Educational Psychology, 106(1), 86–104. https://doi.org/10.1037/a0034008
Kalemkus, J., & Kalemkus, F. (2023). Effect of the use of augmented reality applications on academic achievement of student in science education: Meta analysis review. Interactive Learning Environments, 31(9), 6017–6034. https://doi.org/10.1080/10494820.2022.2027458
Kaplan, A. D., Cruit, J., Endsley, M., Beers, S. M., Sawyer, B. D., & Hancock, P. A. (2021). The effects of virtual reality, augmented reality, and mixed reality as training enhancement methods: A meta-analysis. Human Factors, 63(4), 706–726. https://doi.org/10.1177/0018720820904229
Kerawalla, L., Luckin, R., Seljeflot, S. V., & Woolard, A. (2006). Making it real: Exploring the potential of augmented reality for teaching primary school science. Virtual Reality, 10(3–4), 163–174. https://doi.org/10.1007/s10055-006-0036-4
Koops, M. C., Verheul, I., Tiesma, R., de Boer, C. W., & Koeweiden, R. T. (2016). Learning differences between 3D vs. 2D entertainment and educational games. Simulation & Gaming, 47(2), 159–178. https://doi.org/10.1177/1046878116632871
Kordaki, M. (2010). A drawing and multi-representational computer environment for beginners’ learning of programming using C: Design and pilot formative evaluation. Computers & Education, 54(1), 69–87. https://doi.org/10.1016/j.compedu.2009.07.012
Krahn, U., Binder, H., & König, J. (2013). A graphical tool for locating inconsistency in network meta-analyses. BMC Medical Research Methodology, 13, 35. https://doi.org/10.1186/1471-2288-13-35
Kuznetcova, I., Glassman, M., Tilak, S., Wen, Z., Evans, M., Pelfrey, L., & Lin, T. J. (2023). Using a mobile virtual reality and computer game to improve visuospatial self-efficacy in middle school students. Computers & Education, 192, 104660. https://doi.org/10.1016/j.compedu.2022.104660
Lai, A. F., Chen, C. H., & Lee, G. Y. (2019). An augmented reality‐based learning approach to enhancing students’ science reading performances from the perspective of the cognitive load theory. British Journal of Educational Technology, 50(1), 232–247. https://doi.org/10.1111/bjet.12716
Lamb, R., Antonenko, P., Etopio, E., & Seccia, A. (2018). Comparison of virtual reality and hands on activities in science education via functional near infrared spectroscopy. Computers & Education, 124, 14–26. https://doi.org/10.1016/j.compedu.2018.05.014
Lee, E. A. L., & Wong, K. W. (2014). Learning with desktop virtual reality: Low spatial ability learners are more positively affected. Computers & Education, 79, 49–58. https://doi.org/10.1016/j.compedu.2014.07.010
Lin, X., Li, R., Chen, Z., & Xiong, J. (2024). Design strategies for VR science and education games from an embodied cognition perspective: A literature-based meta-analysis. Frontiers in Psychology, 14, 1292110. https://doi.org/10.3389/fpsyg.2023.1292110
Lindgren, R., & Johnson-Glenberg, M. (2013). Emboldened by embodiment: Six precepts for research on embodied learning and mixed reality. Educational Researcher, 42(8), 445–452. https://doi.org/10.3102/0013189X13511661
Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. Sage Publications Inc.
Liu, R., Wang, L., Lei, J., Wang, Q., & Ren, Y. (2020). Effects of an immersive virtual reality-based classroom on students’ learning performance in science lessons. British Journal of Educational Technology, 51(6), 2034–2049. https://doi.org/10.1111/bjet.13028
Liu, S., Grey, B., & Gabriel, M. (2023). “Winning could mean success, yet losing doesn’t mean failure”—Using a mobile serious game to facilitate science learning in middle school. Frontiers in Education, 8, 1164462. https://doi.org/10.3389/feduc.2023.1164462
Lovakov, A., & Agadullina, E. (2021). Empirically derived guidelines for effect size interpretation in social psychology. European Journal of Social Psychology, 51, 485–504. https://doi.org/10.1002/ejsp.2752
Maas, M. J., & Hughes, J. M. (2020). Virtual, augmented and mixed reality in K–12 education: A review of the literature. Technology, Pedagogy and Education, 29(2), 231–249. https://doi.org/10.1080/1475939X.2020.1737210
Makransky, G., & Lilleholt, L. (2018). A structural equation modeling investigation of the emotional value of immersive virtual reality in education. Educational Technology Research and Development, 66, 1141–1164. https://doi.org/10.1007/s11423-018-9581-2
Martin, S., Diaz, G., Sancristobal, E., Gil, R., Castro, M., & Peire, J. (2011). New technology trends in education: Seven years of forecasts and convergence. Computers & Education, 57(3), 1893–1906. https://doi.org/10.1016/j.compedu.2011.04.003
Matovu, H., Ungu, D. A. K., Won, M., Tsai, C. C., Treagust, D. F., Mocerino, M., & Tasker, R. (2023). Immersive virtual reality for science learning: Design, implementation, and evaluation. Studies in Science Education, 59(2), 205–244. https://doi.org/10.1080/03057267.2022.2082680
Maulana, I. (2020). The use of mobile-based augmented reality in science learning to improve learning motivation. Journal of Educational Technology and Online Learning, 3(3), 363–371. https://doi.org/10.31681/jetol.670274
Merchant, Z., Goetz, E. T., Cifuentes, L., Keeney-Kennicutt, W., & Davis, T. J. (2014). Effectiveness of virtual reality-based instruction on students’ learning outcomes in K-12 and higher education: A meta-analysis. Computers & Education, 70, 29–40. https://doi.org/10.1016/j.compedu.2013.07.033
Mikropoulos, T. A., & Natsis, A. (2011). Educational virtual environments: A ten-year review of empirical research (1999–2009). Computers & Education, 56(3), 769–780. https://doi.org/10.1016/j.compedu.2010.10.020
Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems, E77-D(12), 1321–1329.
Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Annals of Internal Medicine, 151(4), 264–269.
Mystakidis, S., Besharat, J., Papantzikos, G., Christopoulos, A., Stylios, C., Agorgianitis, S., & Tselentis, D. (2022). Design, development, and evaluation of a virtual reality serious game for school fire preparedness training. Education Sciences, 12(4), 281. https://doi.org/10.3390/educsci12040281
Parong, J., & Mayer, R. E. (2018). Learning science in immersive virtual reality. Journal of Educational Psychology, 110(6), 785–797. https://doi.org/10.1037/edu0000241
Prensky, M. (2001). Digital game-based learning. McGraw-Hill.
Prensky, M. (2012). From digital natives to digital wisdom: Hopeful essays for 21st century learning. Corwin.
Psotka, J. (2012). Interactive learning environments. In N. M. Seel (Ed.), Encyclopedia of the sciences of learning (pp. 1604–1606). Springer. https://doi.org/10.1007/978-1-4419-1428-6_321
R Core Team. (2021). R: A language and environment for statistical computing. R Foundation for Statistical Computing (4.1.2) [Computer software]. https://www.R-project.org/
Sahin, D., & Yilmaz, R. M. (2020). The effect of augmented reality technology on middle school students' achievements and attitudes towards science education. Computers & Education, 144, 103710. https://doi.org/10.1016/j.compedu.2019.103710
Sala, N. (2021). Virtual reality, augmented reality, and mixed reality in education: A brief overview. In Current and prospective applications of virtual reality in higher education (pp. 48–73). IGI Global. https://doi.org/10.4018/978-1-7998-4960-5.ch003
Sessoms, D. (2008). Interactive instruction: Creating interactive learning environments through tomorrow’s teachers. International Journal of Technology in Teaching and Learning,4(2), 86–96.
Setiawan, B., Rachmadtullah, R., Farid, D. A. M., Sugandi, E., & Iasha, V. (2023). Augmented reality as learning media: The effect on elementary school students’ science processability in terms of cognitive style. Journal of Higher Education Theory and Practice, 23(10).
Shin, D. H. (2017). The role of affordance in the experience of virtual reality learning: Technological and affective affordances in virtual reality. Telematics and Informatics, 34(8), 1826–1836. https://doi.org/10.1016/j.tele.2017.05.013
Sirakaya, M., & Alsancak-Sirakaya, D. (2018). The effect of augmented reality use in science education on attitude and motivation. Kastamonu Education Journal, 26(3), 887–905. https://doi.org/10.24106/kefdergi.415705
Skarbez, R., Smith, M., & Whitton, M. C. (2021). Revisiting Milgram and Kishino’s reality-virtuality continuum. Frontiers in Virtual Reality, 2, 647997. https://doi.org/10.3389/frvir.2021.647997
Sumadio, D. D., & Rambli, D. R. A. (2010). Preliminary evaluation on user acceptance of the augmented reality use for education. In Proceedings of the Second International Conference on Computer Engineering and Applications (pp. 461–465). https://doi.org/10.1109/ICCEA.2010.239
Thees, M., Kapp, S., Strzys, M. P., Beil, F., Lukowicz, P., & Kuhn, J. (2020). Effects of augmented reality on learning and cognitive load in university physics laboratory courses. Computers in Human Behavior, 108, 106316. https://doi.org/10.1016/j.chb.2020.106316
Wang, L. H., Chen, B., Hwang, G. J., Guan, J. Q., & Wang, Y. Q. (2022). Effects of digital game-based STEM education on students’ learning achievement: A meta-analysis. International Journal of STEM Education, 9(1), 1–13. https://doi.org/10.1186/s40594-022-00344-0
Wang, Y. H. (2020). Integrating games, e-books and AR techniques to support project-based science learning. Educational Technology & Society, 23(3), 53–67. https://www.jstor.org/stable/26926426
Weng, C., Rathinasabapathi, A., Weng, A., & Zagita, C. (2019). Mixed reality in science education as a learning support: A revitalized science book. Journal of Educational Computing Research, 57(3), 777–807. https://doi.org/10.1177/0735633118757017
Widiyatmoko, A., Nugrahani, R., Yanitama, A., & Darmawan, M. S. (2023). The effect of virtual reality game based learning to enhance STEM literacy in energy concepts. Jurnal Pendidikan IPA Indonesia, 12(4). https://doi.org/10.15294/jpii.v12i4.48265
Wojciechowski, R., & Cellary, W. (2013). Evaluation of learners’ attitude toward learning in ARIES augmented reality environments. Computers & Education, 68, 570–585. https://doi.org/10.1016/j.compedu.2013.02.014
Wu, H.-K., Lee, S. W.-Y., Chang, H.-Y., & Liang, J.-C. (2013). Current status, opportunities and challenges of augmented reality in education. Computers & Education, 62, 41–49. https://doi.org/10.1016/j.compedu.2012.10.024
Wu, B., Yu, X., & Gu, X. (2020). Effectiveness of immersive virtual reality using head-mounted displays on learning performance: A meta-analysis. British Journal of Educational Technology, 51(6), 1991–2005. https://doi.org/10.1111/bjet.13023
Wu, J., Guo, R., Wang, Z., & Zeng, R. (2021). Integrating spherical video-based virtual reality into elementary school students’ scientific inquiry instruction: Effects on their problem-solving performance. Interactive Learning Environments, 29(3), 496–509. https://doi.org/10.1080/10494820.2019.1587469
Xu, W. W., Su, C. Y., Hu, Y., & Chen, C. H. (2022). Exploring the effectiveness and moderators of augmented reality on science learning: A meta-analysis. Journal of Science Education and Technology, 31(5), 621–637. https://doi.org/10.1007/s10956-022-09982-z
Yen, J.-C., Tsai, C.-H., & Wu, M. (2013). Augmented reality in the higher education: Students’ science concept learning and academic achievement in astronomy. Procedia - Social and Behavioral Sciences, 103, 165–173. https://doi.org/10.1016/j.sbspro.2013.10.322
Yildirim, F. S. (2020). The effect of the augmented reality applications in science class on students’ cognitive and affective learning. Journal of Education in Science Environment and Health, 6(4), 259–267. https://doi.org/10.21891/jeseh.751023
Yildirim, I., & Seckin-Kapucu, M. (2021). The effect of augmented reality applications in science education on academic achievement and retention of 6th grade students. Journal of Education in Science Environment and Health, 7(1), 56–71. https://doi.org/10.21891/jeseh.744351
Yu, S., Liu, Q., Ma, J., Le, H., & Ba, S. (2023). Applying augmented reality to enhance physics laboratory experience: Does learning anxiety matter? Interactive Learning Environments, 31(10), 6952–6967. https://doi.org/10.1080/10494820.2022.2057547
Zeng, H., Zhou, S. N., Hong, G. R., Li, Q. Y., & Xu, S. Q. (2020). Evaluation of interactive game-based learning in physics domain. Journal of Baltic Science Education, 19(3), 484–498. https://doi.org/10.33225/jbse/20.19.484