
Open Access 07-07-2023 | Original research

Impact of an Emergency Remote Teaching Model on Students’ Academic Performance During COVID-19

Authors: Antonio Carrasco-Hernández, Gabriel Lozano-Reina, María Encarnación Lucas-Pérez, María Feliz Madrid-Garre, Gregorio Sánchez-Marín

Published in: Technology, Knowledge and Learning | Issue 1/2024


Abstract

The COVID-19 pandemic posed a major challenge to universities. It forced them to face the urgent need to rapidly transform their traditional onsite teaching into an emergency remote teaching (ERT) model rather than being able to gradually introduce an effective transition to an online model. Based on a sample of 505 students enrolled in the course on Work Organization at the University of Murcia in Spain, this study analyzes the impact of implementing an ERT model on students’ academic performance. Results show that students display superior academic performance in an onsite teaching–learning model compared to the online model adopted during COVID-19. Findings also reveal that students’ self-assessment activities enhance their academic performance—both in onsite and online teaching contexts—which implies that the negative impact of the ERT model on performance can be alleviated by adequately planning self-assessment activities during the course.

1 Introduction

The onset of the COVID-19 pandemic triggered a shockwave of unprecedented proportions in the twenty-first century that had a global impact (Reuge et al., 2021). Every area of society was affected, and people’s lives were turned upside down overnight. The field of higher education was no exception to this extremely complex situation. University lockdown measures meant the end of face-to-face teaching—“nine out of ten of the world’s students were excluded from face-to-face learning” (Heyneman, 2022, p. 2)—and the imposed introduction of online teaching methods (Gopal et al., 2021; Hosseini et al., 2021; Mali & Lim, 2021).
It was initially assumed that online teaching would be adopted normally by universities, considering the digital tools that were already available to them (Bosco-Paniagua & Rodríguez-Gómez, 2008; Castells, 2006; Laviña & Mengual, 2008). However, the pressing need to transform conventional face-to-face teaching into an online model left no room for the planning required to do so correctly (Aldhahi et al., 2022; Selvaraj et al., 2021). In addition, the urgency with which the online model was implemented made it impossible to ensure that those affected would have access to the minimum technological means required, or that they would possess the digital skills or attitudes needed to meet this daunting change (García-Peñalvo et al., 2020; Hosseini et al., 2021).
Such complex circumstances posed a major challenge for universities, which were forced to deal with enormous changes as a result of having to manage and integrate available technological resources. This was true for teachers and students alike, who had to familiarize themselves in record time with new technologies as the only way to access teaching and learning (García-Planas & Taberna, 2021; Torrecillas-Bautista, 2020). From the specific viewpoint of teaching, this meant transforming courses which were designed to be taught face-to-face into an online system (Śliwa et al., 2021), with improvised adaptation of content and of the skills demanded of students, as well as a redesign of assessment criteria (Hodges et al., 2020; Hosseini et al., 2021). In this sense, transferring content to a digital format and modifying the teaching–learning scenario—from an onsite classroom to a virtual one—cannot be considered digital transformation. Genuine online teaching requires proper planning, in which online education itself is the core strategy (Aldhahi et al., 2022; García-Planas & Taberna, 2021). Therefore, what really happened during the COVID-19 pandemic was a sudden change to an Emergency Remote Teaching (ERT) model (Aldhahi et al., 2022; Hodges et al., 2020).
This was the context faced by Spanish universities—including the University of Murcia (the main focus of this research)—from the second semester of the 2019–2020 academic year, in which universities had to integrate online teaching as the only means of learning. In this sudden and uncertain scenario, in which universities needed to resume teaching immediately (Gopal et al., 2021; Kuhfeld & Tarasawa, 2020) and with no time to design specific learning and assessment systems for the online model (García-Aretio, 2021; García-Peñalvo et al., 2020; Grande-de-Prado et al., 2021; Zubillaga & Gortazar, 2020), previous studies have reported several losses in learning (Ardington et al., 2021; Hevia et al., 2022; Kaffenberger, 2021; Kuhfeld & Tarasawa, 2020), leading to a decline in students’ academic performance. Nevertheless, the literature has also pointed out that this decline may be partially mitigated by implementing continuous online self-assessment systems (Selvaraj et al., 2021), since these activities tend to increase student motivation by providing them with continuous feedback on their performance, helping them to control their own learning process and to set new expectations (Carrasco-Hernández et al., 2020; Sánchez-Marín et al., 2018). Self-assessment activities involve specific evaluation tests that allow students to revise and assess their course progress in line with the specific criteria established (Andrade & Valtcheva, 2009). The aim of these criteria is to “identify learner strengths and weaknesses in their own learning process, improve their learning and take responsibility for their learning” (Bayrak, 2022, p. 642).
Given the above, this paper seeks to analyze how the COVID-19 pandemic impacted student performance as a result of the change in the learning environment. The study also examines what impact continuous self-assessment activities—based on the Kahoot platform—had on student performance, comparing the effect in an onsite and in an online context. To empirically analyze these aspects, we focus on the Degree in Labor Relations and Human Resources at the University of Murcia during the academic years 2018–2019 and 2019–2020, drawing on a sample of 505 students who took the course in Work Organization. This study contributes to prior literature by providing evidence of how academic performance in university students is poorer when the method of instruction is an ERT model (i.e., under the teaching–learning context adopted during COVID-19) compared to pre-pandemic onsite teaching. In addition, our research provides new evidence about how online self-assessment activities influence academic performance (both in online and onsite environments) in terms of potentially counterbalancing the negative influence of an ERT model, thereby showing the importance of designing complementarities among online activities in this new teaching–learning environment (Selvaraj et al., 2021).
The work is structured as follows. First, a literature review is carried out in order to contextualize ICT in university teaching from its origins up to the present. The research hypotheses are then explained and supported based on previous theoretical and empirical evidence. Third, the methodology section offers details of the sample, data, variables, and analyses used to test the hypotheses. Fourth, the results obtained are then provided. The last section offers a discussion of the main conclusions, together with the limitations and future lines of research.

2 Research Perspective

2.1 Student Academic Performance Onsite and Online

Over time, and as information and communication technologies (ICT)—defined as human interaction via the use of technological devices (Reddy et al., 2022)—have progressed and developed in the world of training and education, online teaching methods have gained ground and improved, enabling distance learning environments to be created that add flexibility to traditional face-to-face teaching (Fainholc, 2008). This has been especially important in the world of university education, where the introduction of innovative methods has taken on a key role, seeking not only to enhance student motivation but also to place students at the heart of their teaching–learning process, endowing them with a more active and independent role (Chien-Hung et al., 2014; Delgado-García & Oliver-Cuello, 2009; García-Beltrán et al., 2016; Salinas, 2004). Added to this, globalization has meant that universities must adapt to the new social reality if they are to avoid being left behind and losing competitiveness. In this regard, university higher education institutions are increasingly offering potential students the chance to choose any kind of teaching, regardless of where they live; in fact, according to a study carried out by the International University of La Rioja (UNIR) in 2018, online education has grown by 900% on a global scale since the year 2000 (UNIR, 2018).
However, not all universities have the same “open orientation” towards online teaching methods. Some argue that onsite teaching offers unrivalled advantages compared to online teaching since it allows for personal contact with the teacher, social interaction with classmates, and an atmosphere in which it proves easier to persevere as well as retain attention and interest (Aguilar-Gordón, 2020; García-Peñalvo et al., 2020; Jiménez-Galán et al., 2021). It is also believed that online teaching is characterized by a lack of social interaction, forcing the student to take greater responsibility for their own learning, to make a greater effort in terms of organization and self-regulation, and to seek intrinsic elements of motivation from which to draw support (Gherheș et al., 2021; Selvaraj et al., 2021; Śliwa et al., 2021). Nevertheless, the spread and development of online contexts has changed the way online teaching is viewed socially, given that many of the associated limitations have gradually begun to disappear due to the enormous potential it can currently offer users in terms of time and space flexibility, personalized follow-up of their educational progress, range of available courses, or online interaction with teachers and classmates (Gherheș et al., 2021). In fact, based on the achievement goal theory (Elliot, 2005), face-to-face teaching methods encourage students to accomplish their academic goals (Clayton et al., 2010) while online teaching methods can also offer useful tools and resources to increase students’ academic performance (Azlan et al., 2020; Gopal et al., 2021). Table 1 shows the main characteristics associated with onsite and online teaching–learning environments.
Table 1
Onsite versus online environments: main features
Source: Own elaboration based on Aguilar-Gordón (2020), Carrillo (2021), García-Peñalvo et al. (2020), Gherheș et al. (2021), Jiménez Galán et al. (2021), Selvaraj et al. (2021), Śliwa et al. (2021), Soler et al. (2018)
 
Description
 Onsite: Teaching takes place in a physical classroom
 Online: Teaching takes place through digital media
Communication type
 Onsite: Synchronous
 Online: Synchronous and/or asynchronous
Learning pace and leading role
 Onsite: Leading role of the teacher, traditionally through the lecture; students have a more passive role (notwithstanding the increasingly widespread use of active methodologies)
 Online: Leading role of the student, who is responsible for their own learning, for being self-taught and for playing an active role; the teacher acts as a guide
Material support
 Onsite: Predominantly physically based
 Online: Digital
Main tools
 Onsite: Lecture, interactive presentations, face-to-face tutorials; when implementing active methodologies, typical tools used in online environments can be adapted
 Online: Digital content management tools, communication and collaboration tools (such as forums and chats), and digital monitoring and evaluation tools (such as self-assessment and peer evaluation) are prominent; student identity verification tools should be implemented
Assessment tasks
 Onsite: Assessment is basically based on exams (oral or written), practical cases, workshops, and/or student observation
 Online: The range of assessment is expanded: beyond virtual exams and practical exercises, digital tools for assessing learning, results, processes, and competencies are prominent (e.g., forums, wikis, portfolios, project development, collaborative work between students from different countries); there is a need to properly authenticate and verify the identity of the student
Teaching–learning approach
 Onsite: Focused on content if based only on the lecture; or focused on learning if the teacher encourages the active participation of the student and guides them in their learning
 Online: Focused on content if limited to providing materials and assessing them; or focused on learning if the potential of digital tools is used
Advantages
 Onsite: Closer relationship with the teacher; promotes human contact and socialization through day-to-day coexistence; improved communication skills (verbal and non-verbal)
 Online: Greater flexibility, accessibility and autonomy; elimination of geographical barriers and a more varied offer of studies; greater dynamism; greater speed and ease when updating digital materials; more adapted to the student’s own pace
Disadvantages
 Onsite: Higher economic costs; greater rigidity at the schedule level, which makes it more difficult to achieve a student/family life balance; greater difficulty and more time required to update materials (normally analog); all students (regardless of their needs) must adapt to the pace of the group
 Online: Demands greater self-discipline; less direct contact with the teacher; increased difficulty socializing; greater complexity for students not adapted to ICTs and/or students with few resources; possible reluctance from certain teachers and students; greater difficulty for students with special needs
Whatever the case, the COVID-19 pandemic forced universities to close their classrooms and to move their teaching models to an online environment (Aldhahi et al., 2022; Aykan & Yıldırım, 2022; Mali & Lim, 2021). Although online education had previously been one “option” open to students and teachers in certain courses or degrees, the onset of COVID-19 meant that online teaching was no longer an option but rather the “only” method, because of the prevailing circumstances (Hosseini et al., 2021; Kaffenberger, 2021; Mali & Lim, 2021). With little or no room for maneuver, it also had to be implemented urgently, leaving scant opportunity to apply the rules concerning planning and programming that characterize online teaching environments (Aldhahi et al., 2022). However, the change from the traditional teaching–learning context to the online one adopted during COVID-19 caused no small amount of chaos, leading to problems that affected students’ academic performance (for example, by making contact and interaction between teachers/professors and students more difficult) (Śliwa et al., 2021).
According to Hodges et al. (2020), we should not confuse a transition towards an online teaching model with the change to an ERT model (Aldhahi et al., 2022). Developing an online teaching model implies design and planning of enormous complexity, given the huge number of critical dimensions and variables to be considered. These include (Cabero-Almenara, 2006): how content is presented, the role of both teacher and student, the communication tools to be used, the teaching strategies to be followed, the assessment systems to be applied, e-activities, as well as different organizational factors. In addition, each dimension offers different options, and all of the decisions taken must be consistent with one another, which highlights the enormous complexity involved in transforming an onsite teaching model into an online one.
The abruptness with which universities were forced to close and the need to resume teaching immediately left the former with no time to design an education system that requires experience and years of development (Aldhahi et al., 2022; García-Aretio, 2021; García-Peñalvo et al., 2020; Grande-de-Prado et al., 2021; Zubillaga & Gortazar, 2020). In most cases, the transition involved digitizing the content that had already been prepared, replacing onsite lessons with virtual synchronous or asynchronous lessons (García-Peñalvo et al., 2020), and applying an online teaching method based on the pedagogical design of face-to-face teaching (García-Aretio, 2021). In sum, the educational model most commonly implemented by universities was not a real online teaching model but an ERT model aimed at providing temporary teaching and educational distance support or making use of means that proved easy to organize whilst also being available in a reliable manner during an emergency (Aldhahi et al., 2022; Hodges et al., 2020). This unexpected transition towards an ERT model can be a hindrance to one of the main challenges facing the world of universities, i.e., implementing methodologies that allow teaching–learning processes to be improved by developing active methodologies (Granados-Romero et al., 2022; López-Gutiérrez et al., 2022). The favorable effects that result from applying an online teaching method with active methodologies—as stated by the achievement goal theory (Elliot, 2005; Gopal et al., 2021)—might therefore disappear in an ERT context.
Considering all of the above, it seems logical to think that the results achieved with the ERT model applied as a result of the pandemic would differ (unfavorably) from what is to be expected from a real online or onsite education model. In fact, prior literature has identified several learning losses after the closure of classrooms due to the pandemic (Ardington et al., 2021; Hevia et al., 2022; Kaffenberger, 2021; Kuhfeld & Tarasawa, 2020), which negatively impacted students’ academic performance. This negative impact arose because universities lacked the necessary room for maneuver to make an effective transition to an online environment and were instead forced to adopt an ERT model overnight. The results obtained during COVID-19 might therefore have been negatively affected by the difficulties involved in adapting to this new context in each of the areas pointed out above, leading to poorer student performance compared to previous academic years. This leads us to the first hypothesis, which reads as follows:
Hypothesis 1
The academic performance of students onsite is superior to that obtained online during the COVID-19 pandemic.

2.2 The Role Played by Self-assessment Activities

With the onset of the COVID-19 pandemic, two main challenges arose regarding assessment tools. On the one hand, university students face the challenge of being able to build their own learning and of having the ability to make decisions, assuming responsibility for their acts and their consequences (Cruz-Núñez & Quiñones-Urquijo, 2012). Greater student involvement in the assessment process is thus required to promote skills such as individual reflection, critical judgment, or identifying one’s own learning gaps (Rodríguez-Gómez et al., 2011). On the other hand, the dramatic shift towards an ERT model also meant establishing assessment tools that were consistent with this pedagogical strategy, adapted to the features of this model, and helpful in achieving the goals set out (Selvaraj et al., 2021). Implementing fragmented assessment instruments, or ones not aligned with the teaching model, might cause tension in the learning process and become mere punitive elements rather than contributing to learning (De Vincenzi, 2020).
In this context, weeks before the end of the 2019/2020 academic year, professor García-Peñalvo (2020a) published a short article entitled “Online assessment: the perfect storm”, referring to the concurrence of a series of circumstances which could significantly aggravate the damage caused in an adverse situation (e.g., an unsuitable transformation of assessment systems; the negative opinion held by many teachers of online assessment; technological gaps, differences in the use of technologies, and unequal digital competence; or the lack of any clear support from political and academic authorities) (García-Peñalvo, 2020a). Added to this is the fact that universities were concerned that it was not possible to detect cheating in online exams, such that many of them implemented measures designed to “track students’ facial expressions, voice, location and browsing behaviors in computers to avoid cheating” (Dindar et al., 2022, p. 13). In order to deal with this situation, most universities established a series of guidelines and recommendations which, to a large extent, satisfied those involved and enabled assessment to go ahead without any major problems.
In addition to introducing a range of assessment tools that enable different skills to be developed in the student in the new university context, self-assessment should also be given a leading role, not only as an evaluation tool but also as a learning tool. This is because self-assessment may be considered as a “formative process” wherein students are able to progressively gauge their own academic progress (Andrade & Valtcheva, 2009; Bayrak, 2022), understand their failings and benefit from this information in order to develop their self-regulatory skills. These activities allow students to assume an active role in the learning process, engage in greater self-monitoring and take more responsibility for their learning process (Cruz-Núñez & Quiñones-Urquijo, 2012; Díaz-Mendoza et al., 2022). Self-assessment activities are thus used for “training” purposes, including sets of questions that help students to better understand the course and so prepare them to pass the final exam (Díaz-Mendoza et al., 2022; Mallén-Broch & Domínguez-Escrig, 2014). Self-assessment activities also allow teachers to identify the trickiest points in each topic so that they can then explain them in greater depth (Díaz-Mendoza et al., 2022).
Given that self-assessment helps to pinpoint weaknesses and strengths, when used regularly it increases both students’ involvement with the course and their academic performance (Carrasco-Hernández et al., 2020; López-Pérez et al., 2011; Sánchez-Marín et al., 2018). Considering the existence of several virtual platforms as well as technological advances which favor effective feedback through self-assessment (Bayrak, 2022), online self-assessment activities were used during the pandemic and have since become increasingly important. These online self-assessment activities can be defined as any electronic self-assessment process in which ICT are used to present and carry out self-assessment activities and tasks as well as to record activities (Rodríguez-Gómez et al., 2011). Online self-assessments are often channeled through apps (like Kahoot, Wooclap, or Moodle) that make gamification a fun way to acquire knowledge and develop skills (Fernández-Arias et al., 2020) and to enhance productivity in different learning contexts (Saleem et al., 2022). This makes the student the true center of their learning process, enhances their commitment and responsibility (Delgado-García & Oliver-Cuello, 2009), and ultimately favors academic performance (Mallén-Broch & Domínguez-Escrig, 2014). In a similar vein, in their meta-analysis, Mula-Falcón et al. (2022) and Yu (2021) evidence that gamification tends to positively impact students’ motivation and academic performance.
In sum, online self-assessment activities are very valuable because, in addition to suiting both blended-learning and virtual learning contexts, they allow students to acquire both the skills associated with self-assessment (i.e., responsibility, feedback, self-criticism, self-reflection) and those associated with current ICT development (Rodríguez-Gómez et al., 2011). Learning-oriented assessment thus becomes particularly important (Carless et al., 2006), with self-assessment activities being expected to have a positive impact on students' academic performance—not only in face-to-face teaching–learning contexts (Díaz-Mendoza et al., 2022; Gimeno-Santos & Gallego-Matas, 2007) but also in online contexts (Mallén-Broch & Domínguez-Escrig, 2014; Rodríguez-Gómez et al., 2011). This is due to the need to engage the student in the assessment process and for the feedback provided by the assessment to be prospective (Rodríguez-Gómez et al., 2011); in other words, to provide the student with information that is really useful for future tasks or decisions (Carrasco-Hernández et al., 2020). This may mean that the uncertainty and difficulties associated with implementing an ERT model tend to be reduced when such activities are adopted. We thus expect continuous assessment based on self-assessment activities to improve students’ academic performance, both in onsite and in online environments. Based on these arguments, the second hypothesis is stated as follows:
Hypothesis 2
Continuous assessment based on self-assessment activities improves student academic performance, both in onsite as well as in online environments.

3 Methodology

3.1 Sample and Data

Based on the previous literature addressing academic and teaching performance (Carrasco-Hernández et al., 2020; Iwamoto et al., 2017), this study employs a convenience sampling technique, which proves useful for creating homogeneous teaching research environments that enable comparisons of student performance to be drawn (Sukri et al., 2020). In the field of educational academic research, researchers often make use of convenience sampling (Iwamoto et al., 2017), since this technique—as opposed to random sampling—is useful for studies oriented to teaching methodologies (Yoo & Donthu, 2001). The main reason is that when studying the application of active methodologies—such as Kahoot—there is no a priori list of courses indicating whether they use Kahoot within their teaching methodologies. Convenience sampling thus allows the focus to fall on those courses which meet a particular study criterion (use of the Kahoot app, in our case) (Sabbah, 2015). Moreover, these environments are controlled by the researchers, who tend to be teachers in these environments (in this case, in the course on Work Organization). The researchers are also very familiar with both the courses and the students involved (Iwamoto et al., 2017). Researcher participation means that the chosen sample of students is closely connected to the development of the research, such that the research data are reliable and easy to collect (Denscombe, 2003). Convenience sampling thus becomes an attractive tool for carrying out studies in certain fields such as teaching methodologies.
Taking the above into account, the sample is made up of 505 students enrolled on the course in Work Organization in the Degree in Labor Relations and Human Resources at the University of Murcia in Spain.1 The aim of this course is to introduce first-year students to work organization issues through practical models, concepts, and graphic tools, providing them with a comprehensive vision and various tools to organize work in any organization—whatever the activity. Specifically, the four groups that take this course were analyzed during the academic years 2018/2019 and 2019/2020. Since Work Organization is taught in the second semester, and teaching for this course was delivered online during the second academic year, the 2018/2019 academic year is considered a “pre-COVID” year whereas the 2019/2020 academic year is considered as a “COVID” one. The learning difficulty in terms of course content remained the same in both data collection timeframes. The specific distribution of the 505 students who made up the sample was as follows: (a) academic year 2018/2019: 70 in group A, 69 in group B, 44 in group C, and 70 in group D; and (b) academic year 2019/2020: 75 in group A, 76 in group B, 56 in group C, and 45 in group D. A summary of the course details can be seen in Table 2.
Table 2
Course details for Work Organization
General information
• Degree: Labor Relations and Human Resources
• Year: first
• Type: compulsory
• Number of groups: four
• Term taught: second
• Type of teaching: campus-based course
Course contents
Contents are divided into the following topics:
 1. Introduction to Work Organization
 2. Production systems planning
 3. Project management
 4. Quality management
 5. Just-in-time
 6. Improvement of methods and times
Teaching methods
It is comprised of the following learning activities:
 • Lectures (workload: 30 h)
 • Individual and group tutorships (workload: 6 h)
 • Practical lessons, seminars, and learning by projects (workload: 24 h)
 • Independent learning (workload: 90 h)
Assessment system
(A) Final exam (55%):
 • Part I (15%): A multiple-choice test with four alternative answers, which will include up to 30 questions, whose score can range from 0 to 10
 • Part II (40%): Three open questions (including theory and practical issues), whose score can range from 0 to 10
(B) Self-assessment and practical activities (40%):
 • Self-assessment activities (25%): Several multiple-choice tests (which came to six) of four alternative answers, which will include up to 10 questions per test, whose score can range from 0 to 10 per test
 • Practical activities (15%): Practical activities scheduled during the semester will be evaluated. These activities are announced at the beginning of the course, and the score can range from 0 to 10 per activity
(C) Student participation (5%): refers to student participation in the different activities scheduled during the semester
Data for the study were taken from three sources: (1) surveys carried out directly in the classroom, asking students for their permission to use information from the academic database for the purposes of the study (specifically, names, surnames, class group, place of residence, and previous experience in the course); (2) examinations and self-assessment tests conducted by the students during the two academic years analyzed, and which enable us to measure their academic performance as well as participation and grades from the continuous assessment tests; and (3) from the University of Murcia Virtual Classroom, which allows us to verify the personal information obtained through the surveys (cross-checking all the data collected from students with the data contained on the university database) as well as gather the other details required to carry out our analyses (e.g., students’ previous experience on the course, and class group). Information was collected during the second semester of the 2018/2019 academic year—between January 2019 and June 2019—and during the second semester of the 2019/2020 academic year—between January 2020 and June 2020. Data in this study were used anonymously and aggregately.

3.2 Variables

The definition and measurements of the variables used in the empirical study are shown in Table 3. Broadly speaking, learning outcomes are measured through three different variables (academic performance, success rate, and performance rate). Participation in self-assessment activities as well as the scores obtained in these tests are also measured. Finally, there are four control variables.
Table 3
Definition of variables
Learning outcomes
Academic performance
Measured on a continuous variable (from 0 to 10) that represents the grade obtained by students in the final exam for the course
The final exam has two different parts: The first comprises a multiple-choice test of four alternative answers, which will include up to 30 questions; while the second comprises three open questions (including theory and practical issues)
Success rates
A continuous scale from 0 to 100% representing the percentage of students who pass the course out of the total number of students who take the exam and are assessed
Performance rates
A continuous scale from 0 to 100% representing the percentage of students who pass the course out of the total number of students enrolled
Participation and grades in self-assessment
Participation in self-assessment
Measured by counting the number of self-assessment tests taken on the Kahoot platform for each student out of the total number of tests available (which came to six)
Grades in self-assessment
Measured on a continuous variable (from 0 to 10) that represents the grade obtained by the students in the self-assessment tests taken through the Kahoot platform
Each of these activities comprises a multiple-choice test of four alternative answers, which will include up to 10 questions
Several screenshots showing an example of a self-assessment activity are shown in Appendix A
Control variables
Gender
Measured through a dichotomous variable that takes the value 1 when the gender is “female”, and 0 when the gender is “male”
Access to resources
Measured by the distance (in kilometers) from each student’s place of residence to the Faculty of Work Sciences (located on the Espinardo Campus at the University of Murcia) where the degree is taught
Students’ previous experience on the course
Measured through a continuous variable that indicates the number of times the student took the examination (including the current one)
Class group
This refers to the group which each student enrolled in the course on Work Organization belongs to. There are four groups in all (A, B, C or D). The first two are morning groups (A and B), and the latter two are afternoon/evening groups (C and D)
More specifically, as regards self-assessment activities it should be highlighted that throughout the course the students carried out a total of six self-assessment activities through Kahoot. Each Kahoot activity corresponded to one of the course topics (see Table 2), since these activities must be undertaken regularly if they are to prove effective, such that the student progressively checks their level of learning and is able to redirect it (Cruz-Núñez & Quiñones-Urquijo, 2012). Kahoot thus “provides a new way to draw students’ attention, enhance their mastery of knowledge, improve their performance, and better their academic achievements” (Yu, 2021, p. 2). In addition, each of these self-assessment activities comprises a multiple-choice test of four alternative answers (and a single valid answer), including up to 10 questions. From these self-assessments, two different variables are built: (i) participation in self-assessment; and (ii) grades in self-assessment, as defined in Table 3. These activities account for up to 20% of the final grade for the course and are considered a “continuous assessment” tool. Students are highly motivated to participate in these activities because the questions posed in each of the six self-assessments were similar to the multiple-choice questions set in the final exam. Self-assessment activities thus help students to better understand the course and to better prepare themselves to pass the final exam, which ultimately helps boost their academic performance (Díaz-Mendoza et al., 2022).
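To make the construction of these two variables concrete, the following minimal Python sketch derives them from a hypothetical per-student record of Kahoot results. The data structure, field names, and the use of the mean across tests taken as the aggregate grade are illustrative assumptions; the paper does not describe the export format or the exact aggregation rule used.

```python
# Sketch: building the two self-assessment variables from hypothetical
# per-student Kahoot results. Field names and the averaging rule are
# illustrative assumptions, not the authors' actual procedure.
from statistics import mean

N_TESTS = 6  # six self-assessment activities, one per course topic

def self_assessment_variables(kahoot_scores):
    """kahoot_scores maps the test number (1-6) to the 0-10 grade obtained,
    for the tests the student actually took."""
    participation = len(kahoot_scores)        # participation in self-assessment (0-6)
    grade = mean(kahoot_scores.values()) if kahoot_scores else None
    return participation, grade

# Example: a student who took four of the six Kahoot tests
print(self_assessment_variables({1: 7.5, 2: 6.0, 4: 8.2, 6: 5.5}))  # -> (4, 6.8)
```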

3.3 Analysis

SPSS v.25 software was used to carry out the analysis. Certain statistical tests were first conducted to ensure the quality of the data analyzed. First—and following the recommendations of Hair et al. (2019)—we made sure there were no suspicious response patterns or common method bias problems (Podsakoff et al., 2003). Second, we ensured that none of the variables used had a high number of missing cases, such that none had to be excluded for exceeding the 15% limit of missing cases (Hair et al., 2019). Third, no problems were found concerning non-response bias—following the procedure of Hair et al. (2013)—nor were there unusual cases with anomalous values exceeding the established limits. Finally, we examined univariate measures of skewness and kurtosis and applied the Kolmogorov–Smirnov and Shapiro–Wilk tests (Hair et al., 2010). The tests carried out show normal univariate deviations—with absolute skewness and kurtosis values below 1—indicative of a normal data distribution.
To test the hypotheses—and given the normality of the variables—we used parametric tests (Hair et al., 2010). We performed independent-samples t tests and multiple linear regression models, with non-parametric tests of the confidence intervals of the coefficients. Prior to examining the t test results, we verified equality of variance between the students who took the course onsite and those who took it online by using the Levene test (Hair et al., 2010). For the linear regression models, we conducted error normality and homoscedasticity tests (Hair et al., 2010) and found that the parameters fall within the limits accepted in the literature.
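Although the analyses were run in SPSS v.25, the same pipeline can be sketched in Python for readers who wish to replicate it. In the sketch below, the CSV file, the DataFrame `df`, and the column names are illustrative assumptions (the dataset itself is not public); standardized coefficients are obtained by z-scoring all variables before fitting OLS, which mirrors how standardized betas are reported.

```python
# Sketch of the analysis pipeline (the authors used SPSS v.25). File and
# column names are illustrative assumptions, not the authors' actual dataset.
import pandas as pd
from scipy import stats
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("work_organization_students.csv")  # hypothetical file

# Normality check on the outcome (Shapiro-Wilk)
print(stats.shapiro(df["academic_performance"]))

# Levene test for equality of variances, then an independent-samples t test
onsite = df.loc[df["year"] == "2018/2019", "academic_performance"]
online = df.loc[df["year"] == "2019/2020", "academic_performance"]
print(stats.levene(onsite, online))
print(stats.ttest_ind(onsite, online, equal_var=True))

# Multiple linear regression with standardized coefficients and VIFs,
# estimated separately for each academic year (as in Table 8)
predictors = ["access_to_resources", "previous_experience", "gender",
              "self_assessment_mark", "self_assessment_participation"]
for year, sub in df.groupby("year"):
    z = sub[predictors + ["academic_performance"]].apply(
        lambda col: (col - col.mean()) / col.std())   # z-scores -> standardized betas
    X = sm.add_constant(z[predictors])
    fit = sm.OLS(z["academic_performance"], X).fit()
    vifs = [variance_inflation_factor(X.values, i) for i in range(1, X.shape[1])]
    print(year, fit.params.round(2).to_dict(), "VIF:", [round(v, 2) for v in vifs])
```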

4 Results

4.1 Descriptive Statistics and Correlations

Table 4 shows the student profile in the sample. Most students (57.8%) were aged between 18 and 20 and were enrolled on the course for the first time (73.9%). Over half of the students were female (56.2%), which is slightly above the figure published in the Conference of Rectors of Spanish Universities (CRUE) report on Spanish universities in figures (Hernández-Armenteros & Pérez García, 2019), where the figure was 54.8%.
Table 4
Profile of students in the sample
 
Student profile                   Group A    Group B    Group C    Group D     Total
2018/2019 academic year
 Students                              70         69         44         70       253
 Gender: Male                       50.0%      44.9%      36.4%      34.3%     41.9%
 Gender: Female                     50.0%      55.1%      63.5%      65.7%     58.1%
 Age: 18–20 years                   61.4%      68.1%      27.3%      71.4%     60.1%
 Age: 21–25 years                   31.4%      24.6%      59.1%      24.3%     32.4%
 Age: 26 years or more               7.1%       7.2%      13.6%       4.3%      7.5%
 Experience: first time enrolled    71.4%      60.9%      56.8%      85.7%     70.0%
 Experience: second time            18.6%      26.1%      34.1%      14.3%     22.1%
 Experience: third time or more     10.0%      13.0%       9.1%       0.0%      7.9%
2019/2020 academic year
 Students                              75         76         56         45       252
 Gender: Male                       48.0%      47.4%      46.4%      37.8%     45.6%
 Gender: Female                     52.0%      52.6%      53.6%      62.2%     54.4%
 Age: 18–20 years                   61.3%      68.4%      26.8%      60.0%     55.6%
 Age: 21–25 years                   30.7%      23.7%      57.1%      33.3%     34.9%
 Age: 26 years or more               8.0%       7.9%      16.1%       6.7%      9.5%
 Experience: first time enrolled    76.0%      81.6%      67.9%      86.7%     77.8%
 Experience: second time            10.7%       7.9%      19.6%       8.9%     11.5%
 Experience: third time or more     13.3%      10.5%      12.5%       4.4%     10.7%
Total
 Students                             145        145        100        115       505
 Gender: Male                       49.0%      46.2%      42.0%      35.7%     43.8%
 Gender: Female                     51.0%      53.8%      58.0%      64.3%     56.2%
 Age: 18–20 years                   61.4%      68.3%      27.0%      67.0%     57.8%
 Age: 21–25 years                   31.0%      24.1%      58.0%      27.8%     33.7%
 Age: 26 years or more               7.6%       7.6%      15.0%       5.2%      8.5%
 Experience: first time enrolled    73.8%      71.7%      63.0%      86.1%     73.9%
 Experience: second time            14.5%      16.6%      26.0%      12.2%     16.8%
 Experience: third time or more     11.7%      11.7%      11.0%       1.7%      9.3%
Table 5 shows the main descriptive statistics as well as the correlations between the variables. Students’ academic performance in Work Organization (measured by the final grade obtained in the examination) is around 6, although the data vary substantially. With regard to the self-assessment tests taken through Kahoot, students took an average of between three and four tests, with the marks for these tests averaging between 5.2 and 7.3. Furthermore, students live an average of 38 km from the Faculty of Work Sciences, were mainly female, and were mostly taking the course for the first time. Finally, prominent amongst the correlations are those between academic performance and participation/marks in the self-assessment activities. Also worthy of note are the correlations between participation in the Kahoot tests and the marks obtained therein.
Table 5
Mean, standard deviation, and Pearson correlations
 
                                      Mean     SD    (1)      (2)      (3)      (4)      (5)      (6)      (7)      (8)      (9)     (10)
(1) Access to university resources   38.01  46.64   1.00
(2) Gender                            0.56   0.50   0.06     1.00
(3) Previous examinations taken       1.13   0.91   0.03    −0.01     1.00
(4) Self-assessment mark 1            5.65   1.59  −0.04    −0.07     0.15*    1.00
(5) Self-assessment mark 2            5.23   2.59  −0.07    −0.06     0.14*    0.47*    1.00
(6) Self-assessment mark 3            6.53   2.49  −0.02    −0.18*   −0.01     0.28*   −0.07     1.00
(7) Self-assessment mark 4            7.03   2.34  −0.02    −0.06     0.06     0.39*    0.03     0.77*    1.00
(8) Self-assessment mark 5            7.33   1.75  −0.01    −0.05     0.05     0.41*    0.49*    0.13*    0.28*    1.00
(9) Self-assessment mark 6            6.20   1.98  −0.18*   −0.07     0.05     0.40*    0.36*    0.39*    0.37*    0.41*    1.00
(10) Participation in Kahoot          3.59   2.38   0.05    −0.02     0.12***  0.29*** −0.01     0.52***  0.34***  0.04     0.34***  1.00
(11) Academic performance             5.95   2.37   0.01    −0.01    −0.32*    0.29*    0.49*   −0.02     0.03     0.36*    0.37*    0.21***
***p < 0.01; **p < 0.05; and *p < 0.10
As regards the measures taken to prevent students from cheating in exams, the University of Murcia—before starting any online exams (which are conducted through the “exams” tool of the virtual classroom)—requires individual admission through the videoconference tool where each student needs to show their student ID card or passport and to turn on their computer camera and microphone in order to have their identity verified. All students had to keep the videoconference session open, and their camera and microphone activated until the end of the exam—allowing the teacher to view/monitor the student’s face and/or hands at any time. Voluntary exit from the application or disconnection from the camera was deemed to be equivalent to voluntarily leaving the examination. After that, students started their session in the virtual classroom and accessed the “exams” tool to start their examination. Moreover, as regards the multiple-choice test questions, both questions and answers were randomized so as to minimize opportunistic behavior.

4.2 Hypotheses Testing

Table 6 shows the results of the t tests on the differences between the mark students obtained in the examination for the course in Work Organization, comparing the 2018/2019 and 2019/2020 academic years. Specifically, the first two columns show information concerning the mark obtained in the examination, group size, and standard deviation in terms of academic year, while the following two columns provide information on the t statistic and the significance value (p). As regards the rows, the first row in the table shows the result for the whole sample, while the following four rows offer information for each of the groups examined. Considering the results, it can be seen that the marks obtained in the 2018/2019 (pre-COVID) academic year are higher than those for the 2019/2020 (COVID) academic year. Specifically, t test values are statistically significant in all the groups, except in group C. This supports the first hypothesis, given that the marks obtained by students in the onsite format are higher than those obtained in the online format adopted during the COVID-19 pandemic.
Table 6
Differences in academic performance between pre-COVID (2018/2019) and COVID academic years (2019/2020)
                   Final course mark                                     Statistical test
Sample             2018/2019 (pre-COVID)     2019/2020 (COVID)           Student t      Sig.
Total              6.66 (167 | 2.34)         5.22 (156 | 2.18)           5.81           0.00
Group A            7.25 (47 | 1.60)          5.42 (50 | 2.46)            4.35           0.00
Group B            6.43 (47 | 2.82)          4.91 (49 | 1.85)            3.12           0.00
Group C            6.08 (29 | 2.95)          5.11 (33 | 2.66)            1.36           0.18
Group D            6.68 (44 | 1.88)          5.49 (34 | 1.67)            2.90           0.01
Cells show the mean mark (N | standard deviation)
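As an aside for readers without access to the raw data, the group-level t values in Table 6 can be approximately cross-checked from the reported means, standard deviations, and group sizes using a pooled-variance t test; small discrepancies are expected since the published values were computed from the raw data in SPSS. A sketch for Group A:

```python
# Cross-check of the Group A t value in Table 6 from summary statistics
# (pooled-variance t test, consistent with the Levene checks in Sect. 3.3).
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(mean1=7.25, std1=1.60, nobs1=47,   # 2018/2019 (pre-COVID)
                            mean2=5.42, std2=2.46, nobs2=50,   # 2019/2020 (COVID)
                            equal_var=True)
print(round(t, 2), round(p, 4))  # t ≈ 4.3, close to the 4.35 reported
```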
Table 7 also shows the success rate and performance rate of students for the course in Work Organization. Of the total of 505 students enrolled, 253 correspond to the 2018/2019 academic year; of the 167 students who took the exam in June, 128 passed. Of the 252 students who correspond to the 2019/2020 academic year, 166 took the exam and 96 passed in June. Dividing the number of students who passed the exam by the number of students enrolled gives the performance rate for each academic year, while dividing the number of students who passed the exam by the number of students who sat the exam gives the success rate. Both the performance rates and the success rates in the online format adopted during COVID-19 are worse than those in the onsite format. These results provide further support for our first hypothesis.
Table 7
Success rates and performance rates by academic year
 
                      Total     2018/2019 (pre-COVID)    2019/2020 (COVID)
Enrolled [N]            505                       253                  252
Took the exam [N]       333                       167                  166
Passed [N]              224                       128                   96
Performance rate     44.35%                    50.59%               38.09%
Success rate         67.26%                    76.64%               57.83%
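The rates in Table 7 follow directly from the raw counts, as the short sketch below reproduces (the printed values match the table up to rounding in the second decimal):

```python
# Recomputing the performance rate (passed / enrolled) and success rate
# (passed / took the exam) from the counts reported in Table 7.
counts = {
    "2018/2019 (pre-COVID)": {"enrolled": 253, "took_exam": 167, "passed": 128},
    "2019/2020 (COVID)":     {"enrolled": 252, "took_exam": 166, "passed": 96},
}
for year, c in counts.items():
    performance = 100 * c["passed"] / c["enrolled"]
    success = 100 * c["passed"] / c["took_exam"]
    print(f"{year}: performance {performance:.2f}%, success {success:.2f}%")
# 2018/2019 (pre-COVID): performance 50.59%, success 76.65%
# 2019/2020 (COVID): performance 38.10%, success 57.83%
```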
Table 8 shows the regression analysis results for the 2018/2019 academic year (pre-COVID) [model 1] and for the 2019/2020 academic year (COVID) [model 2], where the reported VIF values indicate low levels of multicollinearity (Hair et al., 2010, 2019). In model 1, the coefficients indicate that both participation in self-assessment activities (β = 0.32, p < 0.01) and the marks obtained in the self-assessment tests (β = 0.32, p < 0.01) exert a positive and significant effect on the final mark for the course. Results also show that students with prior experience of the course obtained worse marks (β = − 0.22, p < 0.01) and that, as regards student gender, women attained higher marks in the final examination (β = 0.16, p < 0.05). Results for the 2019/2020 (COVID-19) academic year are shown in model 2. The coefficients show that participation in self-assessment activities (β = 0.18, p < 0.01) and the mark obtained in the self-assessment tests (β = 0.46, p < 0.01) improve student academic performance in the online modality. Moreover, students who were not repeating the course (β = − 0.29, p < 0.01) as well as female students (β = 0.17, p < 0.01) gained the best marks, similar to what was found in model 1. In sum, participation in self-assessment and the marks obtained therein enhance student performance both in the onsite (2018/2019 academic year) and in the online teaching format (2019/2020 academic year), thereby supporting our second hypothesis.
Table 8
Effect of participation and self-assessment activity marks on students’ academic performance
 
                                      Model 1: Academic performance        Model 2: Academic performance
                                      2018/2019 (pre-COVID)                2019/2020 (COVID)
                                      Standardized coefficient    VIF      Standardized coefficient    VIF
Access to resources                   0.14**                      1.00     0.04                        1.04
Previous experience with the course   −0.22***                    1.10     −0.29***                    1.11
Student gender                        0.16***                     1.03     0.17***                     1.01
Mark in self-assessment               0.32***                     1.10     0.46***                     1.08
Participation in self-assessment      0.32***                     1.03     0.18***                     1.13
F                                     20.34                                31.80
R2                                    0.38                                 0.30
***p < 0.01; **p < 0.05; and *p < 0.10

5 Conclusions and Discussion

The COVID-19 pandemic posed a major challenge to all institutions and degree courses in the education sector, especially universities (Gopal et al., 2021; Hosseini et al., 2021; Mali & Lim, 2021). In a very short space of time, conventional onsite teaching systems were forced to adapt to an online model of teaching. The widely implemented ERT model (as opposed to a real online model) strongly influenced the teaching–learning process, with potential effects on student academic performance. Considering this context, we examine what impact changing from an onsite model to an ERT model had on student academic performance and what role self-assessment activities played in this linkage.
Using a sample of 505 students enrolled on the course in Work Organization at the University of Murcia, our findings show that implementing an ERT model after the outbreak of the COVID-19 pandemic had a negative impact on student academic performance. As expected, since an ERT model is far from being a true online teaching model (Aldhahi et al., 2022; Hodges et al., 2020), the teaching–learning process proved to be one of trial and error—with negative consequences (Aguilar-Gordón, 2020). While the literature reports generally positive effects of implementing planned online teaching–learning models in higher education environments, our results add the nuance that such implementation must be gradual and carefully planned if it is to be successful, and should not involve drastic changes (Gopal et al., 2021). The lack of sufficient content planning, of adaptation of methodologies and activities, and of adjustment of assessment systems to this new, abruptly imposed context may have harmed the teaching–learning process and adversely affected students’ academic performance (Aldhahi et al., 2022; García-Aretio, 2021; García-Peñalvo et al., 2020; Zubillaga & Gortazar, 2020).
In addition, our findings provide evidence concerning the positive influence of self-assessment activities on students’ academic performance—both in onsite and online environments. In line with previous literature that had already reported the positive impact of self-assessment activities in face-to-face, blended, and online teaching environments (Carrasco-Hernández et al., 2020; Fernández-Arias et al., 2020; Saleem et al., 2022), our study extends this positive influence to an ERT context. In order to partially counteract the negative effects that an abrupt change from a face-to-face to an online learning model has on students’ academic performance, implementing appropriate assessment tools becomes a key aspect in terms of generating student self-motivation (Jiménez-Galán et al., 2021) that leads to better student performance and grades. In addition, self-assessment activities—commonly associated with active methodologies, which place special emphasis on students as the center of the teaching–learning process (Cruz-Núñez & Quiñones-Urquijo, 2012; Díaz-Mendoza et al., 2022)—are more flexible when it comes to adjusting to different learning environments. This is consistent with the notion that this kind of activity is closer to online teaching-assessment contexts than other activities applied in the pre-COVID period. Moreover, Kahoot has a “competitive option” (comparable to online serious games), which encourages the participation of students who—through these games and often without realizing it—end up improving both their learning process and their academic performance, which was especially helpful during the pandemic. We thus obtain robust results showing that the negative effects of the ERT model on students’ performance are reduced when these self-assessment activities are used regularly.

5.1 Practical Recommendations

Based on our findings, several practical implications emerge from this study. First, even though teachers and students have all striven to adapt to the new online format, there is a clear need to improve ICT-related skills. This calls into question university compliance with the principles of the Bologna Declaration (1999), whose principal objective was to ensure that student learning should not be based solely on acquiring theoretical knowledge but should also include a command and development of skills and abilities. Although universities generally provide students with an appropriate level of knowledge content, there is room for improvement in terms of endowing students with the skills demanded by business as well as by society itself (Vinichenko et al., 2017). Self-assessment activities may go some way towards providing students with part of the skills required to help them become autonomous and flexible so that they can better adapt to unforeseen contingencies such as those to emerge during the pandemic.
Second, online teaching has highlighted a series of aspects or gaps to be taken into consideration when seeking to make future improvements (Fernández-Enguita, 2020): (a) the so-called “use gap”, which refers to time, since a household in which several students taking online lessons live together may not have the financial resources to enable each of them to have their own device for personal use at the same time, thus forcing them to share; (b) the “access gap”, since it is not always possible to have electronic devices of sufficient quality or to have a good enough quality internet connection (Avanesian et al., 2021); and (c) the “skills gap”, which is linked to the lack of digital and ICT skills on the part of both teachers and students. It is essential to try to neutralize these gaps, since learning losses are exacerbated as inequalities increase (Ardington et al., 2021). In this way, governments and public institutions must promote measures aimed at reducing such gaps in order to take advantage of the benefits derived from online environments—bearing in mind that some of the measures implemented during the pandemic will remain in the years ahead.
Third, there is also an urgent need to regulate offsite assessment processes (García-Peñalvo, 2020b). While the shift in teaching has entailed a deep-rooted transformation and has posed a major challenge, designing examinations for an offsite context remains a vital area to be addressed, and entails the need for assessment to be analyzed and planned to a greater degree, above all given onsite universities’ lack of institutional experience. As a result, new planning is required for a new teaching–learning model to adapt to a scenario in which ICT will not merely offer support tools for face-to-face lessons but will become a key element in the development of teaching (Marín-Díaz et al., 2022). As a consequence, both teachers and students will need to improve their digital skills in order to adapt to this structural change (García-Planas & Taberna, 2021; Torrecillas-Bautista, 2020).
Finally, students with special educational needs—who tend to need specific plans and adaptations in their teaching–learning process (García-Peñalvo et al., 2020)—should be taken into consideration more. During the COVID-19 pandemic—and because an ERT model was implemented—their performance was particularly affected since there was no real possibility of adequately adapting to their individual needs. In this way—and in order to satisfactorily deal with any unexpected situation that may arise in the future—academic authorities should design more flexible and proactive plans for students with special educational needs so that they are not disadvantaged when faced with any new abrupt changes in their teaching–learning.

5.2 Limitations and Lines for Future Research

This article is not without limitations, which in turn suggest some lines of future research. First, this article shows that students’ academic performance during the pandemic was lower than in the face-to-face context (prior to COVID-19). However, it is necessary to gauge the scale of the learning losses suffered by students at different educational levels in the medium and long term. Related to this, it is also vital to ascertain in which contexts and under which factors learning losses are more pronounced. Second, this study focuses on a specific university and on a particular course. Bearing in mind that not all courses are equal, and that teaching requirements and strategies may differ for applied courses compared to theory courses, it would be interesting to extend this study to other courses (both practical and theoretical). Moreover, future studies should explore how students’ academic performance evolved over the years after COVID-19 by considering other contexts, other courses, and other assessment tools. All of this would increase the generalizability of our results. Third, this study employs a convenience sampling technique whose main drawback is that the results might lack generalizability due to sample bias. In any case, and as stated by Emerson (2021), “the use of a larger sample through convenience sampling allows for slightly greater generalization”. For this reason, our study uses a large sample to partially offset the drawbacks of convenience sampling. In the future, it would be advisable to carry out research on a wider group of courses, and from different fields of knowledge, so as to counterbalance this limitation. Fourth, this paper measures academic performance through the grade obtained by students in the final exam for the course. However, future studies could consider other dimensions (e.g., class attendance, student satisfaction, professor satisfaction, adequacy of the contents, external evaluations), which could enrich the evidence obtained. Lastly, further inquiry is necessary to determine whether—over time—improvements have been made in the design and implementation of methods and tools for the delicate period the world was forced to endure and which, as seen, has negatively impacted university education.

5.3 Concluding Remarks

Drawing on a sample of 505 students enrolled on the course in Work Organization at the University of Murcia in Spain, this study finds that the marks obtained by students during the 2018/2019 academic year—characterized by face-to-face teaching—were significantly higher than those obtained during the 2019/2020 academic year—when teaching was delivered online. Nevertheless, both in onsite and online teaching contexts, findings indicate that engaging in self-assessment activities such as those performed through Kahoot—in which ICT-related skills were already being applied—helps to enhance student academic performance. Thus, despite the negative impact that implementing an ERT model has on students’ academic performance, online self-assessment activities do tend to improve said performance, with this online environment playing a positive role if supported by adequate planning of self-assessment activities during the course. Overall, this research contributes to the literature by exploring how self-assessment activities helped to offset the declining student academic performance caused by the abrupt transition from onsite to online teaching during the COVID-19 pandemic.

Declarations

Competing interests

The authors have no competing interests to declare that are relevant to the content of this article.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix

Appendix A

Screenshots showing an example of a self-assessment activity completed on the Kahoot learning platform.
Footnotes
1
Considering the nature and objectives of this research, we obtained the prior individual agreement of the students to participate in this study. They gave us permission to use their personal data (name, surname, class group, and previous experience on the course), with our express commitment to use these data only in anonymized and aggregated form.
 
Literature
Aguilar-Gordón, F. D. R. (2020). From face-to-face learning to virtual learning in pandemic times. Estudios Pedagógicos, 3, 213–223.
Aldhahi, M. I., Alqahtani, A. S., Baattaiah, B. A., & Al-Mohammed, H. I. (2022). Exploring the relationship between students’ learning satisfaction and self-efficacy during the emergency transition to remote learning amid the coronavirus pandemic: A cross-sectional study. Education and Information Technologies, 27(1), 1323–1340. https://doi.org/10.1007/s10639-021-10644-7
Azlan, A. A., Hamzah, M. R., Sern, T. J., Ayub, S. H., & Mohamad, E. (2020). Public knowledge, attitudes and practices towards COVID-19: A cross-sectional study in Malaysia. PLoS ONE, 15(5), e0233668.
Bosco-Paniagua, M. A., & Rodríguez-Gómez, D. (2008). Docencia virtual y aprendizaje autónomo: Algunas contribuciones al Espacio Europeo de Educación Superior. RIED: Revista Iberoamericana de Educación a Distancia, 11(1), 157–182.
Cabero-Almenara, J. (2006). Bases pedagógicas del e-learning. Revista de Universidad y Sociedad Del Conocimiento, 3(1), 1–10.
Carrasco-Hernández, A. J., Lozano-Reina, G., Lucas-Pérez, M. E., Madrid-Garre, M. F., & Sánchez-Marín, G. (2020). Developing new learning tools in the classroom: The Kahoot experience. Journal of Management and Business Education, 3(3), 214–235.
Carrillo, M. V. (2021). Educational platforms and digital tools for learning. Vida Científica Boletín Científico De La Escuela Preparatoria, 9(18), 9–12.
Castells, M. (2006). La sociedad red: Una visión global. Alianza Ed.
Chien-Hung, L., Yu-Chang, L., Bin-Shyan, J., & Yen-Teh, H. (2014). Adding social elements to game-based learning. International Journal of Emerging Technologies in Learning, 9(3), 12–15.
Clayton, K., Blumberg, F., & Auld, D. P. (2010). The relationship between motivation, learning strategies and choice of environment whether traditional or including an online component. British Journal of Educational Technology, 41(3), 349–364.
Cruz-Núñez, F., & Quiñones-Urquijo, A. (2012). Importancia de la evaluación y autoevaluación en el rendimiento académico. Zona Próxima: Revista Del Instituto de Estudios Superiores En Educación, 16, 96–104.
De Vincenzi, A. (2020). Del aula presencial al aula virtual universitaria en contexto de pandemia de COVID-19. Avances de una experiencia universitaria en carreras presenciales adaptadas a la modalidad virtual. Debate Universitario, 8(16), 67–71.
Denscombe, G. O. (2003). The good research guide for small-scale social research. Open University Press.
Díaz-Mendoza, A. C., Meón-Izco, Á. M., Azcona-Ciriza, E., Ballester-Miquel, L., & González-Urteaga, A. (2022). Ejercicios de autoevaluación y mejora del rendimiento académico. In J. M. Ribera Puchades & M. Sáenz-de-Jubera-Ocón (Eds.), La innovación como motor para la transformación de la enseñanza universitaria (pp. 35–40). Universidad de La Rioja.
Elliot, A. (2005). A conceptual history of the achievement goal construct. In A. Elliot & C. Dweck (Eds.), Handbook of competence and motivation (pp. 52–72). Guilford Press.
Emerson, R. W. (2021). Convenience sampling revisited: Embracing its limitations through thoughtful study design. Journal of Visual Impairment and Blindness, 115(1), 76–77.
Fainholc, B. (2008). De cómo las TICs podrían colaborar en la innovación socio-tecnológico-educativa en la formación superior y universitaria presencial. RIED: Revista Iberoamericana de Educación a Distancia, 11(1), 53–79.
Fernández-Arias, P., Ordóñez-Olmedo, E., Vergara-Rodríguez, D., & Gómez-Vallecillo, A. I. (2020). La gamificación como técnica de adquisición de competencias sociales. Prisma Social: Revista de Investigación Social, 31, 388–409.
García-Aretio, L. (2021). COVID-19 y educación a distancia digital: preconfinamiento, confinamiento y posconfinamiento. RIED: Revista Iberoamericana de Educación a Distancia, 24(1), 9–32.
García-Beltrán, A., Martínez, R., Jaén, J. A., & Tapia, S. (2016). La autoevaluación como actividad docente en entornos virtuales de aprendizaje/enseñanza. Revista de Educación a Distancia, 50, 1–11.
García-Peñalvo, F. J. (2020b). Modelo de referencia para la enseñanza no presencial en universidades presenciales. Campus Virtuales, 9(1), 41–56.
García-Planas, M. I., & Taberna, J. (2021). Transición de la docencia presencial a la no presencial en la UPC durante la pandemia del COVID-19. IJERI: International Journal of Educational Research and Innovation, 15, 177–187.
Gimeno-Santos, M., & Gallego-Matas, S. (2007). La autoevaluación de las competencias básicas del estudiante de psicología. Revista de Psicodidáctica, 12(1), 7–27.
Granados-Romero, J. F., Vargas-Pérez, C. V., & Vargas-Pérez, R. A. (2022). La formación de profesionales competentes e innovadores mediante el uso de metodologías activas. Revista Universidad y Sociedad, 12(1), 343–349.
Grande-de-Prado, M., García-Peñalvo, F. J., Corell, A., & Abella-García, V. (2021). Evaluación en Educación Superior durante la pandemia de la COVID-19. Campus Virtuales, 1(10), 49–58.
Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2010). Multivariate data analysis: A global perspective (7th ed.). Pearson Education.
Hair, J. F., Ringle, C. M., & Sarstedt, M. (2013). Partial least squares structural equation modeling: Rigorous applications, better results and higher acceptance. Long Range Planning, 46(1–2), 1–12.
Iwamoto, D. H., Hargis, J., Taitano, E. J., & Vuong, K. (2017). Analyzing the efficacy of the testing effect using Kahoot on student performance. Turkish Online Journal of Distance Education, 18(2), 80–93.
Jiménez-Galán, Y. I., Hernández-Jaime, J., & Rodríguez-Flores, E. (2021). Online education and learning assessment: From face-to-face to virtual. RIDE: Revista Iberoamericana Para La Investigación y El Desarrollo Educativo, 12(23), e259.
Kuhfeld, M., & Tarasawa, B. (2020). The COVID-19 slide: What summer learning loss can tell us about the potential impact of school closures on student academic achievement. NWEA white paper.
Laviña, J., & Mengual, L. (2008). Libro Blanco de la Universidad Digital 2010. Ariel.
López-Gutiérrez, C. J., Stuart-Rivero, A. J., & Sánchez-Salmerón, F. (2022). Metodologías activas en la cultura física. Retos y oportunidades. Revista Universidad y Sociedad, 14(3), 153–160.
López-Pérez, M. V., Pérez-López, M. C., & Rodríguez-Ariza, L. (2011). Blended learning in higher education: Students’ perceptions and their relation to outcomes. Computers and Education, 56(1), 818–826.
Mula-Falcón, J., Moya-Roselo, I., & Ruiz-Ariza, A. (2022). The active methodology of gamification to improve motivation and academic performance in educational context: A meta-analysis. Review of European Studies, 14, 32–46.
Rodríguez-Gómez, G., Ibarra-Sáiz, M. S., & Gómez-Ruiz, M. Á. (2011). E-Autoevaluación en la universidad: Un reto para profesores y estudiantes. Revista de Educación, 356, 401–430.
Sabbah, S. S. (2015). The effect of college students’ self-generated computerized mind mapping on their reading achievement. International Journal of Education and Development Using Information and Communication Technology, 11(3), 4–36.
Sánchez-Marín, G., Lucas-Pérez, M. E., Carrasco-Hernández, A. J., Lozano-Reina, G., & Nicolás-Martínez, C. (2018). The influence of self-assessment activities on student learning outcomes. Journal of Management and Business Education, 1(1), 28–38.
Soler, M. G., Cárdenas, F. A., & Hernández-Pina, F. (2018). Teaching and learning approaches: Theoretical perspectives to develop research in science education. Ciência and Educação, 24(4), 993–1012.
Sukri, S. I. A., Yunus, M. M., Sevakumaran, D., Rajeswari-Chandara-Kumaran, N. M., & Badusah, J. (2020). Kahoot! does wonders: English articles. International Journal of Academic Research in Progressive Education and Development, 9(1), 360–371.
The European Higher Education Area. Bologna Declaration. (1999). Joint declaration of the European Ministers of Education.
Torrecillas-Bautista, C. (2020). El reto de la docencia online para las universidades públicas españolas ante la pandemia del Covid-19. ICEI Papers COVID-19, 16, 1–4.
Vinichenko, M. V., Ridho, M. V., Kirillov, A. V., Makuchikin, S. A., & Melnichuk, A. V. (2017). Development of skills management in the system management of talents. Modern Journal of Language Teaching Methods, 7(9), 50–57.
Yoo, B., & Donthu, N. (2001). Developing a scale to measure the perceived quality of an Internet shopping site (SiteQual). Quarterly Journal of Electronic Commerce, 2(1), 31–46.
Metadata
Title
Impact of an Emergency Remote Teaching Model on Students’ Academic Performance During COVID-19
Authors
Antonio Carrasco-Hernández
Gabriel Lozano-Reina
María Encarnación Lucas-Pérez
María Feliz Madrid-Garre
Gregorio Sánchez-Marín
Publication date
07-07-2023
Publisher
Springer Netherlands
Published in
Technology, Knowledge and Learning / Issue 1/2024
Print ISSN: 2211-1662
Electronic ISSN: 2211-1670
DOI
https://doi.org/10.1007/s10758-023-09665-7
