Article

The Relationship between Learning Styles and Academic Performance: Consistency among Multiple Assessment Methods in Psychology and Education Students

1 Department of Psychology, Universidad Loyola Andalucía, Avda. de las Universidades s/n, 41704 Dos Hermanas, Spain
2 Department of Psychology, Universidad de Jaén, Campus de Las Lagunillas s/n, 23071 Jaén, Spain
3 Research Group (HUM604), Development of Lifestyles in the Life Cycle and Health Promotion, University of Huelva, Campus de El Carmen s/n, 21007 Huelva, Spain
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(6), 3341; https://doi.org/10.3390/su13063341
Submission received: 29 January 2021 / Revised: 9 March 2021 / Accepted: 15 March 2021 / Published: 18 March 2021

Abstract

Universities strive to ensure quality education focused on the diversity of the student body. According to experiential learning theory, students display different learning preferences. This study has a three-fold objective: to compare learning styles based on personal and educational variables, to analyze the association between learning styles, the level of academic performance, and consistency of performance in four assessment methods, and to examine the influence of learning dimensions in students with medium-high performance in the assessment methods. An interdisciplinary approach was designed involving 289 psychology, early childhood education and primary education students at two universities in Spain. The Learning Style Inventory was used to assess learning styles and dimensions. The assessment methods used in the developmental psychology course included the following question formats: multiple-choice, short answer, creation-elaboration and an elaboration question on the relationship between theory and practice. Univariate analysis, multivariate analysis, and binomial logistic models were computed. The results reveal Psychology students to be more assimilative (theoretical and abstract), while early childhood and primary education students were evenly distributed among styles and were more divergent and convergent (practical) in absolute terms. In addition, high scores in perception (abstract conceptualization) were associated with a high level of performance on the multiple-choice tests and the elaboration question on the relationship between theory and practice. Abstract conceptualization was also associated with medium-high performance in all assessment methods and this variable predicted consistent high performance, independent of the assessment method. This study highlights the importance of promoting abstract conceptualization. Recommendations for enhancing this learning dimension are presented.

1. Introduction

Achieving teaching excellence in higher education is one of the priority areas of the European University Teaching Plans [1]. Specifically, the analysis of teaching-learning processes has emerged as one of the most relevant topics in educational psychology. Evidence suggests that not all individuals learn in the same way [2,3]. Consequently, to ensure education based on sustainable development, we must consider individual differences, as well as develop potential educational interventions for students who display learning difficulties. To do this, studies should be designed to provide insight into the learning processes of students in order to develop proposals for student-centered teaching. The match between teaching methods and student learning style can encourage meaningful learning of content and academic performance [4,5]. Similarly, to evaluate performance, it is important to include a variety of assessment methods to provide students with the opportunity to demonstrate their achievements through different types of tests. Student learning style, teaching practices, and performance assessment form an educational triangle in which the fit among dimensions will foster excellence in educational processes.
In line with promoting teaching excellence [6], efforts have been made in recent years to include the sustainable development approach to education within universities. Among the many objectives in the 2030 United Nations Agenda is the promotion of equitable and quality education focused on the strengths and needs of students to promote meaningful learning. In addition, sustainable education emphasizes empowering students in their professional roles to be responsive and make ethical contributions to society [7,8]. In pursuit of these objectives, this study concentrates on students, their different learning styles, and their specific and overall performance in different assessment methods. The purpose of this study is to examine two key educational components: the learning styles of psychology and education students, differentiating between primary education (Prim. Ed.) and early childhood education (E.C. Ed.) students, and their performance in different assessment methods in order to identify the strengths and specific needs of university students.

1.1. Learning Styles: Experiential Learning Theory

The analysis and assessment of learning styles have been viewed through multiple theories [9,10,11,12,13]. Learning styles can generally be defined as the different, relatively stable cognitive, affective, and physiological preferences used by the learner to approach and internalize content or to respond to a learning situation in formal, non-formal, and informal contexts [11,13]. Focusing on education, when learners are involved in instructional settings, they use certain skills to manage and interpret the educational process [14]. Both personal dimensions such as extroversion or introversion and cognitive dimensions including perception, representation, processing, or understanding of content have been used to determine learning style [11,15,16].
According to experiential learning theory [15], learning is the result of first grasping then transforming an experience. The diversity of responses in this process leads to a range of learning styles. Experiential learning theory is based on constructionist principles of education and conceptualizes learning as a holistic process involving the learner as the main protagonist of teaching-learning situations. Through this process, the learner discovers, creates and builds his or her knowledge and experiences guided by the expert [2]. In education, the teacher must understand the student as the sum of his or her cognitive skills, which can be trained, and the personal characteristics that affect learning styles such as how the individual feels, thinks, perceives the world and acts. Indeed, experiential learning theory holds that the learning process is cyclical or spiraling, in that during this process, the learner feels, reflects, thinks, and acts. However, each person learns differently due to the differential importance the learner attaches to each component [17].
Within this approach, learning styles are the result of the combination of two fundamental dimensions: the perception of a specific material and the processing of the information or experience by the learner [17]. On the one hand, perception is defined as the way the learner understands contents and experiences. Learners may show a tendency to focus on concrete sensations, data, experiences, and examples (concrete experience, CE), or they may prefer abstract conceptualization (AC) centered on the ability to understand theoretical relationships and classifications without needing concrete elements of information or the here and now. These two extremes (CE versus AC) are viewed as a continuum. On the other hand, it is believed that individuals use different strategies to transform, incorporate, and give meaning to perceived content [17,18]. Thus, information processing arises from the balance between the individual’s preference to perform active experimentation (AE) with contents and experiences or to internalize and transform contents through reflective observation (RO), considering different perspectives [2,19]. Similar to perception, information and experience processing is not a dichotomous measure, but a continuum between AE and RO. These learning dimensions form the basis of Experiential Learning Theory, identifying four learning styles: accommodating, diverging, assimilating, and converging.
Learners with an accommodating style (CE and AE) are the most practical and tend to focus on specific aspects of content and experience that are processed through AE. The diverging style (CE and RO) is characterized by a predilection for content and concrete situations that are transformed and processed through cognitive analysis from multiple perspectives. By contrast, the assimilating style (AC and RO), the most theoretical, involves a preference for abstract concepts such as the study of psychological theories learned through RO without focusing on practical value. Finally, convergent learners (AC and CE) also show a preference for abstract contents that gain meaning through active and practical experimentation following a deductive method [17].
Learning styles have been associated with a wide variety of dimensions, some of the most common being personal and educational variables [15]. First, with regard to gender, the evidence is inconsistent. Some of Kolb’s original studies found that men have a more abstract perception than women, but that, nevertheless, there are no differences associated with processing [2]. These results have not been confirmed by subsequent research, which suggests that the difference in AC is due to the interaction between variables and cannot be explained by sex alone [2,20]. Second, the influence of age on learning styles has been widely studied. Overall, the evidence indicates that learning styles are stable at different ages, although a tendency towards AC is found in middle age [2]. Despite the stability in preferences, learning styles can be modified by training the different dimensions, for example, through training in AC, using a more concrete perception, promoting active experiences in learning contexts or providing spaces for reflection and analysis in information processing [8,21,22].
Experiential learning theory emerged within the formal context, applied in higher education with the aim of promoting teaching-learning processes centered on the student and his or her characteristics. Learning styles have been studied in disciplines including engineering, medicine, nursing, physiotherapy, humanities, business, language, communication, psychology, and education, among others [2,20,22,23,24,25]. In education and psychology, reference data from experiential learning theory show that education students tend to exhibit an accommodating, i.e., practical, style with perception focused on concrete experiences and active information processing, while psychology students exhibit a divergent style with higher scores on the CE and RO dimensions [2,17]. When comparing the learning styles of psychology and education students with the preferences of students in other degrees, on the one hand, we find that education students are the most accommodating (very high scores in active processing) followed by communication students [2]. On the other hand, psychology students, who demonstrate divergent styles, show high scores in reflective processing as do literature and language students. Additionally, other degrees, such as engineering, mathematics, or medicine, show completely opposite scores with a preference for the convergent style [2,20]. Thus, although considerable similarities exist between students within the same degree, ensuring that teachers are familiar with the most common learning styles in their class and specifically with individual preferences can facilitate the design of teaching-learning processes that maximize the potential of each student.
To summarize, recognizing a student’s learning style can have important educational implications and provide guidance to optimize teaching-learning processes [3,26]. Teachers can tailor their teaching practices to maximize strengths and meet student needs [27], and students can develop meta-cognitive processes and greater control over their learning and performance.

1.2. Assessment Method, Academic Performance and Learning Dimensions

Overall, academic performance is the extent to which a student has acquired the planned knowledge in a teaching-learning setting that students routinely demonstrate in curriculum-based assessments [28]. Collaborative group projects, individual assignments or tests are some of the most commonly used modalities to measure student learning [29]. In undergraduate degrees in psychology and education, various assessment methods are frequently combined, although it is common to have a final performance examination in almost all courses. Nonetheless, objectively assessing the knowledge acquired continues to be a challenge in current teaching plans [30]. The use of closed or open-ended questions is frequent in disciplines such as psychology and education. Closed questions may include multiple choice questions (MCQ) or short questions (although this typology may also occasionally include open-ended questions). The MCQ consists of a theoretical or practical question and a limited number of related answer items. Students must choose the most appropriate answer from the given options. The MCQ requires students to perform relationship and discrimination processes between response alternatives to make the final selection of an option. In this assessment method, the student’s processes of recognition memory are essential, with the question statement acting as the instruction for recognition, while the incorrect answer alternatives act as distractors [31,32]. Short questions (SQ) can be of different types: single-best answer, concept identification, briefly describing a theory, or classifying a set of constructs based on common aspects. Some SQ involve briefly writing a theory, an example, or a definition, or naming a theory from the definition. Unlike MCQs, this assessment method enhances recall processes and knowledge application, as opposed to cognitive processes of recognition with distractors [31].
Complementarily, open-ended questions are another form of assessment. Open-ended questions generally require different simultaneous processes such as access to recall memory, use of critical thinking, making abstract associations, planning, content synthesis and writing [15,33]. Linking concepts or theories, solving a practical or clinical case, proposing an intervention or making associations between theory and practice and vice versa are some of the most common types of open-ended questions. One of the challenges of open-ended assessment questions is identifying objective correction criteria since response variability is greater than in closed questions [34]. Nevertheless, open-ended questions allow the student to initiate cognitive elaboration processes, construction of content, and appear to measure deeper learning than the MCQ [33,35].
The nature of each assessment method has implications for teaching-learning processes. Student performance may vary depending on the evaluation method used. Some studies suggest that students find open-ended tests more difficult because they are forced to use more elaborate and in-depth learning strategies than with MCQ tests [30,33]. Urda and Ramocki [30] also found that students prefer memory-based assessment methods to tests of analysis, creativity, or practical applications. There is controversy in the literature as to whether different assessment tests measure the same knowledge. Some reviews highlight the consistency of performance among assessment methods [36,37]. However, other authors point out that different assessment methods involve qualitatively diverse learning methods. The evidence suggests that in closed questions, students tend towards using surface learning strategies, while in open-ended questions they adopt a deep learning approach. Thus, performance will be influenced by learning strategies, learning preference, and assessment method [33,38].
Student performance is not only determined by the assessment method. Studies and meta-analyses in the university context suggest that performance is affected by a large number of variables such as emotional attention, motivational self-efficacy, prior academic performance, student motivation, goal-directed learning strategies, and teaching style and practices such as the use of feedback [39,40]. Evidence also shows an association between learning styles and performance [4,17,22,41]. Several authors have studied this relationship in multiple assessment methods [17,21,42,43]. Just as students approach learning tasks with certain perception and processing preferences, they will also approach assessment tasks with certain predilections. Therefore, unquestionably, student learning styles may involve a preference for certain assessment methods that lead to differences in performance [21,42,44].
Regarding learning dimensions and styles, when students prefer CE and practical processing (accommodating style), they typically obtain better results primarily in applied tasks unrelated to theory. These students are able to distance themselves from analysis and theory and focus on practical activities based on concrete aspects of the contents [17]. Students with an accommodating style usually perform well in cooperative work, project-based teaching methodology, and assessment methods that are not mediated by theoretical considerations, such as creating applied activities or SQ that do not involve comparisons between concepts. Nonetheless, some evidence points to poorer academic performance in students with a tendency towards active learning [21,44]. In fact, a study conducted in Spain with engineering students showed that students with a tendency towards active learning show poorer results compared to more theoretical students [22]. Alternatively, students with divergent styles (preference for CE and RO) are attracted to group work where multiple perspectives can be analyzed for the construction of content. These students also prefer going deeper into contents while being creative [17]. Concerning assessment methods, these students analyze and process information on two levels: concrete perception and reflective processing. They do well on performance tests that involve connecting the practical with the theoretical, such as elaboration tests and linking content to practice and theory, or MCQ tests, which require identifying the correct answer from different options.
Continuing with learning dimensions and styles, students who are more drawn to theory, tending towards AC and RO (assimilating style), typically perform better in theoretical exercises far from concrete aspects and active experience. These students therefore prefer teaching-learning processes focused on analyzing readings, exploring contents, or creating theoretical models [17]. Evidence suggests that a reflective learning style is associated with high performance [21,22]. Despite the more theoretical profile of assimilative students, some studies have shown that they tend to perform well on tests that involve linking practical aspects to theory, as well as in assessment methods that involve generating new concepts [42,45]. MCQs requiring the student to reflect on and compare different theories can enhance performance in these students. Indeed, Gargallo and colleagues [22] compared students from various degrees with excellent and average results and found that high-performing students prefer a reflective and theoretical style, while low performing students favor more active styles.
Finally, the learner with a convergent style (preference for AC and AE) more easily develops practical implications that emerge from theory. These learners are typically decisive and often propose practical solutions based on theory. These students apply theory as a basis for solving practical problems [17]. Some studies suggest that students with higher scores on abstract concepts usually perform better on MCQ tests [17,42]. Indeed, AC and the convergent style have been found to be dimensions commonly associated with high performance on various assessment tests [43,46]. Accordingly, students with a clearly abstract perception and active processing perform on two levels of analysis. They are initially more theoretical and subsequently more practical, and can therefore approach open-ended question assessments with potentially positive results that allow the student to draw relationships between theory and practice.
In summary, although the relationship between performance and learning styles should be further explored, the integration of evidence suggests the importance of AC and RO as dimensions associated with high performance in complex assessment methods such as MCQ or open-ended questions.

1.3. Present Study

How university students learn undoubtedly has a strong influence on teaching-learning processes. This study intends to go one step further to promote educational development by considering individual differences among students. Students enrolled in Education degrees have traditionally been studied as a single group [2,17]. In this study, however, we explore separately the learning dimensions of two specialized education degrees: Prim. Ed. and E.C. Ed., as well as psychology. This study is one of the first attempts to provide insight into the characteristics of E.C. Ed. students. We also focus on the association between assessment methods, learning styles and performance. Most research has associated learning styles and performance as measured by a single assessment test [4,22]. This study aims to conceptualize the possible implications of using different assessment methods involving open-ended and closed-ended questions. In addition, this research endeavors to go beyond the relationship between learning styles and performance by also evaluating the congruency between performance in different assessment methods and learning preferences. In line with the goal of achieving a university education based on excellence, tailored to the needs of students and ensuring equal opportunities [1,6], the purpose of this study is to identify the learning preferences of high-performing students, as well as to detect dimensions that may improve the results of lower-performing students in order to provide recommendations to teachers.
This study has a threefold objective: (a) to describe and compare learning dimensions (CE, AC, AE, RO, perception, and processing) and learning styles (accommodating, diverging, assimilating, and converging) according to gender, age, and university degree: psychology, E.C. Ed. and Prim. Ed.; (b) for each assessment method (MCQ, SQ, creation-elaboration questions and elaboration question on the relationship between theory and practice), examine the association between learning dimensions and styles, academic achievement level (high, medium and low) and consistency of performance among the assessment methods; and (c) explore the influence of the learning dimensions on students in terms of the level of performance.
In line with previous evidence, with respect to our first objective, we hypothesize that gender and age will not influence learning dimensions, while we anticipate that education students will display a tendency towards CE and AE, and psychology students towards CE and RO. Due to a lack of previous data, we cannot propose possible differences between Prim. Ed. and E.C. Ed. students. With regard to our second objective, we hypothesize that there will be differences between learning dimensions and styles and level of performance in each assessment method. In methods such as MCQ or the elaboration question on the relationship between theory and practice, we believe that high-performing students are likely to show a preference for AC and RO. We also anticipate that in purely practical methods, such as creation-elaboration, there will be no relationship between learning dimensions and performance. In addition, given the characteristics of learning dimensions and styles, we believe there may be inconsistencies in performance depending on the assessment methods. Specifically, we hypothesize that students with higher scores in AC and RO will be more consistent and demonstrate successful academic performance in various assessment methods. Finally, for the third objective, we propose that the perception dimension, which integrates AC, will have the ability to predict consistent successful performance, independent of the assessment methods used.

2. Materials and Methods

2.1. Participants

The study was carried out during the 2017/2018 and 2018/2019 academic years at two different universities in southern Spain, the University of Seville (US) and the University of Jaen. A total of 289 undergraduate students enrolled in degrees in psychology (n = 64), early childhood education (n = 77), and primary education (n = 148) were selected to participate in the research through purposive, non-probability sampling. These students were taking a developmental psychology course common to the three degrees. All students were in their first or second year of university. A total of seven groups were assessed: two psychology groups, two E.C. Ed. groups and three Prim. Ed. groups. Women made up 80.62% of the sample and men 19.38%. The mean age of the sample was 20.53 years (SD = 3.20).

2.2. Instruments

Sociodemographic profile. The questionnaire included information related to sociodemographic dimensions such as sex and age, and to educational dimensions including degree and year of university enrollment.
Learning styles and dimensions. Version 3.1 of the Learning Style Inventory (LSI-III) [17] was used, adapted and validated in Spanish [47]. This instrument comprises 12 items with four response alternatives. Respondents must rank their answers in an ipsative manner (from 1 to 4) for each statement. Each response option corresponds to a different learning preference: AE (α = 0.70), RO (α = 0.68), AC (α = 0.73), and CE (α = 0.69). Using the total score for each dimension, two bipolar dimensions are obtained: perception (the CE score is subtracted from the AC score, i.e., AC − CE) and processing (AE − RO). The combination of these two learning dimensions results in four learning styles. According to the cut points recommended by the authors of the scale [17], the following learning styles emerge: accommodating (perception ≤ 7; processing ≥ 7), diverging (perception ≤ 7; processing ≤ 6), converging (perception ≥ 8; processing ≥ 7), and assimilating (perception ≥ 8; processing ≤ 6).
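To make the scoring rules concrete, the following minimal Python sketch (not the authors' code; the class and function names are illustrative) derives the two bipolar dimensions from the LSI-III subscale totals and applies the cut points listed above:

```python
# Illustrative sketch of the LSI-III scoring described above:
# perception = AC - CE, processing = AE - RO, then style assignment
# using the published cut points.
from dataclasses import dataclass

@dataclass
class LSIScores:
    ce: int  # Concrete Experience subscale total
    ro: int  # Reflective Observation subscale total
    ac: int  # Abstract Conceptualization subscale total
    ae: int  # Active Experimentation subscale total

def classify_style(s: LSIScores) -> dict:
    perception = s.ac - s.ce   # high values indicate a preference for AC
    processing = s.ae - s.ro   # high values indicate a preference for AE
    if perception >= 8:
        style = "converging" if processing >= 7 else "assimilating"
    else:  # perception <= 7
        style = "accommodating" if processing >= 7 else "diverging"
    return {"perception": perception, "processing": processing, "style": style}

# Example: a student with an abstract, reflective profile
print(classify_style(LSIScores(ce=22, ro=33, ac=35, ae=30)))
# {'perception': 13, 'processing': -3, 'style': 'assimilating'}
```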
Academic performance. The final content assessment comprised an examination to quantify the academic performance of the students. Different evaluation methods were integrated in this final assessment. For each degree, the same content was evaluated through the following assessment methods:
  • Multiple choice questions (MCQ). The contents were evaluated through multiple choice tests. Using this closed question method, students had to identify a single valid response among four alternatives. The following is an example of an MCQ used in the assessment: “The increase in explicit memory in children is associated with: (a) the autonomy of the child when learning to walk and handle objects; (b) an increase in the density of synapses at four months; (c) an increase in the density of synapses at eight months (correct answer); (d) the importance of holophrases in children to store memory”. A total of 30 MCQ were used with a final score of 0 to 10.
  • Short questions (SQ). This method used short and closed questions. In this test, the student is given a statement in order to identify a concept. An example is given below: “Identify the developmental theory that states that the zone of proximal development refers to the distance between the actual level of development and the level of potential development”. In this case, the correct answer would be: Vygotsky’s Sociocultural Theory. A total of 10 SQ were used with a final score ranging from 0 to 10.
  • Creation-elaboration questions (CEQ). In this open-ended question, students must create a practical activity based on the contents studied in the developmental psychology course. This method mainly involves a practical question. The CEQ used in the tests was: “Create an activity to work on understanding other people’s emotions in a class of 5-year-olds”. With a score between 0 and 10 points, the structure of the activity, creativity, and fit with the contents of the course were used as criteria to evaluate performance on the CEQ.
  • Elaboration Questions on the Relationship between Theory and Practice (EQRTP). In this open-ended assessment question, students must link theoretical concepts to practical application. A video is used to illustrate a teaching-learning process between children and an expert, and various interactions between children. After watching the video, students must analyze its content using theoretical concepts. Below is the examination question and a link to the video used (https://www.dropbox.com/s/xux6p9di5pj885m/Sustainability.mp4?dl=0) (accessed on 29 January 2021) [48] “Associate the following video with theoretical contents of the Developmental Psychology course”. The number of associations between practical aspects and theoretical contents, coherence between concepts, presentation of theory and clarity in the written composition were used as criteria to assess performance from 0 to 10.

2.3. Procedure

This research forms part of a teaching innovation project of the Department of Evolutionary Psychology and Education, financed by the US from 2017 to 2019. The aim of this project is to optimize teaching-learning processes in the university setting through teacher training based on the current needs of the student body [49].
The data collection procedure was similar for all three degrees. Learning styles were assessed in all students who took the final examination for the Developmental Psychology course. In addition, to systematize the process, the same teacher was in charge of the teaching-learning process (researcher of this study) in all the groups. Prior to the start of the study, the students were informed of the objectives of the study on learning styles and performance, as well as its voluntary nature. The confidentiality of the data and respect for participants and human rights were guaranteed, in compliance with the guidelines of the Declaration of Helsinki. Informed consent to use the data for research purposes was signed by all participants. This study was carried out according to the ethical principles of anonymity, responsibility, and equality in the framework of a teaching innovation project financed and approved by the US (Resolution 11 October 2017, “Support for Teaching Coordination and Innovation”, Department of Developmental and Educational Psychology, US).
Both instruments were administered on the same day, first the LSI-III then the performance assessment. The instructions for completing the LSI-III were explained in more detail on the day of the assessments, placing special emphasis on the non-evaluative nature of the questionnaire in order to avoid response bias. The instruments were administered in person, in written format (paper-based). The total duration of the evaluation was about 135 min: 15 min for the sociodemographic profile and the LSI-III questionnaire and 105 min for the content assessment in sequential order: MCQ, SQ, CEQ and EQRTP.

2.4. Data Analysis

We conducted an interdisciplinary, cross-sectional, observational, and quantitative study using the LSI-III, a psychometric survey validated in multiple countries and contexts [17]. Analyses were performed with SPSS software, version 26 [50]. Before the statistical analyses, several tests were conducted. First, missing data for items were studied with missing value analysis. Little’s MCAR test was used to confirm that the data were missing completely at random. Missing data were less than 5% per item and missing items less than 10% per scale, following a random distribution. Accordingly, structural equation modeling was used to impute data with the expectation-maximization algorithm in SPSS. Twelve students were eliminated due to not answering ipsatively on the LSI-III. Regarding atypical and influential cases, both univariate and multivariate cases were identified. Subsequently, we checked the necessary prior assumptions for the use of parametric tests in the univariate and multivariate analyses.
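As a rough illustration of this screening step, the sketch below (hypothetical data layout, not the authors' SPSS pipeline) checks item-level missingness against the thresholds reported above and imputes remaining gaps; Little's MCAR test is not reproduced here, and scikit-learn's IterativeImputer is only a stand-in for SPSS's EM imputation:

```python
# Missing-data screening sketch under assumed column layout
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

def screen_and_impute(items: pd.DataFrame, max_missing: float = 0.05) -> pd.DataFrame:
    """items: one row per student, one column per LSI-III item score."""
    missing_per_item = items.isna().mean()
    too_sparse = missing_per_item[missing_per_item > max_missing]
    if not too_sparse.empty:
        raise ValueError(f"Items above {max_missing:.0%} missing:\n{too_sparse}")
    imputed = IterativeImputer(random_state=0).fit_transform(items)
    return pd.DataFrame(imputed, columns=items.columns, index=items.index)
```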
With regard to the study variables, academic performance was divided into three levels: high, medium, and low. For this classification, each student’s performance was categorized into a level (high, medium, or low) for each assessment method, using as a reference the accumulated percentiles and their tertile position in their class group. For the consistency variable, three levels were established (consistent students with medium-high performance, consistent students with medium-low performance, and inconsistent students). Consistent students performing at the medium-high level fall in the first and second tertiles on all assessment methods. Consistent students performing at the medium-low achievement level are students performing in the second and third tertiles in all assessment methods. Finally, inconsistent students are those who score high in at least one assessment method and score low in at least one other.
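The level and consistency coding can be summarized as follows (a hypothetical helper, assuming one 0–10 score per student and method plus a class-group identifier; an all-"medium" student is counted here as consistent medium-high, which is one possible reading of the definition above):

```python
# Sketch of the tertile-based performance levels and consistency groups
import pandas as pd

METHODS = ["MCQ", "SQ", "CEQ", "EQRTP"]

def tertile_levels(scores: pd.DataFrame, class_group: pd.Series) -> pd.DataFrame:
    """Assign a low/medium/high level per method, using tertiles within each class group."""
    levels = pd.DataFrame(index=scores.index)
    for m in METHODS:
        levels[m] = scores.groupby(class_group)[m].transform(
            lambda s: pd.qcut(s, 3, labels=["low", "medium", "high"])
        )
    return levels

def consistency(levels_row: pd.Series) -> str:
    vals = set(levels_row)
    if "low" not in vals:
        return "consistent medium-high"
    if "high" not in vals:
        return "consistent medium-low"
    return "inconsistent"  # at least one high and at least one low level

# Usage: groups = tertile_levels(scores, class_group).apply(consistency, axis=1)
```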
For the univariate analyses, the descriptive statistics are presented using means and standard deviations and frequency and percentages, thereby describing their central tendency and variability. For the bivariate analyses, ANOVA was calculated using Snedecor’s F statistic, Bonferroni’s post hoc test, and contingency tables.
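A minimal sketch of this bivariate step is shown below (hypothetical column names such as "perception" and "mcq_level"; SciPy is used instead of SPSS):

```python
# One-way ANOVA with Bonferroni-corrected pairwise comparisons
from itertools import combinations
from scipy import stats

def anova_with_bonferroni(df, dv: str = "perception", factor: str = "mcq_level"):
    groups = {name: sub[dv].to_numpy() for name, sub in df.groupby(factor)}
    f_stat, p_omnibus = stats.f_oneway(*groups.values())   # omnibus F test
    pairs = list(combinations(groups, 2))
    pairwise = {}
    for a, b in pairs:
        _, p = stats.ttest_ind(groups[a], groups[b])
        pairwise[(a, b)] = min(p * len(pairs), 1.0)         # Bonferroni adjustment
    return f_stat, p_omnibus, pairwise
```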
For the multivariate analyses, binomial logistic models were computed, considering Hosmer–Lemeshow chi-square distribution to calculate goodness of fit, as well as the correct classification rate of the observed and predicted subjects of the resulting model. Nagelkerke’s pseudo-R2 statistic was used to assess the degree of explanation of the resulting model. After creating the model and satisfactorily confirming its viability, the meaning and direction of the coefficients were examined through the Wald statistic and odds ratios.
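For readers who want to reproduce this step outside SPSS, the following sketch (assumed variable names, not the authors' syntax) fits the binomial model with statsmodels and derives odds ratios and Nagelkerke's pseudo-R²; the Hosmer–Lemeshow test is omitted for brevity:

```python
# Binomial logistic model: consistency group on perception and processing
import numpy as np
import statsmodels.api as sm

def fit_consistency_model(df):
    # y: 1 = consistent medium-high, 0 = consistent medium-low (inconsistent students excluded)
    y = df["medium_high"]
    X = sm.add_constant(df[["perception", "processing"]])
    model = sm.Logit(y, X).fit(disp=False)

    n = len(y)
    cox_snell = 1 - np.exp(2 * (model.llnull - model.llf) / n)
    nagelkerke = cox_snell / (1 - np.exp(2 * model.llnull / n))
    odds_ratios = np.exp(model.params)   # exponentiated coefficients

    return model, nagelkerke, odds_ratios
```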

3. Results

3.1. Learning Styles According to Personal and Educational Dimensions

Taking into account the main learning dimensions, overall, the students achieved a mean score of 5.23 (SD = 9.94, Min. = −26, Max. = 32) in perception. Thus, AC had a mean score of 31.09 (SD = 6.01, Min. = 15, Max. = 46), higher (t (288) = −8.95, p < 0.001) than that obtained in CE of 25.86 (SD = 5.78, Min. = 13, Max. = 48). The mean for processing was 4.83 points (SD = 9.63, Min. = −22, Max. = 24), with AE obtaining a mean of 33.94 (SD = 5.89, Min. = 17, Max. = 47), higher (t (288) = −8.53, p < 0.001) than that found for RO (M = 29.11, SD = 5.63, Min. = 15, Max. = 48). The participants were therefore distributed heterogeneously in terms of learning styles, which were mainly divergent (n = 88, 30.4%), followed by accommodating (n = 86, 29.8%), assimilating (n = 70, 24.2%), and, finally, converging (n = 45, 15.6%). Figure 1 illustrates the distribution of the scores.
The statistics for the distribution of scores by sociodemographic (sex and age) and educational (degree) dimensions are provided in Table 1. As can be seen, the score distributions were similar according to age and sex. Nevertheless, both learning dimensions and learning styles varied by university degree. Thus, according to the post hoc analyses, psychology students achieved higher scores in perception compared to Prim. Ed. students (mean difference [MD] = 4.41, p = 0.008) and E.C. Ed. students (MD = 4.78, p = 0.013), especially in AC (Prim. Ed., MD = 3.55, p < 0.001; E.C. Ed., MD = 4.57, p < 0.001). Regarding processing, although no differences were seen in this learning dimension, differences were found in AE and RO. Compared to psychology students, primary education students tended to obtain higher scores (MD = 2.23, p = 0.034) in AE and E.C. Ed. students in RO (MD = 2.29, p = 0.048).
Finally, concerning learning styles, in absolute terms, the divergent (theoretical) style predominated in psychology, while Prim. Ed. and E.C. Ed. students showed divergent and convergent (practical) styles. When comparing degrees, we observed a heterogeneous distribution. Bearing in mind the adjusted standardized residuals, the analyses indicated that the psychology students were characterized by more assimilative styles and less divergent and convergent styles. The E.C. Ed. students were uniformly distributed, while the primary education students showed a lower tendency towards the assimilating style.

3.2. Assessment Methods: Academic Performance and Consistency and Their Relation to Learning Dimensions and Styles

Analyses of variance were calculated to examine the relationship between learning dimensions and performance levels for each type of assessment method (see Table 2). While no differences in performance on the assessment tests were noted for the processing dimension (AE − RO), for perception (AC − CE) differences were seen when comparing means on the MCQ and the EQRTP. Specifically, Bonferroni’s post hoc tests on the MCQ assessment method revealed that the lowest-performing students showed significantly lower scores in perception compared to the medium-high performance group (MD = −5.76, p = 0.022). In relation to the EQRTP, a trend toward significance was noted. In this case, Bonferroni’s post hoc tests revealed that high-performing students tended to have a higher level of perception than those students with average performance in the open-ended EQRTP question (MD = −3.48, p = 0.075).
With respect to score consistency across the assessment methods, a significant difference was noted between the consistency groups. Students who maintained consistent medium-high performance had a significantly higher perception score compared to students who had consistent medium-low performance (MD = 4.22, p = 0.024).
In terms of learning styles, chi-square tests and standardized residuals showed a significant variation in the EQRTP assessment test. The students who did not have a convergent style performed poorly on this test, while the convergent students achieved high scores.

3.3. Influence of Learning Dimensions on Performance Consistency in the Different Assessment Methods

A binomial logistic regression analysis was performed to analyze the influence of the learning dimensions on the consistency of student performance in the different assessment tests. The perception and processing dimensions were selected as covariates and consistency (medium-high and medium-low performance) as the dependent variable. With a total of 152 students, the classification table of the initial model correctly predicted the position of 51.3% of students, with the model constant proving non-significant in the absence of the covariates. After entering the covariates, the proportion of variance explained by the model reached 6.1%, yielding a significant change (p = 0.029). Table 3 presents the resulting model showing that the perception dimension significantly explains the distribution of students according to a consistent medium-high or medium-low performance. Specifically, a unit of change in perception increased the odds of belonging to the consistent medium-high performance group by 4.3%. In this case, the percentage of correct classification in the final model increased to 59.2%.
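As a point of reference, the reported 4.3% corresponds to an odds ratio of roughly 1.043; the coefficient below is inferred from that figure rather than taken from Table 3:

```python
import numpy as np

odds_ratio = 1.043                    # implied by the reported 4.3% increase in odds
beta = np.log(odds_ratio)             # corresponding logistic coefficient, ~0.042
print(f"{(odds_ratio - 1) * 100:.1f}% higher odds per unit of perception")
```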

4. Discussion

This study aims to provide recommendations for promoting excellence in teaching-learning processes [6,51]. Following the 2030 agenda for sustainable development goals [7], the design of this study focuses on students and how they learn. Given the diversity and specialization of each university degree, we studied separately the learning styles of primary education students and early childhood education students, rather than grouping them together under the broad single category of education as is usually done in university populations [2,17,52]. This is one of the strengths of this study, as it is the first to make this distinction. Moreover, the data not only focus on the association between learning styles and performance, but also further examine the results according to assessment method (closed and open questions) and the consistency/inconsistency of performance among assessment methods.
In line with our first objective, the exploratory results show a trend towards an accommodating and diverging learning style when considering the sample as a whole. These learning styles show a common tendency towards CE in the perception dimension. These results support those obtained in the application of experiential learning theory in higher education with psychology and education students, who, as a whole, tend to score towards CE on the perception dimension, preferring divergent (psychology) and accommodating (education) learning styles [2,17]. However, when describing and comparing learning styles between university degrees, the results show that psychology students are predominantly assimilators; that is, they are very theoretical students (tendency towards AC and RO). This evidence is partially in contrast to the findings of the original authors of experiential learning theory, who considered psychology students to be divergent. These results may be explained by the curricular differences between the initial psychology courses in North America and Europe [53]. Beginning in the first years of study, the curricula of European universities include, to a greater extent, specific courses in which different theoretical models are contrasted. Among Prim. Ed. and E.C. Ed. students, learning styles were quite consistent. Previous evidence suggested a mainly accommodating, i.e., practical, learning style for Prim. Ed. students [2]. These data are partially reproduced here, although the diverging style also emerges in balance with the accommodating style. When comparing scores between degrees, psychology students showed a significantly higher score in the perception dimension (higher AC) than E.C. Ed. and Prim. Ed. students. No differences were observed in the processing dimension, although psychology students did show a higher score in RO. These results partially confirm our hypothesis of finding higher RO among psychology students, which is associated with a greater ability to transform theoretical contents [2]. Consequently, the findings of this research differ in part from previous evidence, generally from the North American context. Therefore, this study raises a new question to be addressed in further research: whether culture and educational planning across the developmental progression influence learning styles. Indeed, the original authors of experiential learning theory have acknowledged that cultural influences can affect learning styles [2,15]. In the sociodemographic dimensions, no significant differences were found according to the sex or age of the students. These data support the first hypothesis of the study and are consistent with previous research [2,54].
A central aspect of this study was the analysis of the possible differences between Prim. Ed. students and E.C. Ed. students. In general terms, the distribution of learning styles is similar, with only one difference: E.C. Ed. students tend to be more assimilative than Prim. Ed. students, who are more convergent. That is, E.C. Ed. students tend towards more theoretical processing than Prim. Ed. students. According to previous studies [2,15], Prim. Ed. students tend to prefer more active processing of content versus reflective processing. However, the profile of E.C. Ed. students shows that they tend to be more diverse in their learning styles. This study is the first to provide evidence concerning learning preferences in E.C. Ed. students and shows the relevance of a detailed analysis of individual academic degrees without generalizing the data for all education degrees, as has been the case until now.
With regard to our second objective, the associations between learning dimensions and level of academic performance were examined for each assessment method: closed questions (MCQ and SQ, in this case) and open-ended questions (CEQ and EQRTP). Contrary to our initial hypothesis, high-achieving students showed no significant differences in processing (AE and RO) in any of the four assessment methods. These results do not coincide with other results in the Spanish context, possibly because the most rigorous studies in Spain to date have focused on engineering students whose exams are more numerical in nature and very different from those of psychology and education students [22]. However, the results partially support our initial hypothesis, in that the tendency towards AC in perception is associated with high performance levels in the MCQ and the EQRTP. Findings from previous studies are partially consistent with our results. Specifically, theoretical students are associated with better performance in a variety of assessment tests [33,43,46]. Therefore, completing tasks that involve complex cognitive processes can improve performance in students with a preference for forming abstract relationships. Accordingly, in this study, the closed questions posed (MCQ and SQ) involve different cognitive processes. The MCQ favors recognition memory and also influences establishing relationships between response alternatives in order to identify the correct option [32]. In contrast, the SQ focus on naming a specific concept. It is therefore consistent that AC is a discriminating variable for high performance as the difficulty of the assessment method increases.
Similarly, elaboration questions tend to be less preferred by students [35]. In this case, two types of open-ended questions were posed (CEQ and EQRTP): one involving the elaboration of an activity without establishing explicit relationships with theoretical elements, and a second in which, through visualizing a teaching-learning situation, students are asked to perform a theoretical analysis of the situation depicted. Consistent with the closed questions, high-performing students show higher scores in AC. Again, this is a complex assessment method, which involves a process of observing a practical situation, analysis, associating theory and practice, planning, and elaborating content, facilitated by high scores in AC and a convergent style. Previous studies have revealed that convergent students often achieve higher levels of performance, similar to our results [43,46]. Perhaps these students engage with ease on two levels, both in theoretical and practical analysis, to establish two-way paths between theory and practice, which would enhance their performance on tests requiring simultaneous analysis and cognitive processes. Consequently, the data reveal that it is not only important to pay attention to the difference in results between closed and open-ended questions [33,34], but that within each assessment method, there may be different types of tests with different implications depending on the complexity of the assessment.
Considering the differences in the learning dimensions of consistent/inconsistent student performance, the results support our hypothesis that scores for perception were significantly higher, tending towards AC, in consistent students with medium-high performance compared to consistent students with medium-low performance. In fact, this study provides partial support for studies reporting better performance by students who tend to be more theoretical and poorer results among students who tend to be more practical [22]. However, we should not make reductionist interpretations given that in many cases, assessment methods favor students with a more theoretical approach to learning, such as assimilating students. This would explain why in some studies the more theoretical students show better performance compared to the results obtained in this research [21,22]. Inconsistent students (high performance in one assessment method and low performance in another method) had mean scores between consistent medium-high and consistent medium-low performers. That 31% of students showed inconsistent performance across assessment methods highlights the diversity in approaches to tasks. Previous evidence has been controversial on this issue [36,55,56]. This study supports claims that student performance can be influenced by assessment methods [33,38]. A possible bias of the study was the influence of the first assessment method (MCQ) on the subsequent ones [57]. However, given that nearly one third of the sample showed inconsistency in their level of performance, and that the order of presentation of the assessment methods was the same in all groups, the impact of any testing effect is likely to be low. These results support the use of varied evaluation methods by teachers. The choice of assessment method can influence student performance according to their learning preferences. Moreover, using a variety of assessment methods can help students with different learning profiles to demonstrate their competencies effectively. For example, open-ended questions require students to make greater use of recall memory, planning strategies, and the elaboration of abstract relationships that must be conceptualized in question wording [15,33], whereas MCQ involves processes of choice analysis associated with recognition memory [31].
Finally, regarding the third objective, our starting hypothesis considered the perception dimension as a potential predictor of successful performance regardless of the assessment method used. The data showed that, in line with our hypothesis, high scores in the perception dimension (higher AC) significantly explain students’ belonging to the consistent medium-high performance group in all assessment methods. However, the processing dimension was not predictive of the potential consistency between assessment methods and performance in each method. Ultimately, in contrast to previous studies that stress the importance of RO [21,22], our findings show that AC is the variable associated with successful performance irrespective of whether the student’s preference for establishing conceptual relationships is subsequently complemented by AE or reflective analysis.
The main limitation of this study is the extent of the interaction between academic performance and assessment methods, which would have been greater had there been a greater number of participants per university degree, as well as additional degrees. In this line, one factor that could affect the external validity of our study is the composition of the sample. On the one hand, Prim. Ed. students are overrepresented in the sample, which was also composed mainly of women. Although this is a limitation, the distribution between men and women in these degrees in Spain (psychology and education) follows a trend of unequal enrollment [58]. Therefore, complementing the study with degrees in which more male students are enrolled may enhance the interaction and effects between educational and sociodemographic dimensions according to experiential learning theory. On the other hand, the participants in this study come from degrees in which the study of learning processes is a central topic.
Future lines of research include other dimensions related to the teaching-learning situation such as study time or motivation, or aspects such as exam anxiety. Likewise, another future line is to incorporate the effect of different teaching methodologies on the associations found, for example, collaborative methods or the use of virtual environments [59,60,61]. Teachers should also be involved in the study of learning styles. Defining the relationship between learning styles within the teacher-centered educational process may generate tools for the dynamic assessment of students, taking into account all educational agents [51,62]. At the same time, the distributed and long-term assessment of learning styles and dimensions during the academic degree would enable the optimization of sequential resources to help achieve the goals of sustainable development in education, meeting the specific needs of the student throughout the curricular process. Additionally, to confirm a theoretical model that associates learning styles and performance according to the assessment method, studies are needed in different areas of knowledge such as engineering, art, and health sciences, among others. These studies would make it possible to determine the relationship between the variables studied (learning styles, performance, and assessment methods), establishing their independence or dependence as a function of the teaching content.

5. Conclusions

The present study provides evidence for the use of teaching-learning processes based on the identification of learning styles in university students to promote sustainable, holistic, and individualized education. The data show that there are different learning preferences in the three degrees evaluated: psychology, primary education and early childhood education. Thus, psychology students stand out for their abstract perception of the contents and their more theoretical orientation, Prim. Ed. students for their tendency to actively process the contents, while E.C. Ed. students show a very diverse profile in their learning styles and information processing that tends to be more theoretical compared to their peers in Prim. Ed. In addition, this study emphasizes the influence of the different learning dimensions (specifically, AC and converging style) on academic performance according to the various assessment methods used. Previous evidence has associated learning styles with academic performance [17,22]. This study, however, provides one of the first contributions to associate learning dimensions and academic performance in four assessment methods: two open-ended assessments and two closed-ended assessments. Different results are found for each assessment method. Accordingly, high scores in perception and AC in particular are more likely to be significant in complex assessment methods: MCQ and open-ended questions linking practical experiences with theory. Therefore, regardless of whether the assessment method corresponds to closed-ended questions or open-ended questions, the increased difficulty of the assessment seems to be addressed more successfully by those students with higher scores on AC. Similarly, this study finds that students can have consistent medium-high academic performance, consistent medium-low performance, or inconsistent scores according to the assessment method. In fact, almost one third of the student sample shows inconsistent performance depending on the evaluation method. This supports the evidence proposed by other authors that academic performance may vary according to the assessment method used by teachers [33,38]. Therefore, the implications of using different assessment methods should be explored. In this line, AC has been revealed as the dimension that predicts medium-high performance in all assessment tests.
The implication of this study is not only descriptive. The aim is to ensure that assessment methods are consistent with teaching processes in university education and sensitive to learning styles. A proven shortcoming is that the use of assessment methods that are inconsistent with individual teaching processes can lead to student academic performance that does not correspond to the actual learning acquired. Given the importance of AC, teachers in both undergraduate education and higher education should encourage training in this dimension. Activities that can be implemented in the educational setting to strengthen the abstract processes of perception include the use of concept maps, associating texts with theories, producing syntheses and generalizations, establishing hierarchies and classifications, identifying similarities and differences between theories, and linking them with practice.

Author Contributions

Funding acquisition, J.M.; data collection, J.M. and J.F.L.; conceptualization, J.M., J.F.L. and J.P.-P.; methodology, J.M., J.F.L. and J.P.-P.; formal analysis, J.M., J.F.L. and J.P.-P.; investigation, J.M., J.F.L. and J.P.-P.; writing—original draft preparation, J.M., J.F.L. and J.P.-P.; writing—review and editing, J.F.L.; supervision, J.M. and J.P.-P. All authors have read and agreed to the published version of the manuscript.

Funding

This study is part of a teaching innovation project funded by the University of Seville.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the University of Seville (Resolution 11 October 2017, “Support for Teaching Coordination and Innovation”, Department of Developmental and Educational Psychology, US).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Acknowledgments

We wish to thank all the students for their participation in this study. The authors would like to thank María Repice for her help with the English revision of the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cardoso, S.; Rosa, M.J.; Stensaker, B. Why is quality in higher education not achieved? The view of academics. Assess. Eval. High. Educ. 2016, 41, 950–965. [Google Scholar] [CrossRef]
  2. Kolb, A.Y.; Kolb, D.A. The Kolb Learning Style Inventory 4.0: A Comprehensive Guide to the Theory, Psychometrics, Research on Validity and Educational Applications; Hay Resources Direct: Boston, MA, USA, 2013. [Google Scholar]
  3. Tormo-Carbó, G.; Seguí-Mas, E.; Oltra, V. Business ethics as a sustainability challenge: Higher education implications. Sustainability 2018, 10, 2717. [Google Scholar] [CrossRef] [Green Version]
  4. Ghazivakili, Z.; Norouzi Nia, R.; Panahi, F.; Karimi, M.; Gholsorkhi, H.; Ahmadi, Z. The role of critical thinking skills and learning styles of university students in their academic performance. J. Adv. Med. Educ. Prof. 2014, 2, 95–102. [Google Scholar]
  5. Hendry, G.D.; Heinrich, P.; Lyon, P.M.; Barratt, A.L.; Simpson, J.M.; Hyde, S.J.; Gonsalkorale, S.; Hyde, M.; Mgaieth, S. Helping students understand their learning styles: Effects on study self-efficacy, preference for group work, and group climate. Educ. Psychol. 2005, 25, 395–407. [Google Scholar] [CrossRef]
  6. Deem, R.; Baird, J.A. The English Teaching Excellence (and Student Outcomes) Framework: Intelligent accountability in higher education? J. Educ. Chang. 2020, 21, 215–243. [Google Scholar] [CrossRef] [Green Version]
  7. United Nations. Transforming Our World: The 2030 Agenda for Sustainable Development; United Nations Population Fund: New York, NY, USA, 2015. [Google Scholar]
  8. Watson, M.K.; Pelkey, J.; Noyes, C.; Rodgers, M.O. Using Kolb’s Learning Cycle to Improve Student Sustainability Knowledge. Sustainability 2019, 11, 4602. [Google Scholar] [CrossRef] [Green Version]
  9. Alonso, C.; Gallego, D.; Honey, P. Los Estilos de Aprendizaje. Procedimientos de Diagnóstico y Mejora; Mensajero: Bilbao, Spain, 1997. [Google Scholar]
  10. Kalleberg, A.L.; Dunn, M. Institutional Determinants of Labor Market Outcomes for Community College Students in North Carolina. Community Coll. Rev. 2015, 43, 224–244. [Google Scholar] [CrossRef]
  11. Kolb, D.A. Experiential Learning: Experience as the Source of Learning and Development; Prentice Hall: Englewood Cliffs, NJ, USA, 1984; pp. 20–38. [Google Scholar]
  12. White, R.K.; Jantrania, A. Computer program design for land treatment systems. Biocycle 1989, 30, 66–67. [Google Scholar]
  13. Keefe, J.W. Profiling and Utilizing Learning Style; NASSP Learning Style Series; NASSP: Reston, VA, USA, 1988. [Google Scholar]
  14. Myers, I.B. MBTI Manual: A Guide to the Development and Use of the Myers-Briggs Type Indicator; Consulting Psychologists Press: Palo Alto, CA, USA, 1998. [Google Scholar]
  15. Kolb, A.Y.; Kolb, D.A. Learning styles and learning spaces: Enhancing experiential learning in higher education. Acad. Manag. Learn. Educ. 2005, 4, 193–212. [Google Scholar] [CrossRef] [Green Version]
  16. Sadler-Smith, E. Does the Learning Styles Questionnaire Measure Style or Process? a Reply to Swailes and Senior (1999). Int. J. Sel. Assess. 2001, 9, 207–214. [Google Scholar] [CrossRef]
  17. Kolb, A.Y.; Kolb, D.A. The Kolb Learning Style Inventory—Version 3.1: 2005 Technical Specifications; Hay Resources Direct: Boston, MA, USA, 2005. [Google Scholar]
  18. Romero, L.N.; Salinas, V.; Mortera, F.J. Estilos de aprendizaje basados en el modelo de Kolb en la educación virtual. Apert. Rev. Innov. Educ. 2010, 2, 72–85. [Google Scholar]
  19. Sabariego, M.; Regos, R.A.; Villaverde, V.A.; José, M.; Tarazona, C.; Benito, V.D.; Serrano, L.R.; Dolores, M.; González, S. El Pensamiento Reflexivo a Través De Las Metodologías Narrativas: Experiencias de Innovación en Educación Superior; Octaedro: Barcelona, Spain, 2018. [Google Scholar]
  20. Rodríguez, H.; Pirul, J.; Robles, J.; Pérez, L.; Vásquez, E.; Galaz, I.; Cuellar, C.; Díaz, H.; Arriaza, C. Analysis of learning styles in Medical students in the University of Chile. Educ. Med. 2018, 19, 2–8. [Google Scholar] [CrossRef]
  21. Bahamón, M.; Viancha, M.; Alarcón, L.; Bohorquez Olaya, C.I. Estilos y estrategias de aprendizaje: Una revisión empírica y conceptual de los últimos diez años. Pensam. Psicol. 2012, 10, 129–144. [Google Scholar]
  22. Gargallo, B.; Almerich, G.; Suárez, J.M.; García, E.; Garfella, P.R. Learning styles and approaches to learning in excellent and average first-year university students. Eur. J. Psychol. Educ. 2013, 28, 1361–1379. [Google Scholar] [CrossRef]
  23. Gravini, M.; Cabrera, E.; Ávila, V.; Vargás, I. Estrategias de enseñanza en docentes y estilos de aprendizaje en estudiantes del programa de psicología de la Universidad Simón Bolívar, Barranquilla. Rev. Estilos Aprendiz 2009, 3, 124–140. [Google Scholar]
  24. İlçin, M.; Tomruk, N.; Yeşilyaprak, S.S.; Karadibak, D.; Savcı, S. The relationship between learning styles and academic performance in Turkish physiotherapy students. Physiotherapy 2016, 102, e84–e85. [Google Scholar] [CrossRef] [Green Version]
  25. Brown, T.; Cosgriff, T.; French, G. Learning style preferences of occupational therapy, physiotherapy and speech pathology students: A comparative study. Internet J. Allied Health Sci. Pract. 2008, 6, 7. [Google Scholar]
  26. D’Amore, A.; James, S.; Mitchell, E.K.L. Learning styles of first-year undergraduate nursing and midwifery students: A cross-sectional survey utilising the Kolb Learning Style Inventory. Nurse Educ. Today 2012, 32, 506–515. [Google Scholar] [CrossRef] [PubMed]
  27. Shein, P.P.; Chiou, W.-B. Teachers as role models for students’ learning styles. Soc. Behav. Pers. 2011, 39, 1097–1104. [Google Scholar] [CrossRef]
  28. Steinmayr, R.; Meißner, A.; Weidinger, A.F.; Wirthwein, L. Academic Achievement. In Oxford Bibliographies; Oxford University Press: New York, NY, USA, 2015. [Google Scholar]
  29. Ackerman, D.S.; Hu, J. Effect of type of curriculum on educational outcomes and motivation among marketing students with different learning styles. J. Mark. Educ. 2011, 33, 273–284. [Google Scholar] [CrossRef]
  30. Urda, J.; Ramocki, S.P. Assessing students’ performance by measured patterns of perceived strengths: Does preference make a difference? Assess. Eval. High. Educ. 2015, 40, 33–44. [Google Scholar] [CrossRef]
  31. Pham, H.; Trigg, M.; Wu, S.; O’Connell, A.; Harry, C.; Barnard, J.; Devitt, P. Choosing medical assessments: Does the multiple-choice question make the grade? Educ. Health Chang. Learn. Pract. 2018, 31, 65–71. [Google Scholar] [CrossRef]
  32. Schuwirth, L.W.T.; Van Der Vleuten, C.P.M.; Donkers, H.H.L.M. A closer look at cueing effects in multiple-choice questions. Med. Educ. 1996, 30, 44–49. [Google Scholar] [CrossRef]
  33. Scouller, K. The influence of assessment method on students’ learning approaches: Multiple choice question examination versus assignment essay. High. Educ. 1998, 35, 453–472. [Google Scholar] [CrossRef]
  34. Palmer, E.J.; Duggan, P.; Devitt, P.G.; Russell, R. The modified essay question: Its exit from the exit examination. Med. Teach. 2010, 32. [Google Scholar] [CrossRef] [PubMed]
  35. Struyven, K.; Dochy, F.; Janssens, S. Students’ perceptions about evaluation and assessment in higher education: A review. Assess. Eval. High. Educ. 2005, 30, 325–341. [Google Scholar] [CrossRef]
  36. Bleske-Rechek, A.; Zeug, N.; Webb, R.M. Discrepant performance on multiple-choice and short answer assessments and the relation of performance to general scholastic aptitude. Assess. Eval. High. Educ. 2007, 32, 89–105. [Google Scholar] [CrossRef] [Green Version]
  37. Hogan, T. Constructed-response approaches for classroom assessment. In Sage Handbook of Research on Classroom Assessment; McMillan, J., Ed.; Sage: Thousand Oaks, CA, USA, 2013; pp. 275–293. [Google Scholar]
  38. Bridgeman, B.; Morgan, R. Success in College for Students with Discrepancies between Performance on Multiple-Choice and Essay Tests. J. Educ. Psychol. 1996, 88, 333–340. [Google Scholar] [CrossRef]
  39. Gilar-Corbi, R.; Pozo-Rico, T.; Castejón, J.L.; Sánchez, T.; Sandoval-Palis, I.; Vidal, J. Academic achievement and failure in university studies: Motivational and emotional factors. Sustainability 2020, 12, 9798. [Google Scholar] [CrossRef]
  40. Schneider, M.; Preckel, F. Variables associated with achievement in higher education: A systematic review of meta-analyses. Psychol. Bull. 2017, 143, 565–600. [Google Scholar] [CrossRef] [PubMed]
  41. Yazici, H.J. Role of learning style preferences and interactive response systems on student learning outcomes. Int. J. Inf. Oper. Manag. Educ. 2016, 6, 109. [Google Scholar] [CrossRef]
  42. Lynch, T.G.; Woelfl, N.N.; Steele, D.J.; Hanssen, C.S. Learning style influences student examination performance. Am. J. Surg. 1998, 176, 62–66. [Google Scholar] [CrossRef]
  43. Rutz, E. Learning Styles and Educational Performance: Implications for Professional Development Programs. In Proceedings of the CIEC Conference, Tucson, AZ, USA, 28–31 January 2003. [Google Scholar]
  44. Camarero, F.J.; de Martín, F.A.; Herrero, F.J. Estilos y estrategias de aprendizaje en estudiantes universitarios. Psicothema 2000, 12, 615–622. [Google Scholar]
  45. Oughton, J.M.; Reed, W.M. The effect of hypermedia knowledge and learning style on student—centered concept maps about hypermedia. J. Res. Comput. Educ. 2000, 32, 366–384. [Google Scholar] [CrossRef]
  46. Cornwell, J.M.; Manfredo, P.A. Kolb’s Learning Style Theory Revisited. Educ. Psychol. Meas. 1994, 54, 317–327. [Google Scholar] [CrossRef]
  47. Socarrás, V.; Donat, R.; Fornons, D.; Vaque, C.; Milà, R. Estilos de aprendizaje identificados según el modelo VARK y el cuestionario de Kolb: Implicación en la Educación para la Salud. Cult. Educ. Soc. 2015, 6, 63–76. [Google Scholar]
  48. La Vida Secreta de los Niños: ¡Es el cumpleaños de Eneko! [Video]. Available online: https://www.youtube.com/watch?v=Y3NSTCosyqA (accessed on 11 October 2018).
  49. Maya, J.; Maraver, J. Teaching-learning processes: Application of educational psychodrama in the university setting. Int. J. Environ. Res. Public Health 2020, 17, 3922. [Google Scholar] [CrossRef]
  50. IBM Corp. IBM SPSS Statistics for Windows, Version 26.0; IBM Corp.: Armonk, NY, USA, 2019. [Google Scholar]
  51. Bhattacharyya, E. Learning Style and Its Impact in Higher Education and Human Capital Needs. Procedia Soc. Behav. Sci. 2014, 123, 485–494. [Google Scholar] [CrossRef] [Green Version]
  52. Newton, P.M.; Miah, M. Evidence-based higher education—Is the learning styles «myth» important? Front. Psychol. 2017, 8, 444. [Google Scholar] [CrossRef] [Green Version]
  53. Dryjanska, L.; Giua, M. Diversified path to the psychological career: Europe vs. USA. Int. Psychol. Bull. 2016, 20, 51–56. [Google Scholar]
  54. Yonker, J.E. The relationship of deep and surface study approaches on factual and applied test-bank multiple-choice question performance. Assess. Eval. High. Educ. 2011, 36, 673–686. [Google Scholar] [CrossRef]
  55. Bataineh, Z.M. Comparison of Students’ Performance in Multiple-Choice vs. Short Answer Question Formats in Anatomy Spotter Exams. FASEB J. 2019, 33, 439.3. [Google Scholar] [CrossRef]
  56. Li, C.; Zhou, H. Enhancing the Efficiency of Massive Online Learning by Integrating Intelligent Analysis into MOOCs with an Application to Education of Sustainability. Sustainability 2018, 10, 468. [Google Scholar] [CrossRef] [Green Version]
  57. Roediger, H.L.; Karpicke, J.D. The Power of Testing Memory: Basic Research and Implications for Educational Practice. Perspect. Psychol. Sci. J. Assoc. Psychol. Sci. 2006, 1, 181–210. [Google Scholar] [CrossRef] [PubMed]
  58. Sáinz, M.; Martínez-Cantos, J.L.; Rodó-de-Zárate, M.; Romano, M.J.; Arroyo, L.; Fàbregues, S. Young Spanish People’s Gendered Representations of People Working in STEM. A Qualitative Study. Front. Psychol. 2019, 10, 996. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  59. González-Zamar, M.D.; Abad-Segura, E. Implications of Virtual Reality in Arts Education: Research Analysis in the Context of Higher Education. Educ. Sci. 2020, 10, 225. [Google Scholar] [CrossRef]
  60. Magulod, G. Learning styles, study habits and academic performance of Filipino University students in applied science courses: Implications for instruction. J. Tech. Sci. Educ. 2019, 9, 2. [Google Scholar] [CrossRef] [Green Version]
  61. Wang, R.; Lowe, R.; Newton, S.; Kocaturk, T. Task complexity and learning styles in situated virtual learning environments for construction higher education. Autom. Constr. 2020, 113, 103148. [Google Scholar] [CrossRef]
  62. Buldu, M.; Buldu, N. Concept mapping as a formative assessment in college classrooms: Measuring usefulness and student satisfaction. Procedia Soc. Behav. Sci. 2010, 2, 2099–2104. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Spatial image of the distribution of the participants’ scores according to learning styles as classified by Kolb and Kolb [17].
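For readers less familiar with the Kolb framework, the following minimal sketch (Python; an illustration, not part of the original study) shows how the two difference scores used throughout the tables, perception (AC − CE) and processing (AE − RO), map onto the four learning styles plotted in Figure 1. The cut points are placeholder values; the normative cut-offs actually applied in the study should be taken from Kolb and Kolb [17].

```python
def kolb_style(perception: float, processing: float,
               cut_perception: float = 7.0, cut_processing: float = 6.0) -> str:
    """Classify a Kolb learning style from the two LSI difference scores.

    perception = AC - CE (abstract conceptualization minus concrete experience)
    processing = AE - RO (active experimentation minus reflective observation)
    The cut points are illustrative placeholders, not the published norms.
    """
    abstract = perception >= cut_perception   # abstract vs. concrete perceiver
    active = processing >= cut_processing     # active vs. reflective processor

    if abstract and not active:
        return "Assimilating"    # abstract + reflective
    if abstract and active:
        return "Converging"      # abstract + active
    if not abstract and active:
        return "Accommodating"   # concrete + active
    return "Diverging"           # concrete + reflective


# Example: a student scoring AC - CE = 9 and AE - RO = 2 would fall in the
# assimilating quadrant (abstract perception, reflective processing).
print(kolb_style(9, 2))  # -> Assimilating
```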
Table 1. Descriptive statistics, Pearson correlations, and comparisons of means and frequencies according to learning dimensions and demographic and educational variables.

| Learning Dimension | Age | Female, M (SD) | Male, M (SD) | t Test | Psychology, M (SD) | E.C. Ed., M (SD) | Prim. Ed., M (SD) | ANOVA |
|---|---|---|---|---|---|---|---|---|
| Perception | r = −0.032, p = 0.590 | 4.89 (9.79) | 6.66 (10.50) | t(287) = −1.20, p = 0.232 | 8.77 (11.66) | 3.99 (8.75) | 4.35 (9.42) | F(2,286) = 5.39, p = 0.005 |
| CE | r = 0.066, p = 0.272 | 25.91 (5.64) | 25.66 (6.38) | t(287) = 0.28, p = 0.777 | 25.36 (6.83) | 30.56 (5.98) | 28.72 (5.14) | F(2,286) = 0.62, p = 0.536 |
| AC | r = 0.009, p = 0.877 | 30.79 (5.80) | 32.32 (6.70) | t(287) = −1.71, p = 0.087 | 34.13 (6.40) | 29.56 (5.42) | 30.57 (5.69) | F(2,286) = 12.09, p < 0.001 |
| Processing | r = −0.038, p = 0.522 | 4.97 (9.35) | 4.23 (10.79) | t(287) = 0.52, p = 0.605 | 3.98 (10.74) | 3.75 (9.71) | 5.76 (9.04) | F(2,286) = 1.41, p = 0.244 |
| AE | r = −0.068, p = 0.253 | 34.14 (5.50) | 33.13 (7.30) | t(287) = 1.15, p = 0.249 | 32.25 (6.36) | 34.31 (5.38) | 34.48 (5.84) | F(2,286) = 3.47, p = 0.033 |
| RO | r = −0.005, p = 0.931 | 29.16 (28.89) | 28.89 (5.95) | t(287) = 0.32, p = 0.748 | 28.27 (6.01) | 30.56 (5.98) | 28.72 (5.14) | F(2,286) = 3.68, p = 0.027 |

| Learning Style | Age | Female, n (rz) | Male, n (rz) | χ2 Test (Sex) | Psychology, n (rz) | E.C. Ed., n (rz) | Prim. Ed., n (rz) | χ2 Test (Degree) |
|---|---|---|---|---|---|---|---|---|
| Diverging | F(3,280) = 0.52, p = 0.668 | 74 (0.9) | 14 (−0.9) | χ2(3) = 0.47, p = 0.470, VCramer = 0.09 | 11 (−2.6) | 28 (1.3) | 49 (1) | χ2(6) = 15.77, p = 0.015, VCramer = 0.16 |
| Assimilating | | 53 (−1.3) | 17 (1.3) | | 26 (3.5) | 16 (−0.8) | 28 (−2.2) | |
| Converging | | 35 (−0.6) | 10 (0.6) | | 11 (−2.6) | 9 (−1.1) | 25 (0.6) | |
| Accommodating | | 72 (0.8) | 14 (−0.8) | | 16 (−0.9) | 24 (0.3) | 46 (0.5) | |

CE: concrete experience; AC: abstract conceptualization; AE: active experimentation; RO: reflective observation; E.C. Ed.: early childhood education; Prim. Ed.: primary education; rz: adjusted standardized residual.
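As a computational aside, the degree comparison of learning-style frequencies in Table 1 can be reproduced from the reported counts. The sketch below (Python with NumPy/SciPy; our illustration, not the authors' analysis code) computes the χ2 statistic, Cramér's V, and the adjusted standardized residuals (the rz values), and should recover, up to rounding, the reported χ2(6) = 15.77 and VCramer = 0.16.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Learning style (rows) x degree (columns) counts taken from Table 1.
# Columns: Psychology, E.C. Ed., Prim. Ed.
observed = np.array([
    [11, 28, 49],   # Diverging
    [26, 16, 28],   # Assimilating
    [11,  9, 25],   # Converging
    [16, 24, 46],   # Accommodating
])

chi2, p, dof, expected = chi2_contingency(observed)
n = observed.sum()

# Cramer's V: effect size for an r x c contingency table.
cramers_v = np.sqrt(chi2 / (n * (min(observed.shape) - 1)))

# Adjusted standardized residuals (the rz values reported in the table).
row_tot = observed.sum(axis=1, keepdims=True)
col_tot = observed.sum(axis=0, keepdims=True)
adj_resid = (observed - expected) / np.sqrt(
    expected * (1 - row_tot / n) * (1 - col_tot / n)
)

print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}, Cramer's V = {cramers_v:.2f}")
print(np.round(adj_resid, 1))
```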
Table 2. Descriptive statistics, comparisons of means and frequencies of learning dimensions and learning styles according to assessment method and level of consistency.

| Assessment Method and Performance | Perception (AC − CE), M (SD) | ANOVA (Perception) | Processing (AE − RO), M (SD) | ANOVA (Processing) | Diverging, n (rz) | Assimilating, n (rz) | Converging, n (rz) | Accommodating, n (rz) | χ2 Test |
|---|---|---|---|---|---|---|---|---|---|
| MCQ: Low | 2.09 (11.02) | F(2,134) = 4.96, p = 0.008 | 4.30 (11.09) | F(2,134) = 0.12, p = 0.885 | 18 (2) | 8 (−2.2) | 4 (−1.2) | 16 (1.2) | χ2(6) = 8.85, p = 0.182, VCramer = 0.18 |
| MCQ: Medium | 7.84 (10.35) | | 3.89 (9.50) | | 9 (−1.5) | 16 (1.1) | 8 (0.9) | 12 (−0.3) | |
| MCQ: High | 7.80 (8.69) | | 3.28 (9.18) | | 12 (−0.4) | 16 (1) | 7 (0.3) | 11 (−0.8) | |
| SQ: Low | 2.42 (8.74) | F(2,165) = 1.63, p = 0.199 | 4.50 (9.30) | F(2,165) = 1.07, p = 0.344 | 22 (1) | 10 (−0.5) | 4 (−1.1) | 16 (0.2) | χ2(6) = 6.24, p = 0.397, VCramer = 0.13 |
| SQ: Medium | 4.02 (8.97) | | 2.80 (9.49) | | 22 (0.7) | 14 (1) | 5 (−0.7) | 13 (−1.1) | |
| SQ: High | 5.31 (7.83) | | 5.37 (9.76) | | 18 (−1.6) | 12 (−0.5) | 11 (1.8) | 21 (0.9) | |
| Activity: Low | 4.62 (9.42) | F(2,176) = 0.39, p = 0.678 | 4.68 (10.90) | F(2,176) = 0.43, p = 0.650 | 19 (0) | 13 (−1.4) | 10 (1.1) | 21 (0.6) | χ2(6) = 8.70, p = 0.191, VCramer = 0.16 |
| Activity: Medium | 5.10 (11.32) | | 3 (10.46) | | 18 (0) | 20 (1.4) | 2 (−2.6) | 20 (0.5) | |
| Activity: High | 6.27 (10.36) | | 4.16 (9.13) | | 17 (0) | 15 (0) | 10 (1.5) | 14 (−1.1) | |
| EQRTP: Low | 5.27 (9.26) | F(2,229) = 2.54, p = 0.081 | 3.19 (10.11) | F(2,229) = 0.80, p = 0.451 | 26 (0.7) | 22 (0.6) | 4 (−2.5) | 23 (0.5) | χ2(6) = 14.19, p = 0.028, VCramer = 0.18 |
| EQRTP: Medium | 3.70 (10.21) | | 4.18 (9.32) | | 33 (2) | 18 (−1.3) | 12 (0.4) | 20 (−1.1) | |
| EQRTP: High | 7.18 (9.35) | | 5.23 (10.22) | | 14 (−2.8) | 22 (0.7) | 15 (2.1) | 23 (0.6) | |
| Consistency: Medium-High | 7.18 (9.75) | F(2,216) = 3.60, p = 0.029 | 4.79 (8.90) | F(2,216) = 0.12, p = 0.887 | 20 (−1.4) | 20 (0) | 17 (2.6) | 21 (−0.6) | χ2(6) = 10.13, p = 0.119, VCramer = 0.15 |
| Consistency: Medium-Low | 2.96 (10.32) | | 4.34 (9.70) | | 29 (1.7) | 16 (−1) | 5 (−2.1) | 24 (0.7) | |
| Consistency: Inconsistent | 5.28 (8.90) | | 4 (10.87) | | 20 (−0.4) | 20 (1) | 8 (−0.5) | 19 (−0.2) | |

CE: concrete experience; AC: abstract conceptualization; AE: active experimentation; RO: reflective observation; MCQ: multiple-choice question; SQ: short question; EQRTP: elaboration question on the relationship between theory and practice; rz: adjusted standardized residual.
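The ANOVA columns in Table 2 test whether the mean perception and processing scores differ across the three performance levels within each assessment method. A minimal sketch of such a one-way ANOVA is shown below (Python with SciPy); the data are simulated to echo the MCQ group means and SDs, since the raw scores are not reproduced here, so the group sizes and resulting statistics are hypothetical.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Simulated perception (AC - CE) scores for the three MCQ performance groups;
# means and SDs echo Table 2, but the values and group sizes are made up.
low    = rng.normal(loc=2.09, scale=11.02, size=45)
medium = rng.normal(loc=7.84, scale=10.35, size=46)
high   = rng.normal(loc=7.80, scale=8.69,  size=46)

f_stat, p_value = f_oneway(low, medium, high)
df_within = len(low) + len(medium) + len(high) - 3  # N minus number of groups
print(f"F(2,{df_within}) = {f_stat:.2f}, p = {p_value:.3f}")
```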
Table 3. Groups with consistent medium-high and medium-low performance: binomial logistic regression model and classification results.

| | Nagelkerke’s R2 | B | Wald χ2 | OR |
|---|---|---|---|---|
| Model | 0.061 * | | | |
| Perception | | −0.04 | 6.49 ** | 0.957 |
| Processing | | −0.01 | 0.39 | 0.989 |

| Observed | Predicted: Medium-High | Predicted: Medium-Low | % of Correct Classifications |
|---|---|---|---|
| Medium-High | 51 | 27 | 65.4% |
| Medium-Low | 35 | 39 | 52.7% |
| Overall | | | 59.2% |

* p < 0.05. ** p < 0.01.
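Table 3 corresponds to a binomial logistic regression in which the perception and processing scores predict membership in one of the two consistent-performance groups, with the odds ratio obtained as OR = exp(B). The sketch below (Python with statsmodels) illustrates the procedure on simulated data; the outcome coding, coefficients, and variable names are hypothetical and are not taken from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 152  # matches the size of the classification table (78 + 74), but data are simulated

# Simulated Kolb difference scores used as predictors.
data = pd.DataFrame({
    "perception": rng.normal(5, 10, n),   # AC - CE
    "processing": rng.normal(4, 10, n),   # AE - RO
})

# Simulated outcome (1 = consistent medium-high, 0 = consistent medium-low),
# generated with arbitrary coefficients purely for illustration.
linpred = -0.3 + 0.05 * data["perception"] + 0.01 * data["processing"]
data["medium_high"] = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

X = sm.add_constant(data[["perception", "processing"]])
model = sm.Logit(data["medium_high"], X).fit(disp=False)

print(model.summary())                        # the z values squared give Wald chi2 statistics
print("Odds ratios:\n", np.exp(model.params)) # OR = exp(B), as reported in Table 3

# Classification table with a 0.50 cut-off, analogous to the lower half of Table 3.
predicted = (model.predict(X) >= 0.5).astype(int)
print(pd.crosstab(data["medium_high"], predicted,
                  rownames=["observed"], colnames=["predicted"]))
```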
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
