
Light phenomena conceptual assessment: an inventory tool for teachers

Kizito Ndihokubwayo, Jean Uwamahoro, Irénée Ndayambaje and Michael Ralph

Published 13 February 2020 © 2020 IOP Publishing Ltd
Citation: Kizito Ndihokubwayo et al 2020 Phys. Educ. 55 035009. DOI: 10.1088/1361-6552/ab6f20


Abstract

Light exhibits some of the most interesting phenomena among physics concepts. We designed the light phenomena conceptual assessment (LPCA) to help teachers measure their students' conceptual understanding of light phenomena. We expected to measure increases in student understanding of light phenomena after learning about the wave and particle nature of light in Rwandan secondary schools. We analyzed the results of 244 physics students using descriptive and inferential statistics. The data revealed a low understanding of light phenomena, which we connect to the instructional tools and strategies used by teachers. Students confused reflection and refraction of light. They also struggled to understand total internal reflection and light scattering. Therefore, teachers should teach optics by allowing students to observe the related phenomena, in order to more effectively promote students' conceptual understanding of light phenomena.


1. Introduction

In daily teaching activities, teachers are occupied with deciding how to deliver a lesson such that students fully understand what has been taught. Teachers may use various teaching tools, such as lab equipment or computer simulations, and teaching strategies, such as group discussion of a physics topic or having students draw a phenomenon. However, student misconceptions persist even after the initial teaching and learning process. Optics is one area of physics where these persistent misconceptions and gaps in understanding have been identified in past research. Students may know about theories of light but still fail to interpret the interference that results from the wave nature of light [1]. A 'misconception' is a misunderstanding of facts [2]. Misconceptions are difficult to correct once they are fixed within a student's schema.

Misconceptions arise when one's existing understanding and experience conflict with a new experience. One thing that makes misconceptions durable is superficial knowledge. For instance, when a teacher's instruction leaves an important aspect of a concept unexplained, students must confront the gap in their understanding when they meet a new situation in which they should apply that concept. However, unawareness of a concept is different from a misconception. If someone does not know what refraction of light is, they can come to know it by conducting laboratory observations that distinguish refraction from light scattering. By contrast, a misconception, such as the belief that the sky is blue because the ocean is reflected in it, is caused by an incomplete conceptual understanding of light scattering. Therefore, teachers need to help learners understand physics concepts deeply and prevent misconceptions from taking hold. A study by Kaewkhong, Mazzolini, and Emarat [3] showed that students experience difficulties related to virtual images even after completing a number of optics courses. Subsequently, Gurel, Eryilmaz, and McDermott [4] found that student misconceptions associated with investigations of plane mirrors, spherical mirrors, and lenses were connected to the mode of instruction used by teachers. In a study by Özcan [5], students were not able to draw a good model of the wave and particle nature of light.

Despite having identified these areas of difficulty associated with light phenomena, few academic studies have investigated the process of learning about them. There is also a paucity of inventories broadly focused on light phenomena. The Mechanical Waves Conceptual Survey 2 (MWCS2) of Barniol and Zavala [6] focused specifically on waves and optics content knowledge in propagation, superposition, reflection, and standing waves. The four-tier geometrical optics test (FTGOT) of Kaltakci-Gurel, Eryilmaz, and McDermott [7] focused on waves and optics content knowledge in plane mirrors, spherical mirrors, and lenses. The wave diagnostic test (WDT) of Wittmann [8] focused on waves and optics content knowledge of waves generally. The light and spectroscopy concept inventory (LSCI) of Bardar, Prather, Brecher, and Slater [9] focused on astronomy content knowledge in spectroscopy, light, and waves. The light and optics conceptual evaluation (LOCE) of Lakhdar et al [10] focused on conceptual understanding of optics. In contrast to all of these works, we developed an inventory for teachers seeking to explore light phenomena with their secondary students as they build conceptual understanding. Specifically, our study reveals how Rwandan secondary school physics students understand light phenomena while studying the wave and particle nature of light. The present study is built on didactic transposition theory [11, 12]. The theory connects three elements of the teaching and learning process: the teacher, the students, and the content. In other words, we should know who teaches what to whom. The theory places its emphasis on the value of the knowledge taught: learning should respond to the needs of society.

2. Research design and validation of instruments

From the literature, Rwandan competence-based curriculum (CBC) textbooks, daily experiences, and students' group discussions, we formulated 44 candidate questions related to light phenomena for validation. A sample of 65 students took the test for face validity. We asked them to complete the assessment, report their impression of the difficulty of each question, and justify their judgement. Based on reported difficulty levels and points of confusion, we removed 14 items. The remaining 30 questions represent the final light phenomena conceptual assessment (LPCA). The students' predictions of which items were most difficult aligned with each question's difficulty index. The 30 items were content validated by four expert university lecturers. Prior to implementing the LPCA, we piloted it with 25 secondary school physics students for reliability using a test–retest method, administering the same test twice within a window of four to six weeks. We found a medium level of reliability, with a coefficient alpha of 0.55. The final 30-item instrument is a multiple-choice test with four answer choices, including one correct response and three distractors (see the whole instrument in the supplementary material, available online at stacks.iop.org/PED/55/035009/mmedia).
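
To illustrate how such a test–retest check can be computed, here is a minimal Python sketch. It correlates the total scores from the two administrations of the pilot; the file names are hypothetical, and since the paper labels the statistic a 'coefficient alpha', the Pearson correlation below is an illustrative stand-in rather than the authors' exact estimator.

    # Test-retest reliability sketch for the 25-student pilot.
    # File names are hypothetical placeholders; each file holds one total
    # score per student, in the same student order.
    import numpy as np
    from scipy import stats

    first = np.loadtxt("pilot_administration1.csv")
    second = np.loadtxt("pilot_administration2.csv")  # 4-6 weeks later

    # Pearson correlation between the two administrations is the usual
    # test-retest reliability coefficient.
    r, p = stats.pearsonr(first, second)
    print(f"test-retest reliability r = {r:.2f} (p = {p:.3f})")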

During the implementation of this test, we sampled eight schools from rural and urban settings in Rwanda, selecting two boarding schools and two day schools from each setting. Students live at boarding schools, whereas day-school students sleep at home and come to study during the daytime. We did not apply any instructional interventions, and each instructor taught as they usually do. We administered the LPCA before each class began the 'wave and particle nature of light' unit and again after it was taught. The duration of the test was 40 min for all students who sat for it. We administered the test to a total of 283 senior 5 (grade 11) students at the pre-test stage and 278 at the post-test, and we identified 244 students who participated in both the pre- and post-test. For these 244 students, we calculated the sample mean and standard deviation, minimum and maximum scores, and the frequency of each individual item response. We then calculated a t-test, the learning gain, and Cohen's D effect size between the pre- and post-test.
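
A minimal sketch of this analysis pipeline in Python is shown below. The file names are hypothetical, and the choice of a paired t-test and a pooled-standard-deviation form of Cohen's D are our assumptions; the paper does not publish the authors' actual analysis code.

    # Matched pre/post analysis sketch for the 244 students.
    import numpy as np
    from scipy import stats

    pre = np.loadtxt("pretest_scores.csv")    # percentage scores, one per student
    post = np.loadtxt("posttest_scores.csv")  # same students, same order

    # Descriptive statistics: mean, SD, minimum, and maximum
    for name, x in [("pre", pre), ("post", post)]:
        print(name, x.mean(), x.std(ddof=1), x.min(), x.max())

    # Paired t-test, since each student took both tests
    t, p = stats.ttest_rel(post, pre)

    # Raw learning gain: mean post-test score minus mean pre-test score
    gain = post.mean() - pre.mean()

    # Cohen's D using a pooled standard deviation (one common convention)
    s_pooled = np.sqrt((pre.std(ddof=1) ** 2 + post.std(ddof=1) ** 2) / 2)
    d = gain / s_pooled
    print(f"t = {t:.2f}, p = {p:.4f}, gain = {gain:.2f}%, D = {d:.2f}")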

3. Findings

The mean pre-test score was 37.00% with a standard deviation of 10.65. The mean post-test score was 40.95% with a standard deviation of 11.17. The minimum scores for the pre-test and post-test were 10.00% and 13.33% respectively, while the maximum score for both was 76.67%. So, although the learning gain of 3.95% was statistically significant (p < 0.05) after teaching, the effect of this difference was small (D = 0.29). This conclusion is based on the guideline of Cohen [13] that an effect size smaller than 0.3 is considered small, while it must be higher than 0.8 to be interpreted as a large effect size.
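
For reference, a common formulation of Cohen's D compares the mean difference to the pooled standard deviation. The paper does not state which variant was used, so this is an illustrative convention rather than the authors' exact computation:

    $$D = \frac{\bar{x}_{\mathrm{post}} - \bar{x}_{\mathrm{pre}}}{s_{\mathrm{pooled}}}, \qquad s_{\mathrm{pooled}} = \sqrt{\frac{s_{\mathrm{pre}}^{2} + s_{\mathrm{post}}^{2}}{2}}$$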

A good assessment should tell us what students know; that is, it should reflect students' varied levels of understanding. It should also be fair: students with a better understanding should score higher. Therefore, we calculated the difficulty and discrimination indexes. Since the test measures a single construct, every item should discriminate positively. Among the 30 items, we found one question with a negative discrimination index (item 28, −0.09) in the pre-test. On the post-test, two items had negative discrimination indexes: item 18 at −0.05 and item 28 at −0.01. The average difficulty index was 0.37 in the pre-test and 0.4095 in the post-test (figure 1). The average discrimination index for the entire assessment was 0.26 in the pre-test and 0.27 in the post-test.
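
As an illustration of how these item statistics can be computed, here is a hedged Python sketch: difficulty as the proportion of students answering an item correctly, and discrimination as the difference in that proportion between high- and low-scoring groups. The 27% upper/lower split is a common convention we assume here; the paper does not state which formula the authors used.

    # Classical item analysis sketch. `responses` is a (students x items)
    # matrix of 0/1 correctness values.
    import numpy as np

    def item_statistics(responses: np.ndarray, frac: float = 0.27):
        n_students, _ = responses.shape
        difficulty = responses.mean(axis=0)  # proportion correct per item

        totals = responses.sum(axis=1)             # each student's total score
        order = np.argsort(totals)                 # students ranked low to high
        k = max(1, int(round(frac * n_students)))  # size of each extreme group
        lower, upper = order[:k], order[-k:]

        # Discrimination: top-group minus bottom-group proportion correct.
        # A negative value means weaker students outperformed stronger ones.
        discrimination = responses[upper].mean(axis=0) - responses[lower].mean(axis=0)
        return difficulty, discrimination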


Figure 1. Overall performance of students across all questions at both pre- and post-test.


The students performed differently on individual questions, but overall there is no large difference between pre-test and post-test. Generally, questions 5 and 24 were the easiest, while questions 18 and 22 were the most difficult for students (see figure 1). However, there is a difference in the range of scores between the pre- and post-test: the students improved at post-test on the easiest questions but scored lower at post-test than at pre-test on the most difficult ones. On question 5, 76.33% and 80.94% of students answered correctly, and on question 24, 69.26% and 79.14% answered correctly, at pre- and post-test respectively. Thus, the number of students answering questions 5 and 24 correctly increased from pre- to post-test. This was not the case for questions 18 and 22, where the number answering correctly decreased: question 18 went from 8.13% to 7.19%, and question 22 from 17.67% to 15.11%. Using this analysis, we identified all the questions that exhibit this pattern and found that on 10 of the 30 questions students scored lower at post-test than at pre-test. We further discuss these ten questions, and the related potential misconceptions, in the discussion section below.

In figure 2, one can see that 15.45% of students scored higher than 50% at pre-test, while 22.36% did so at post-test. However, the frequencies of the individual answer choices are close to one another for many questions (see figure 3). From figure 3, we can see that the correct answer was the most frequently chosen option on only 13 of the 30 questions at pre-test and 18 at post-test. Labeled questions show where the correct answers outperformed their counterpart distractors.


Figure 2. Histogram of student scores at pre- and post-test.


Figure 3. (a) Answer choices at pre-test. (b) Answer choices at post-test.


The letter under each item number shows the correct choice among A, B, C, and D. The letter 'K' identifies the percentage of students who failed to give a viable answer to the question, meaning the student either did not select any of the four letters or selected more than one.

The findings show that on 10 of the 30 questions the students performed better at pre-test than at post-test (figure 1). These questions are 3, 6, 7, 13, 16, 18, 19, 22, 23, and 30. In figures 3(a) and (b), these questions show similar response patterns. Of these ten questions, five (7, 13, 18, 19, and 22) have something in common: fewer students chose the correct answer at post-test, and the distractors dominated, meaning most students selected a wrong letter at both pre- and post-test. In the remaining questions (3, 6, 16, 23, and 30), although the number of students choosing the correct answer fell, the majority still chose the correct answer at both pre- and post-test. We also found questions with an increase in student performance from pre- to post-test but low overall item scores at both: questions 17, 25, and 28 (see figures 1 and 3(a), (b)).

4. Discussion and potential misconceptions

The fact that many students could answer questions 5 and 24 correctly may stem from daily experience and from other related science subjects. Question 5 (see supplementary data) was about measuring the time of day using a human shadow. In Rwanda, students, and even local people, can predict the time using the position of the sun in the sky: at midday the sun is directly overhead, so our shadow is beneath our feet and it must be about 12:00 o'clock. Question 24, on the other hand, was about the ozone layer in the atmosphere. This topic is discussed not only in physics but also in biology, chemistry, and geography, where students learn about the greenhouse effect and how the ozone layer protects us from harmful ultraviolet radiation from the sun. Thus, students are probably familiar with the position of the sun and the importance of the ozone layer.

The fact that many students could not correctly answer questions 18 and 22 is probably related to deficiencies in teaching. Question 18 asked, 'Why do sunsets over the sea seem to be orange?' Students are not aware that it is due to the salt particles in the air. Question 22 asked, 'Why do we see a blue rather than a violet sky, although violet has a shorter wavelength than blue?' Students are not aware that it is because the intensity is not the same for all wavelengths in sunlight and the atmosphere absorbs more violet light. These topics are covered in the 'wave and particle nature of light' unit, as indicated in the Rwanda physics CBC syllabus [14]. However, teachers do not spend time explaining these concepts using observation. Teachers may also struggle to teach these concepts for lack of instructional materials that would let students visualize them well. Another reason relates to the students' experience: these topics are new at their level and difficult to understand. Their low performance on these questions is probably related to confusion after studying, which would also explain the limited improvement after learning (at the post-test).

Let us now discuss questions 7, 13, 17, 19, 25, and 28. Question 7 asked, 'Why can some objects, like a block of wood or paper, not reflect your image?' Instead of answering that these objects are organic, not luminous, and not smooth, 36.40% of students at pre-test and 32.37% at post-test said, 'These objects are inorganic, luminous and not smooth'. Only 34.98% at pre-test and 26.26% at post-test answered correctly (letter A). Question 13 asked, 'Why is a rainbow always circular?' Instead of answering that the droplets we can see are only those at the critical angle that gives total internal reflection of the sun's rays falling on them, 38.87% of students at pre-test and 35.61% at post-test said, 'It is because of the critical angle that gives the total internal refraction of the sun rays falling on it'. Only 33.22% of students at pre-test and 29.86% at post-test answered correctly (letter C). Questions 7 and 13 are related to geometric optics. The students who took this test are in senior five, the final academic year in which optics is discussed: they study geometric optics from senior one to senior four and physical optics in senior five. However, their mastery of the content is still incomplete. On question 7, students had difficulty knowing what a smooth surface is and what role it plays in reflecting light. On question 13, students did not understand the concept of total internal reflection, and most of them confused reflection and refraction. These difficulties may arise from student attempts to revise their understanding, or from the way they learn the concept initially. For instance, if teachers do not provide enough hands-on activities with total internal reflection, students will continue to confuse the concepts of reflection and refraction.

Question 17 asked, 'Why is sunlight visible a few minutes before the sun rises above the horizon and after it goes below the horizon at sunset?' Instead of answering that the air is denser near the Earth's surface and becomes thinner as we move away from the Earth, 42.76% of students at pre-test and 45.68% at post-test said, 'It is because the refractive index of the earth's atmosphere decreases as we move closer to its surface'. Only 25.44% at pre-test and 25.90% at post-test answered correctly (letter A). Question 19 asked, 'What makes a beam of sunlight coming through a window, or a beam of light from car headlights, visible?' Instead of answering that, in a transparent medium like air, there are particles of smoke, dust, and water that scatter the light, 38.87% of students at pre-test and 38.13% at post-test said, 'It is the transparent medium like air; the particles of smoke, dust, and water in the air gather the light'. Only 37.46% at pre-test and 36.69% at post-test answered correctly (letter A). Questions 17 and 19 relate to daily-life examples: natural phenomena that should be explored by the teacher during a class activity. It is clear that students are not able to transfer the concept of refraction from the blackboard to the world outside the classroom (question 17). Similarly, students do not understand the concept of light scattering (question 19). This is probably caused by their teachers not using typical examples in the case of refraction and not using observation in the case of the scattering phenomenon. Teachers do not give enough examples, and they attempt to explain phenomena by lecture without observation.

Question 25 asked about other uses for ultraviolet radiation, apart from sterilizing equipment in the medical industry and killing bacteria in water purification. Instead of answering that it is used by dentists to check for teeth defects and by banks to check for fake currency notes, 38.87% of students at pre-test and 37.41% at post-test said, 'It is used to kill cancer cells, and in space observatories'. Only 20.49% at pre-test and 20.86% at post-test answered correctly (letter D). Question 28 asked why we sometimes observe a circle of color around the sun in the sky. Instead of answering that it is due to refraction from ice crystals, 38.16% of students at pre-test and 43.17% at post-test said, 'It is due to the refraction from clouds'. Only 14.49% at pre-test and 16.19% at post-test answered correctly (letter D). Questions 25 and 28 both concern physical phenomena associated with light. Question 25 is an application of the light spectrum; students do not master the applications of the spectrum, by energy or wavelength, because of limited instructional time and insufficient explanation. Question 28 reflects what happens in the sky, but students struggle to understand the process by which ice crystals produce a circle of colors. This is also related to the limited effort made by teachers to take students into the environment to explore and observe nature. Teachers should ask them to collect data and interpret what they have observed.

The CBC emphasizes the practicability of what students learn: it is more beneficial to use knowledge than merely to know it. In fact, the students show misconceptions about light phenomena and are not able to interpret theories of optics in order to understand the facts behind these phenomena. This is where the theory of didactic transposition can be applied. Teachers should teach the most needed knowledge and ensure that it is well conceived and can be interpreted scientifically by students. Content knowledge should be carefully selected by teachers, and teachers should be equally careful to select an appropriate teaching method in order to address student misconceptions.

5. Summary, usability of the inventory, and further research foci

In this study, we set out to assess the level of understanding of light phenomena among secondary school physics students, and we developed an inventory tool for teachers. Our study used a pre-post-test design before and after teaching the 'wave and particle nature of light' unit of the CBC in Rwandan secondary schools. Our sample included both rural and urban schools, as well as day and boarding schools. There was no intervention apart from the normal teaching of each participating instructor. Although we found a statistically significant difference after teaching, students have a low understanding of light phenomena and the effect size was small (D = 0.29). Therefore, we concluded that teachers are using insufficient teaching tools and strategies to help students grasp and understand light phenomena. Some potential student misconceptions were also identified in this study, including difficulty distinguishing reflection from refraction of light, understanding topics like total internal reflection and light scattering, and applying examples of radiation to daily life. The authors recommend that teachers use phenomena-based instruction, as this attracts students' attention and develops conceptual understanding. Teachers should find a related concept, present it in a visual way, and let students explain and interpret the observed phenomena. Observations may be drawn from various sources, such as YouTube videos [15], computer simulations [16], and pictures or drawings. The LPCA serves as a useful tool for physics teachers to assess students' conceptual understanding of light phenomena before and after teaching physical optics-related concepts, and it can be administered in 40 min. Future studies should investigate the conceptual understanding of light phenomena across all educational levels, from primary school to university.

Acknowledgments

This research has been fully funded by the African Centre of Excellence for Innovative Teaching and Learning Mathematics and Science (ACEITLMS). The authors wish to thank Prof Scott Franklin, Rochester Institute of Technology, NY, US, and Prof Eleanor Sayre, Kansas State University, KS, US, for their contributions to validating the instrument used in this study. The authors also gratefully acknowledge all schools and teachers who allowed us to collect data with their students.


Biographies

Ndihokubwayo Kizito

Ndihokubwayo Kizito is currently a Ph.D. student in physics education at the African Centre of Excellence for Innovative Teaching and Learning Mathematics and Science (ACEITLMS), University of Rwanda College of Education (URCE). He is a consultant in educational projects and a researcher in the science education field.

Dr. Jean Uwamahoro

Dr. Jean Uwamahoro has a Ph.D. in physics, specializing in Space Physics. He currently holds a teaching position as an Associate Professor of physics at the University of Rwanda, College of Education. He is also the Deputy Director of the African Centre of Excellence for Innovative Teaching and Learning Mathematics and Science (ACEITLMS).

Dr. Ndayambaje Irénée

Dr. Ndayambaje Irénée has been the Director-General of the Rwanda Education Board (REB) since February 14, 2018. Prior to the Cabinet appointment to this position, he was a Lecturer at the University of Rwanda-College of Education (UR-CE). He holds a Ph.D. in Educational Planning from Kenyatta University, Kenya. He has published a number of research papers in peer-reviewed journals and participated in a number of textbook and reference book writing projects.

Michael Ralph

Michael Ralph is a master teacher at the Center for STEM Learning (CSTEM) in the College of Liberal Arts & Sciences at the University of Kansas (KU) in Lawrence, KS. CSTEM operates to improve STEM literacy for all people, primarily through the operation of the UKanTeach teacher certification program for STEM undergraduates. He teaches Research Methods for STEM majors and supports other pedagogy-focused courses as a secondary instructor.
