Analyzing “real-world” anomalous data after experimentation with a virtual laboratory

  • Development Article
  • Educational Technology Research and Development

Abstract

Developing effective pedagogies to help students examine anomalous data is critical for the education of the next generation of scientists and engineers. By definition, anomalous data do not concur with prior knowledge, theories, and expectations. Such data are a common outcome of empirical investigation in hands-on laboratories (HOLs). These aberrations can be counterintuitive for students when they investigate real-world processes with technology tools, such as virtual laboratories (VRLs), that may project a simplified view of data. This study blended learning with a VRL and the examination of real-world data in two engineering classrooms. The results indicated that students developed new knowledge with the VRL and were able to apply this knowledge to evaluate anomalous data from an HOL. However, students continued to experience difficulties in transferring their newly constructed knowledge to reason about how anomalous results may have come about. The results provide directions for continued research on students’ perceptions of anomalous data and suggest the need for evidence-based instructional design decisions on the use of existing VRLs to prepare science and engineering students for real-world investigations.

References

  • Ainsworth, S. (2008). How do animations influence learning? In D. Robinson & G. Schraw (Eds.), Current innovations in educational technology that facilitate student learning (pp. 37–67). Charlotte, NC: Information Age Publishing.

  • Allchin, D. (2001). Error types. Perspectives on Science, 9(1), 37–58.

  • Akpan, J., & Strayer, J. (2010). Which comes first: The use of computer simulation of frog dissection or conventional dissection as academic exercise? Journal of Computers in Mathematics and Science Teaching, 29(2), 113–138.

  • Balamuralithara, B., & Woods, P. C. (2009). Virtual laboratories in engineering education: The simulation lab and remote lab. Computer Applications in Engineering Education, 17(1), 108–118.

  • Caprette, D. (1996). SDS-PAGE Hall of Shame. Rice University. Available at http://www.ruf.rice.edu/~bioslabs/studies/sds-page/sdsgoofs.html. Retrieved April 29, 2015.

  • Carey, S., & Smith, C. (1993). On understanding the nature of scientific knowledge. Educational Psychologist, 28(3), 235–251.

  • Chandola, V., Banerjee, A., & Kumar, V. (2009). Anomaly detection: A survey. ACM Computing Surveys, 41(3), 15.1–15.58.

  • Chen, S. (2010). The view of scientific inquiry conveyed by simulation-based virtual laboratories. Computers and Education, 55(3), 1123–1130.

  • Chi, M. T. H. (1997). Quantifying qualitative analyses of verbal data: A practical guide. The Journal of the Learning Sciences, 6(3), 271–315.

  • Chinn, C. A., & Brewer, W. F. (1993). The role of anomalous data in knowledge acquisition: A theoretical framework and implications for science instruction. Review of Educational Research, 63, 1–49.

  • Chinn, C. A., & Brewer, W. F. (1998). An empirical test of a taxonomy of responses to anomalous data in science. Journal of Research in Science Teaching, 35(6), 623–654.

  • Chinn, C. A., & Brewer, W. F. (2001). Models of data: A theory of how people evaluate data. Cognition and Instruction, 19(3), 323–393.

  • Clark Plano, V. L., & Creswell, J. W. (2008). Mixed methods reader. Thousand Oaks: Sage.

  • Cobb, S., Heaney, R., Corcoran, O., & Henderson-Begg, S. (2009). The learning gains and student perceptions of a second life virtual lab. Bioscience Education. doi:10.3108/beej.13.5.

  • Corter, J. E., Esche, S. K., Chassapis, C., Ma, J., & Nickerson, J. V. (2011). Process and learning outcomes from remotely-operated, simulated, and hands-on student laboratories. Computers and Education, 57(3), 2054–2067.

  • de Jong, T., Linn, M. C., & Zacharia, Z. C. (2013). Physical and virtual laboratories in science and engineering education. Science, 340(6130), 305–308.

  • De Jong, T., & Van Joolingen, W. R. (1998). Scientific discovery learning with computer simulations of conceptual domains. Review of Educational Research, 68(2), 179–201.

  • Dede, C., Brown-L’Bahy, T., Ketelhut, D., & Whitehouse, P. (2004). Distance learning (virtual learning). The Internet Encyclopedia. doi:10.1002/047148296X.tie047.

  • Dede, C., Salzman, M., Loftin, R. B., & Ash, K. (1997). Using virtual reality technology to convey abstract scientific concepts. Learning the sciences of the 21st century: Research, design, and implementing advanced technology learning environments. Hillsdale, NJ: Lawrence Erlbaum.

  • Dekker, S. (2006). The field guide to understanding human error. Burlington: Ashgate Publishing.

  • Denzin, N. K., & Lincoln, Y. S. (2005). Handbook of qualitative research (2nd ed.). Thousand Oaks, CA: Sage.

  • Dickey, M. D. (2011). The pragmatics of virtual worlds for K-12 educators: Investigating the affordances and constraints of Active Worlds and Second Life with K-12 in-service teachers. Educational Technology Research and Development, 59(1), 1–20.

  • Dori, Y. J., & Barak, M. (2003). A web-based chemistry course as a means to foster freshmen learning. Journal of Chemical Education, 80(9), 1084–1092.

  • Dörner, D. (1996). The logic of failure: Recognizing and avoiding error in complex situations. New York: Metropolitan Books.

  • Drenth, P. J. D. (2006). Responsible conduct in research. Science and Engineering Ethics, 12(1), 13–21.

  • Dreyfus, A., Jungwirth, E., & Eliovitch, R. (1990). Applying the “cognitive conflict” strategy for conceptual change. Some implications, difficulties and problems. Science Education, 74, 555–569.

  • Driver, R. (1989). Students’ conceptions and the learning of science. International Journal of Science Education, 11(5), 481–490.

  • Duschl, R. (2008). Science education in three-part harmony: Balancing conceptual, epistemic, and social learning goals. Review of Research in Education, 32(1), 268–291.

  • Field, A. (2005). Discovering statistics: Using SPSS (2nd ed.). Singapore: Sage.

  • Genetic Science Learning Center (GSLC). (2013). Virtual labs. Available at http://learn.genetics.utah.edu/. Retrieved April 29, 2015.

  • Gong, M., Zhang, J., Ma, J., & Jiao, L. (2012). An efficient negative selection algorithm with further training for anomaly detection. Knowledge-Based Systems, 30, 185–191.

  • Guba, E. G., & Lincoln, Y. S. (1994). Competing paradigms in qualitative research. Handbook of Qualitative Research, 2, 163–194.

  • Han, J., Kamber, M., & Pei, J. (2006). Data mining: Concepts and techniques. San Francisco, CA: Elsevier.

  • Howard Hughes Medical Institute (HHMI). (n.d.). BioInteractive: Free resources for science teachers and students. Available at http://www.hhmi.org/biointeractive. Retrieved April 29, 2015.

  • Jackson, S. L., Stratford, S. J., Krajcik, J. S., & Soloway, E. (1996). Model-It: A case study of learner-centered software design for supporting model building. ERIC Clearinghouse.

  • Jonassen, D. H. (2006). Modeling with technology: Mindtools for conceptual change. Upper Saddle River, NJ: Pearson.

  • Kali, Y. (2008). The design principles database as means for promoting design-based research. In A. E. Kelly, et al. (Eds.), Handbook of design research methods in education: Innovations in science, technology, engineering, and mathematics learning and teaching (pp. 423–438). New York: Routledge.

  • Kirschner, P., & Wopereis, I. G. (2003). Mindtools for teacher communities: A European perspective. Technology, Pedagogy and Education, 12(1), 105–124.

  • Kozma, R. (2003). The material features of multiple representations and their cognitive and social affordances for science understanding. Learning and Instruction, 13(2), 205–226.

  • Leech, N., Barrett, K., & Morgan, G. (2008). SPSS for intermediate statistics: Use and interpretation (3rd ed.). New York: Psychology Press.

  • Limón, M. (2001). On the cognitive conflict as an instructional strategy for conceptual change: a critical appraisal. Learning and Instruction, 11(4), 357–380.

  • Limon, M., & Carretero, M. (1997). Conceptual change and anomalous data: A case study in the domain of natural sciences. European Journal of Psychology of Education, 13(2), 213–230.

  • Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry (Vol. 75). Sage.

  • Ma, J., & Nickerson, J. V. (2006). Hands-on, simulated, and remote laboratories: A comparative literature review. ACM Computing Surveys, 38(3), 7. doi:10.1145/1132960.1132961.

  • Magin, D., & Kanapathipillai, S. (2000). Engineering students’ understanding of the role of experimentation. European Journal of Engineering Education, 25(4), 351–358.

  • Martinson, B. C., Anderson, M. S., & De Vries, R. (2005). Scientists behaving badly. Nature, 435(7043), 737–738.

  • Masnick, A. M., & Klahr, D. (2003). Error matters: An initial exploration of elementary school children’s understanding of experimental error. Journal of Cognition and Development, 4(1), 67–98.

  • Mason, L. (2001). Responses to anomalous data on controversial topics and theory change. Learning and Instruction, 11, 453–483.

  • Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). London: Sage.

  • Miles, M. B., Huberman, A. M., & Saldaña, J. (2013). Qualitative data analysis: A methods sourcebook. Thousand Oaks, CA: Sage.

  • MyDNA. (2003). Discovery module 2: Sorting DNA molecules. Available at http://mydna.biochem.umass.edu/modules/sort.html. Retrieved April 29, 2015.

  • Nakhleh, M. B., & Krajcik, J. S. (1993). A protocol analysis of the influence of technology on students’ actions, verbal commentary, and thought processes during the performance of acid-base titrations. Journal of Research in Science Teaching, 30(9), 1149–1168.

  • National Aeronautics and Space Administration (NASA). (n.d.). The virtual lab educational software. Available at http://www.nasa.gov/offices/education/centers/kennedy/technology/Virtual_Lab.html. Retrieved April 29, 2015.

  • Nedic, Z., Machotka, J., & Nafalski, A. (2003). Remote laboratories versus virtual and real laboratories. In Frontiers in Education Conference, 2003 (Vol. 1, pp. T3E-1). IEEE.

  • Next Generation Science Standards (NGSS). (2012). Available at http://www.nextgenscience.org/next-generation-science-standards. Retrieved April 29, 2015.

  • Nokes-Malach, T. J., & Mestre, J. P. (2013). Toward a model of transfer as sense-making. Educational Psychologist, 48(3), 184–207.

  • Olympiou, G., & Zacharia, Z. C. (2012). Blending physical and virtual manipulatives: An effort to improve students’ conceptual understanding through science laboratory experimentation. Science Education, 96(1), 21–47.

  • Olympiou, G., Zacharia, Z. C., & de Jong, T. (2013). Making the invisible visible: Enhancing students’ conceptual understanding by introducing representations of abstract objects in a simulation. Instructional Science, 41(3), 575–596.

  • Piaget, J. (1985). The equilibration of cognitive structures: The central problem of intellectual development. Chicago: University of Chicago Press.

  • Posner, G. J., Strike, K. A., Hewson, P. W., & Gertzog, W. A. (1982). Accommodation of a scientific conception: Toward a theory of conceptual change. Science Education, 66, 211–227.

  • Quintana, C., Reiser, B. J., Davis, E. A., Krajcik, J., Fretz, E., Duncan, R. G., & Soloway, E. (2004). A scaffolding design framework for software to support science inquiry. The Journal of the Learning Sciences, 13(3), 337–386.

  • Rebello, N. S., Zollman, D. A., Allbaugh, A. R., Engelhardt, P. V., Gray, K. E., Hrepic, Z., & Itza-Ortiz, S. F. (2005). Dynamic transfer: A perspective from physics education research. In J. P. Mestre (Ed.), Transfer of learning from a modern multidisciplinary perspective. Greenwich: IAP.

  • Reiser, B. J. (2004). Scaffolding complex learning: The mechanisms of structuring and problematizing student work. The Journal of the Learning Sciences, 13(3), 273–304.

  • Renken, M. D., & Nunez, N. (2013). Computer simulations and clear observations do not guarantee conceptual understanding. Learning and Instruction, 23, 10–23.

  • Rieber, L. P. (1992). Computer-based microworlds: A bridge between constructivism and direct instruction. Educational Technology Research and Development, 40(1), 93–106.

  • Salomon, G., & Perkins, D. N. (1989). Rocky roads to transfer: Rethinking mechanism of a neglected phenomenon. Educational Psychologist, 24(2), 113–142.

  • Sandoval, W. A., & Millwood, K. A. (2005). The quality of students’ use of evidence in written scientific explanations. Cognition and Instruction, 23(1), 23–55.

  • Sandoval, W. A., & Reiser, B. J. (2004). Explanation-driven inquiry: Integrating conceptual and epistemic scaffolds for scientific inquiry. Science Education, 88(3), 345–372.

  • Schnotz, W., & Rasch, T. (2005). Enabling, facilitating, and inhibiting effects of animations in multimedia learning: Why reduction of cognitive load can have negative results on learning. Educational Technology Research and Development, 53(3), 47–58.

  • Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin.

  • Smith, C. L., Maclin, D., Houghton, C., & Hennessey, M. G. (2000). Sixth-grade students’ epistemologies of science: The impact of school science experiences on epistemological development. Cognition and Instruction, 18(3), 349–422.

  • Spector, J. M. (2008). Cognition and learning in the digital age: Promising research and practice. Computers in Human Behavior, 24(2), 249–262.

  • Spector, J. M., & Davidsen, P. I. (2000). Designing technology enhanced learning environments. In B. Abbey (Ed.), Instructional and cognitive impacts of Web-based education (pp. 241–261). Hershey, PA: Idea Group Publishing.

  • Spector, J. M., Lockee, B. B., Smaldino, S., & Herring, M. (Eds.). (2013). Learning, problem solving, and mind tools: Essays in honor of David H. Jonassen. Upper Saddle River, NJ: Routledge.

  • SPSS Statistics for Windows, Version 21 [Computer software]. (2012). Armonk, NY: IBM Corporation.

  • Sweller, J., Van Merrienboer, J. J., & Paas, F. G. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251–296.

  • Toth, E. E. (2009). “Virtual Inquiry” in the science classroom: What is the role of technological pedagogical content knowledge? International Journal of Communication Technology in Education, 5(4), 78–87.

  • Toth, E. E., Brem, S. K., & Erdos, G. (2009). “Virtual inquiry”: Teaching molecular aspects of evolutionary biology through computer-based inquiry. Evolution: Education and Outreach, 2(4), 679–687.

  • Toth, E. E., Dinu, C. Z., McJilton, L., & Moul, J. (2012a). Interdisciplinary collaboration for educational innovation: Integrating inquiry learning with a virtual laboratory to an engineering course on “cellular machinery”. In T. Amiel & B. Wilson (Eds.), Proceedings of EdMedia: World conference on educational media and technology 2012 (pp. 1949–1956). Association for the Advancement of Computing in Education.

  • Toth, E. E., Morrow, L., & Ludvico, L. R. (2012b). Blended inquiry with hands-on and virtual laboratories: The role of perceptual features during knowledge construction. Interactive Learning Environments. doi:10.1080/10494820.2012.693102.

  • Toth, E. E., Suthers, D. D., & Lesgold, A. (2002). Mapping to know: The effects of representational guidance and reflective assessment on scientific inquiry skills. Science Education, 86(2), 264–286.

  • Triona, L. M., & Klahr, D. (2003). Point and click or grab and heft: Comparing the influence of physical and virtual instructional materials on elementary school students’ ability to design experiments. Cognition and Instruction, 21(2), 149–173.

  • Underwood, J. S., Hoadley, C., Lee, H. S., Hollebrands, K., DiGiano, C., & Renninger, K. A. (2005). IDEA: Identifying design principles in educational applets. Educational Technology Research and Development, 53(2), 99–112.

  • U.S. News & World Report. (2007, November 1). Dennis Quaid’s newborns are given accidental overdose: Medical mistakes are not uncommon in U.S. hospitals.

  • Windschitl, M. (2000). Supporting the development of science inquiry skills with special classes of software. Educational Technology Research and Development, 48(2), 81–95.

  • Wolf, T. (2010). Assessing student learning in a virtual laboratory environment. IEEE Transactions on Education, 53(2), 216–222.

  • Zacharia, Z. C. (2007). Comparing and combining real and virtual experimentation: an effort to enhance students’ conceptual understanding of electric circuits. Journal of Computer Assisted Learning, 23(2), 120–132.

  • Zacharia, Z. C., & Constantinou, C. P. (2008). Comparing the influence of physical and virtual manipulatives in the context of the physics by inquiry curriculum: The case of undergraduate students’ conceptual understanding of heat and temperature. American Journal of Physics, 76(4), 425–430.

  • Zacharia, Z. C., Manoli, C., Xenofontos, N., de Jong, T., Pedaste, M., van Riesen, S. A. N., et al. (2015). Identifying potential types of guidance for supporting student inquiry when using virtual and remote labs in science: a literature review. Educational Technology Research and Development, 63(2), 257–302.

  • Zacharia, Z. C., Olympiou, G., & Papaevripidou, M. (2008). Effects of experimenting with physical and virtual manipulatives on students’ conceptual understanding in heat and temperature. Journal of Research in Science Teaching, 45(9), 1021–1035.

  • Zimmerman, C. (2007). The development of scientific thinking skills in elementary and middle school. Developmental Review, 27, 172–223.

Acknowledgments

The research was supported by WVnano, a statewide nanotechnology initiative funded by NSF EPSCoR and housed at West Virginia University (NSF-0554328, Paul Hill, PI). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. A West Virginia University Internal Research Fund also supported the work of a research assistant. Stephanie Horner contributed to data coding and to establishing inter-rater reliability. Irene Gladys assisted with research administration tasks, and Danielle Erdos-Kramer assisted with the final editing of the manuscript. Dr. Cerasela-Zoica Dinu of West Virginia University was the instructor of the classes in both studies. Two anonymous reviewers provided extensive suggestions on earlier versions of the manuscript. The instruction used the MyDNA virtual laboratory, which was created by the “Molecules in Motion” program, supported by a grant from the Camille and Henry Dreyfus Foundation, Inc. No materials or images from the MyDNA program are used in this manuscript. Some of the images used in the study are available at the “Hall of Shame” (http://www.ruf.rice.edu/~bioslabs/studies/sds-page/sdsgoofs.html) maintained by Dr. David Caprette of Rice University. No images from this webpage are used in the manuscript.

Author information

Corresponding author

Correspondence to Eva Erdosne Toth.

About this article

Cite this article

Toth, E.E. Analyzing “real-world” anomalous data after experimentation with a virtual laboratory. Education Tech Research Dev 64, 157–173 (2016). https://doi.org/10.1007/s11423-015-9408-3
