Published in: Education and Information Technologies 4/2015

01-12-2015

Technology enhanced assessment in complex collaborative settings

Authors: Mary Webb, David Gibson

Abstract

Building upon discussions by the Assessment Working Group at EDUsummIT 2013, this article reviews recent developments in technology enabled assessments of collaborative problem solving in order to point out where computerised assessments are particularly useful (and where non-computerised assessments need to be retained or developed), while assuring that the purposes and designs are transparent and empowering for teachers and learners. Technology enabled assessments of higher order critical thinking in a collaborative social context can provide data about the actions, communications and products created by a learner in a designed task space. Principled assessment design is required in order for such a space to provide trustworthy evidence of learning, and the design must incorporate and take account of the engagement of the audiences for the assessment, as well as vary with the purposes and contexts of the assessment. Technology enhanced assessment enables in-depth unobtrusive documentation, or 'quiet assessment', of the many layers and dynamics of authentic performance, and allows greater flexibility and dynamic interactions in and among the design features. Most important for assessment FOR learning are interactive features that allow the learner to turn up or down the intensity, amount and sharpness of the information needed for self-absorption and adoption of the feedback. Most important in assessment OF learning are features that compare the learner with external standards of performance. Most important in assessment AS learning are features that allow multiple performances and a wide array of affordances for authentic action, communication and the production of artefacts.
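The abstract's notion of 'quiet assessment' — unobtrusively logging a learner's actions, communications and artefacts in a designed task space, with feedback whose intensity the learner can turn up or down — can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' implementation; all names (`QuietAssessmentLog`, `feedback_intensity`, the event kinds) are assumptions introduced here.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class Event:
    """One unobtrusively logged learner event in the task space."""
    kind: str    # e.g. "action", "communication", or "artefact"
    detail: str
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class QuietAssessmentLog:
    """Records evidence of learning without interrupting the task."""
    events: List[Event] = field(default_factory=list)
    # Learner-set feedback intensity: 0 = none ... 3 = most detailed.
    feedback_intensity: int = 1

    def record(self, kind: str, detail: str) -> None:
        """Quietly append an event; the learner's flow is not interrupted."""
        self.events.append(Event(kind, detail))

    def feedback(self) -> List[str]:
        """Return feedback scaled to the learner's chosen intensity."""
        if self.feedback_intensity == 0:
            return []
        # Higher intensity -> more recent events summarised back to the learner.
        recent = self.events[-(self.feedback_intensity * 2):]
        return [f"{e.kind}: {e.detail}" for e in recent]


log = QuietAssessmentLog()
log.record("action", "opened simulation")
log.record("communication", "asked teammate about variables")
log.record("artefact", "saved model v1")
print(log.feedback())  # at intensity 1, summarises the last two events
```

A real system would of course attach richer evidence models (in the evidence-centred-design sense) to the logged events; the sketch only shows the two moving parts the abstract names: silent capture and learner-tunable feedback.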


Metadata
Title: Technology enhanced assessment in complex collaborative settings
Authors: Mary Webb, David Gibson
Publication date: 01-12-2015
Publisher: Springer US
Published in: Education and Information Technologies, Issue 4/2015
Print ISSN: 1360-2357
Electronic ISSN: 1573-7608
DOI: https://doi.org/10.1007/s10639-015-9413-5
