Cognitive scientists and assessment developers have long been concerned with creating comprehensive, authentic measures, specifically ones that elicit evidence of proficiency on one or more constructs while test takers are focused and engaged, so that performance reflects their true ability. This challenge is particularly arduous for complex constructs, such as 21st century skills, which can be highly contextualized and involve the interplay of multiple skills. The current work describes the recent development and evaluation of a game-based assessment of argumentation skills, called Mars Generation One (MGO). Our results show that in-game process data can substantially improve the measurement of argumentation compared with non-interactive multiple-choice tests. Lastly, students show high levels of engagement and improve their argumentation skills during gameplay.