Data sources and collection procedures
The three data sources comprised 26 instructional team meeting summaries, 274 student questionnaires, and 175 class observations.
Instructional team meeting summaries
Throughout each term, the instructional team members met weekly to discuss and make decisions related to course planning, delivery, and adjustments. A protocol was developed to guide note-taking during the meetings, documenting team interactions, reports of students’ involvement in TEFA strategies, emerging issues, and the outcomes of decisions. These meeting notes were primarily qualitative in nature.
Student questionnaires
At the end of each term, a web-based questionnaire was administered to students anonymously using an online survey delivery system (SurveyMonkey). The five-part questionnaire (measuring participation in TEFA, examining TEFA uses, documenting participant demographics, assessing course experiences, and determining assessment preferences) was developed by the research team with assistance from a measurement expert. It included both quantitative and qualitative questions, for a total of 35 items, with an optional comment field for the majority.
Class observations
Throughout each term, a protocol-guided class observation was completed by a GTA who attended each lecture for two purposes: documenting emergent issues and class interactions focused on student involvement in TEFA strategies, and documenting course delivery as a means of assessing instructional fidelity across sections of the same course. The protocol consisted of 14 dichotomous (yes/no) items (e.g., Were opportunities provided for students to use the audience response system?) and two open-ended questions about emergent issues and questions students asked during class. Protocol training was provided, and inter-rater reliability was assessed among multiple GTAs during the first two classes until 90% agreement was reached.
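A 90% criterion of this kind is consistent with simple percent agreement across the 14 dichotomous items. The following Python sketch shows one way such a check could be computed; the function name and the sample codings are illustrative assumptions, not taken from the study.

```python
# Minimal sketch of a percent-agreement check for the 14 yes/no
# observation items; the data and names are hypothetical.
def percent_agreement(rater_a, rater_b):
    """Proportion of items on which two raters recorded the same code."""
    assert len(rater_a) == len(rater_b), "raters must code the same items"
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Example: two GTAs coding the same class session (1 = yes, 0 = no).
gta_1 = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1]
gta_2 = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 0]

agreement = percent_agreement(gta_1, gta_2)  # 13/14, roughly 0.93
print(f"Agreement: {agreement:.0%} (criterion met: {agreement >= 0.90})")
```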
Summary of the case
At the pre-term instructional team meeting for the Fall 2013 term, enthusiasm was apparent in the summary description that all tasks were complete and “course materials and activities were ready to go.” The ARS implementation seemed to be going well during the first few weeks: classroom observations captured an increasing rate of student involvement and interest in the ARS activities, and several team members expressed satisfaction with the overall regularity of student participation they had observed in the classroom. The team had even observed student uses of the information gleaned from the ARS activities and saw the instructors consider how this information could inform instructional changes. For example, the third instructional team meeting summary captured discussions about students asking to review particular content areas that they had found confusing, which led the instructional team to add examples to the subsequent lecture notes. By the midpoint of the term, classroom observations and team members reported having reached “a plateau,” described as participation consistently estimated at a majority of the students attending class but with no visible increase in participation rates across more than two classes. Interestingly, student participation became a consistent subject of concern during subsequent instructional team meetings because GTAs and instructors both observed a slight decrease in the frequency of participation in the latter half of the term.
The very high initial rate of participation in the practice quizzes (87% completed both) reported by students at the end of the Fall 2013 term was familiar to members of the instructional team: the team had advertised this activity as helpful for exam preparation, and students had already confirmed their participation verbally in class. Whereas some comments were general, stating they “ … liked the practice exams … ,” others explained students’ use prior to a summative exam, saying, “exam practice [quizzes] [are] excellent study tool[s]” and “It gave me an idea of what types of questions would be asked.” Unanticipated by the instructional team were the anecdotal comments describing the single attempt for each practice quiz as limiting its benefit.
At the post-term meeting for the Fall 2013 term, team members seemed surprised by the questionnaire results: students’ high participation rate in audience response technology activities (72.6%) was viewed by the instructional team as encouraging, yet the lower rate of purchase of the stand-alone remote (61.5%) was unexpected. It became clear that students most often cited cost ($40) as a barrier to purchasing the remote; however, this did not seem to affect their participation in the ARS activities. Indeed, several students reported engaging in the ARS activities regardless of whether they were individually able to contribute to the group response using the remote; one student noted, “I learned as much from watching the [ARS] questions as if I had [been] using one.” Also noted in the summary was a lengthy discussion about the contrast between the high rate of students’ self-reported participation at the end of the first term and the decreasing pattern of participation observed by instructional team members, as well as a discussion of how cost might be addressed.
Team members initially seemed skeptical that, although the TEFA strategies were viewed as useful for learning, the benefit of the ARS did not depend on participating with the remote. The uncertainty was evident in the meeting summaries, where the GTAs voiced a concern that students were not buying the remote: “if they don’t buy it how can they learn from it?” The questionnaire results revealed that the vast majority of students during the Fall 2013 term reported using the TEFA strategies foremost for checking their understanding of course content (94.4% for ARS and 94.0% for practice quizzes) and for identifying areas of weakness for remediation (92.2% for ARS and 96.6% for practice quizzes). This is compared with fewer, yet still more than 80%, of respondents agreeing that the TEFA strategies were useful for providing peer comparisons, authentic preparation, and engaging interactions. Further, students attributed their enjoyment of ARS activities to the ability to respond anonymously while using the information to guide their own learning (94%); one student expressed that it was a “fun way to check my own understanding.” Similarly, from the instructional team perspective, the ARS activities were thought to give students access to data about their individual understanding of course content that could then be compared with their peers’, if desired. Instructional team meeting summaries captured the persistent concern that a potential cumulative effect of the ever-decreasing use during the term had resulted in fewer students using the ARS remote over time and thus hampered students’ ability to compare themselves with classmates. Thus, two members were tasked with seeking less expensive alternatives to the remote.
The questionnaire findings also indicated that the timing of students’ completion of the practice quizzes influenced their perceptions of the quizzes’ usefulness in supporting their own learning (96.6%). Whereas the majority of students reported using the practice quizzes at the end of studying as a final check of their understanding of the material, another group of students used the practice quizzes before beginning to study as a means of focusing their efforts on areas of weakness. Students clearly valued the on-demand aspects of the practice quizzes, saying, “by doing it more than once, I can see how I am improving.” Specifically, students attributed a perception of limited usefulness to being able to complete each quiz only once, evidenced by a lower rate of participation in the end-of-term practice quiz compared with the one offered at the middle of the term (88.9% and 94.0%, respectively). The students’ desire for greater access to the practice quizzes was echoed across the team meeting summaries. Unanimous agreement among the instructional team about the lack of pedagogical reasons to limit access to the practice quizzes led to allowing multiple attempts during the Winter 2014 term.
At the pre-term instructional team meeting for the Winter 2014 term, the GTAs reported that, serendipitously, a more cost-effective, phone-based application using the same system as the remote was now available at a quarter of the cost ($10). One team member had examined its viability, including ease of access and compatibility for use alongside the remote, and consensus was reached to offer the two options in tandem. The ARS implementation across the two platforms (i.e., stand-alone remote and phone-based application) seemed to go well for the first 2 weeks of class; classroom observations indicated that the majority of students attending class were participating. However, following the third class, the classroom observations and team members reported a marked decrease in ARS participation, and one of the GTAs was assigned to investigate further. Interestingly, this timing was found to coincide with the end of the free trial of the phone-based application. A further trend noted with some consistency across the classroom observations was increased interest in the ARS activities as practice with exam-type items rather than as a check on understanding. This led to discussions at subsequent team meetings about why this might be the case, and team members were tasked with talking with students informally. The summary of the following meeting recorded that students reported a primary interest in the ARS activities as practice for the exam: “I like knowing what the exam questions might look like.”
A similarly high rate of participation in the practice quizzes (84% completed both) was reported by students at the end of the Winter 2014 term. These students described the practice quizzes as generally useful for exam preparation and specifically for identifying areas of weakness, yet raised concerns about the lack of access to the correct answers. Representative comments include “I have yet to do the final exam practice quiz, however I do intend to do it prior to the final exam. I just want to review my notes and study before I take it to see what I need to look over again” and that the practice exams “ … guide studying and evaluate which sections of the course [I] needed to revisit.” At the same time, additional LMS features were being introduced that would give students access to more information than simply whether an item had been answered correctly. One team member was tasked with examining potential applications of those practice quiz features.
At the post-term meeting for the Winter 2014 term, the team members reviewed questionnaire results that initially seemed puzzling with respect to participation in and usefulness of the TEFA strategies. The combined rate of purchase of the remote and the app during the second term (61%) remained similar to that of the first term, yet there was a small decline (nearly five percentage points) in overall participation, to 67.9%. A reasonable explanation advanced by the instructional team for the low rate of purchase of the phone-based application (1.2%) was that some students may be used to accessing apps for free or for modest costs and thus be unwilling to pay $10 for the phone-based app. Another difference was the extent of agreement about the usefulness of the TEFA strategies, particularly for the ARS.
Checking understanding of course content remained the most useful application for both strategies (90% for ARS and 92.6% for practice quizzes), followed by authentic preparation (83.3% for ARS and 92.6% for practice quizzes). Indeed, the ability to receive personalized feedback immediately was reported as useful by the vast majority of students (90.0% for ARS and 92.6% for practice quizzes). Perceived usefulness diverged more for identifying areas of weakness for remediation (92.6% for the practice quizzes compared with 76.7% for ARS). This is compared with fewer, yet still more than 60%, of respondents for ARS and 80% for practice quizzes agreeing that the TEFA strategies were useful for providing peer comparisons and engaging interactions. These results led to team discussions about how the ARS activities could be made more engaging. One member took on the task of developing more frequent opportunities for ARS use in subsequent lectures.
A new LMS feature allowed team members to give students access to the total number of correct responses for each attempt during Winter 2014, and these effects were discussed at the post-term team meeting. Similar to the preceding terms, students consistently reported a high rate of participation in the practice quizzes for exam preparation, either at the beginning of the studying process or to confirm exam readiness. An unanticipated outcome was that these students also desired access to the correct answers, saying, “[t]he practice quizzes would be more beneficial if they had answers.” The team meeting summaries documented the struggle the instructional team experienced while attempting to resolve the dilemma: while students might benefit from immediate access to the answers, they might equally benefit from having to find the answer themselves. Ultimately, one team member was tasked with creating a feature allowing students to review their incorrect items using prompts that would progressively unveil the correct answer.
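The compromise the team settled on amounts to a staged-hint review flow. The sketch below illustrates, in Python, one way such a flow could be structured; the class, prompt wording, and example item are hypothetical assumptions for illustration, not the study’s actual LMS feature.

```python
# Hypothetical sketch of a staged review flow for incorrect quiz items:
# each request reveals one more prompt before the answer itself.
# All names, prompts, and content are illustrative, not from the study.
from dataclasses import dataclass

@dataclass
class IncorrectItemReview:
    question: str
    prompts: list   # ordered hints, most general first
    answer: str
    revealed: int = 0

    def next_prompt(self) -> str:
        """Reveal the next prompt, or the answer once prompts run out."""
        if self.revealed < len(self.prompts):
            hint = self.prompts[self.revealed]
            self.revealed += 1
            return hint
        return f"Correct answer: {self.answer}"

review = IncorrectItemReview(
    question="Which phase of mitosis follows metaphase?",
    prompts=["Revisit the lecture notes on cell division.",
             "Think about when sister chromatids separate."],
    answer="Anaphase",
)
print(review.next_prompt())  # most general prompt first
print(review.next_prompt())  # more specific prompt
print(review.next_prompt())  # finally, the correct answer is unveiled
```

A design of this shape preserves the pedagogical tension noted in the summaries: students must work through the available prompts before the answer is unveiled.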
At the pre-term instructional team meeting for the Fall 2014 term, further discussions ensued, which led the team to offer both ARS platforms and open, on-demand access to the practice quizzes during this term. Through the classroom observations, the team members noted consistent participation in the ARS and practice quizzes and did not note any problems with the compatibility of the platforms or with quiz access. The review of questionnaire results at the end of the term showed a small increase in overall participation rates (up seven percentage points from the previous term, to 75%) and purchase rates (up four percentage points from the previous term, to 65.8%). The team members expressed surprise that, despite the lower cost of the phone-based application ($10), the rate of purchase was higher for stand-alone remotes (44.7%) than for the phone-based apps (21.1%). The team members reported similar perceived usefulness of ARS activities across platforms. Indeed, the vast majority of students reported the TEFA strategies as most useful for checking their understanding of course content (97% for ARS and 93.4% for practice quizzes) and for identifying areas of weakness for remediation (92.4% for ARS and 93.4% for practice quizzes). Further, more than 80% of respondents agreed that the TEFA strategies were useful for providing peer comparisons, authentic preparation, and engaging interactions. It is also key to note that the vast majority of students found the ability to respond anonymously and to receive immediate feedback from the TEFA strategies highly useful (97% for ARS and 93.4% for practice quizzes). The final integration across three instructional terms and two TEFA strategies revealed four mixed insights.
Implications and limitations
The mixed insights offer important implications for advancing theory, research, and practice related to TEFA strategies and highlight the contribution that mixed methods approaches can make in advancing educational technology in higher education. First, in terms of theory and research, the findings speak to how the two strategies examined in this paper (ARS and on-demand practice quizzes) support learning from the perspectives of the instructional team and students. From a theoretical perspective, this study points to educational technology as an important mediator for creating significant classroom-based learning experiences. Specifically, this study points to the effectiveness of the TEFA strategies for supporting student motivation, student use for directing their own learning, and team use for informing instructional adjustments that have effects for students immediately and in the future. The advancements made by Ha and Finkelstein (2013) are an excellent starting place for further quantitative study of the student perspective on the impact of ARS and its use for informing instructional decisions. It is important not to exclude other types of effectiveness from my consideration in this study; for example, while peer interactions are often less timely than ARS or practice quizzes, written peer feedback remains a useful tool for supporting learning (Ion, Barrera-Corominas, & Tomàs-Folch, 2016).
From the examination of this illustrative example of two effective TEFA strategies, it seems that assessing the impacts and influences of educational technology in higher education teaching and learning contexts is also complex, and thus theory and research in this area need to heed the same calls as the field of mixed methods research more generally. This includes not only continuing to refine theory in light of new and emerging understandings but also continuing to develop new TEFA strategies and ways of capturing the influence and effects of these strategies through the use of mixed methods. Indeed, this reflects the thinking highlighted by Li (2014) that technology-enhanced learning will need to take into account the interrelated relationships among the learner, the learning context, the technology, and (I would argue) the instructor. The use of a mixed methods research approach to the case study affected this research in three important ways. First, the integration of qualitative and quantitative data at two points of interface generated rich understandings specific to each term, and the subsequent cross-analysis of terms and strategies allowed new trends and differences to be noted. Second, the integration of multiple perspectives provided access to understandings that had previously been inaccessible, notably the instructional adjustments and the reasons underpinning them. Third, the integration of multiple data sources enhanced validity evidence, as it allowed for triangulation within the descriptive case summary; specifically, the qualitative findings provided context for participation trends that had not previously been captured across terms. By capturing the implementation processes over time and across different groups of students rather than simply the outcomes from one term, I offer an illustrative example of an embedded mixed methods case study design reflective of the necessary openness to emerging and innovative approaches appropriate for the complex conditions under which research is undertaken.
In terms of practice, this study points to three areas to consider when implementing TEFA strategies: frequency of use, cost to students, and ease of on-demand access. Instructors adopting educational technology in their higher education teaching and learning environments need to be aware of the literature guiding implementation of TEFA and of the influences of their own assessment experiences on their instructional practices. This is because researchers have established that instructors’ classroom assessment practices are influenced by their attitudes, prior experiences, knowledge, skills, and motivation related to the nature of assessment and learning (Calderhead, 1996). Meaningful learning interactions incorporating educational technology and formative assessments require appropriate structures (Sorensen & Takle, 2005). Thus, more comprehensive understandings of influences on involvement, effects on learning, accessibility of feedback, and impacts on instructors could inform future implementations of TEFA and thus have important consequences for practice and for classroom teaching and learning environments.
To provide guidance to educational technology researchers in higher education, I designed this research with dimensions of high-quality mixed methods research (Collins, 2015) and the criteria for publishing (Onwuegbuzie & Poth, 2016; Fetters & Freshwater, 2015) in mind. Based on these standards, this research represents high-quality mixed methods research in the following five ways: the use of literature to clearly justify a mixed methods approach for generating mixed insights; the detailed descriptions of the procedures to create transparency of the methodology used in the case study; the mixing strategy used to address the convergent purpose underpinning the intentional integration; the procedures and practices for maintaining ethical standards, rigor, and complexity built into the design itself; and the validity evidence gathered to support the findings from each strand separately as well as the mixed insights.
The results need to be interpreted with the following two limitations in mind. First, the design is somewhat unusual in that it used a convenience concurrent mixed sampling strategy, with involvement in the case study reflecting the relationship between the qualitative data collected from observations and team meetings and the quantitative data collected from student questionnaires. The use of a convenience sample meant that the individuals were available and willing to participate, similar to the concurrent implementation described by Teddlie and Yu (2007). The data captured the key perspectives for the case study, and the minimum sample size recommendations were met (Collins, 2010). Second, the results of the quantitative strand may be underrepresented in the crossover mixed analysis. For example, in the crossover analysis I chose to focus on items rather than on means or relationships at the scale level because the items were researcher-created; however, relying on item-level descriptive statistics may have simplified the quantitative perspective. Future research could use a similar design with established constructs.
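To make the item-level focus concrete, the following Python sketch shows the kind of item-level descriptive statistic (percent agreement per item) reported throughout the case summaries; the item labels and responses are hypothetical, not the study’s data.

```python
# Hypothetical sketch of item-level descriptive statistics of the kind
# reported in the case summaries (percent agreement per item).
# Item labels and responses are illustrative, not the study's data.

# 1 = agree/strongly agree, 0 = otherwise, one entry per respondent
responses = {
    "understanding_check_ars": [1, 1, 1, 0, 1, 1, 1, 1],
    "weakness_identification_ars": [1, 0, 1, 1, 1, 0, 1, 1],
    "peer_comparison_quizzes": [1, 1, 0, 1, 0, 1, 1, 1],
}

for item, values in responses.items():
    agree_rate = sum(values) / len(values)  # proportion agreeing
    print(f"{item}: {agree_rate:.1%} agreement (n={len(values)})")
```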