1 Introduction
2 Aims and Research Questions
- RQ1. How effective were the digital technology teaching and learning innovations in supporting the students' learning experience?
- RQ2. How could the innovations be improved, and what are the implications of such design innovations for future course design processes?
3 Literature Review
3.1 Student-Centred Learning (SCL)
3.2 Digital Technologies in Teaching and Learning
3.3 Integrating Technology into Pedagogical Design
- expand the capacity of instructors
- enhance the efficiency of educational institutions
- increase the relevance of the educational system for society
4 Theoretical Framework
| MISL element (instructor facilitates) | Assumptions of SCL |
|---|---|
| Access to Information and Learning Resources (AIR) | Learning environment supports underlying cognitive processes; understanding is supported when cognitive processes are augmented by technology |
| Support and Motivation (SAM) | Learners make, or are guided to make, effective choices; learning is best when varied representations are supported |
| Participation and Collaboration (PAC) | Individuals assume greater responsibility for their learning; personal experience; varied learning activities |
| Assessment and Feedback (ASF) | Traditional instruction is too narrow to support varied ways of promoting learning; optimal learning occurs when varied representations are supported |
| Critical Reflection (CRF) | Understanding requires time; understanding is most relevant when rooted in personal experience |
| Knowledge Construction (KCO) | Knowledge is personally constructed via interpretation and negotiation; understanding evolves continuously |
| Faculty | Course | Innovation incorporated into learning design | Access to information & resources | Support & motivation | Participation & collaboration | Assessment & feedback | Knowledge construction |
|---|---|---|---|---|---|---|---|
| Architecture, landscape and the visual arts | A1 | Peer and UC feedback | ✓ | ✓ | ✓ | ✓ | ✓ |
| | A2 | Building an online community | ✓ | ✓ | ✓ | ✓ | ✗ |
| Business | B1 | Visualising course | ✓ | ✗ | ✗ | ✗ | ✓ |
| | B2 | Embedded online annotated lectures | ✓ | ✓ | ✗ | ✓ | ✗ |
| Engineering, computing and mathematics | E1 | Online quizzes | ✓ | ✗ | ✓ | ✗ | ✗ |
| | E2 | Interactive video | ✓ | ✓ | ✗ | ✓ | ✓ |
| Science | S1 | Online quizzes | ✓ | ✗ | ✗ | ✓ | ✓ |
| | | Flipped class | ✗ | ✓ | ✓ | ✓ | ✓ |
| | S2 | Pre-lab video | ✓ | ✓ | ✓ | ✗ | ✓ |
| Education | E1 | Pre-seminar quizzes | ✓ | ✓ | ✗ | ✓ | ✓ |
| | E2 | Online lectures | ✓ | ✗ | ✓ | ✓ | ✗ |
| Arts | A1 | Group project | ✓ | ✓ | ✓ | ✓ | ✓ |
| | | Critical self-reflection | ✗ | ✗ | ✗ | ✗ | ✓ |
5 Methodology
5.1 Research Design Process and Instrument
5.2 Sampling Process and Data Collection
| Faculty | Course | Students enrolled | Undergraduate (U) or postgraduate (P) | Questionnaire respondents, no. (%) | SURF respondents, no. (%) |
|---|---|---|---|---|---|
| Architecture, landscape and the visual arts | A1 | 12 | P | 4 (33) | 6 (50) |
| | A2 | 108 | U | 62 (57) | 41 (38) |
| Business | B1 | 317 | U | 99 (31) | 117 (37) |
| | B2 | 262 | U | 39 (15) | 84 (32) |
| Engineering, computing and mathematics | E1 | 136 | U | 57 (42) | 54 (40) |
| Science | S1 | 107 | U | 50 (47) | 41 (38) |
| | S2 | 302 | U | 119 (39) | 82 (27) |
| Education | E1 | 28 | P | 21 (75) | 15 (52) |
| Arts | A1 | 137 | U | 101 (74) | 59 (43) |
| Total | | 1409 | | 552 (39) | 499 (35) |
| Research questions for study | Course coordinators: Interview 1 (n = 11) | Course coordinators: Interview 2 (n = 9) | Students: closed items (n = 552) | Students: open items (n = 552) | SURF (n = 499) |
|---|---|---|---|---|---|
| 1. Which revised designs were implemented? | ✓ | ✗ | ✓ | ✗ | ✗ |
| 2. How did the revised designs affect the student learning experience? | ✗ | ✓ | ✓ | ✓ | ✓ |
| 3. How might the revised designs be improved? | ✗ | ✓ | ✗ | ✓ | ✗ |
| 4. What are the implications for the course design process? | ✓ | ✓ | ✗ | ✓ | ✗ |
5.3 Analysis
| MISL element | Course coordinator: Interview 1 (Expected) | Course coordinator: Interview 2 (Observed) | Student questionnaire: closed items | Student questionnaire: open items | Overall |
|---|---|---|---|---|---|
| Access to information and learning resources | ✓ | ✓✓ | ✓✓ | ✓ | ✓✓ |
| Support and motivation | ✓✓ | ✓✓ | ✓✓ | ✓✓✓ | ✓✓ |
| Participation and collaboration | ✓✓✓ | ✓✓ | ✓✓ | ✓ | ✓✓ |
| Assessment and feedback | ✓ | ✓✓✓✓ | ✓✓ | ✓✓✓ | ✓✓✓ |
| Knowledge construction | ✓ | ✓ | ✓✓ | ✓ | ✓ |
6 Results
6.1 Pre-implementation Interview with Course Coordinators—RQ1
6.2 Post-implementation Interview with Course Coordinators—(RQ2)
These further observations were made by the course coordinators about specific activities: quizzes proved an effective replacement for a mid-semester exam; insights were gained into why some students failed to complete online quizzes; quizzes helped students keep up with their reading commitments; and online activities effectively supplemented in-class learning activities. A symbolic summary of the impact of the implemented redesigns on the five elements of the model is presented in Table 5. Overall, course coordinators considered that the learning experience element most positively affected by the redesigns was assessment and feedback. Hence, we attribute the overall effectiveness of student-centred learning to course coordinators' ability to provide students with concise and accurate feedback on their queries, clarifying their understanding of constructs or topics.

Judging from the students' comments in the survey, I think the redesign largely worked, but some tweaking needs to be done around the use of SPARK (an online peer assessment tool), including the weighting of the final SPARK in week 13, and how the three SPARK scores are used to generate a grade that more clearly separates process from product in the design project. The other noticeable benefit of the changes was that the general quality of the design projects has improved, with most students better able to recognise the importance of the unit's focus on hard-copy interactive designs. (course coordinator CM-1)

The responses to all aspects of the student experience were highly positive, in keeping with the normal level of student satisfaction in this unit. Next time I will link lectures, activities, and quizzes more explicitly for students. (course coordinator M/SE-4)
For the next iteration of their courses, the course coordinators planned to:

- provide clearer directions to students on the use of social media
- improve team building and group work
- identify and assist specific students with digital technologies
- integrate online activities
- adapt in-class activities and assessments
- reduce opportunities for collusion on quizzes
- make quizzes more enjoyable
- establish themes to provide more coherence in the course topics
- investigate why students did not attend the flipped classes
6.3 The Student Responses
The overall survey responses showed a high level of student satisfaction across all categories of the model, with mean scores greater than 3.0 (Table 6).

The flipped lectures were a great way to learn and understand the content of the unit. It meant I was more inclined to watch the lectures and it gave me the opportunity to fully understand it in the class discussions. (SB-12)

The pre-lab activities were very helpful in preparing for the laboratory and review workshops and it was great to have two attempts and be able to see where I went wrong between attempts. Labs have also been more enjoyable/less stressful where I can just focus on completing the experiment and then can take the lab sheet home to complete. (SChe-6)
| Scale | Maths education (N = 21) | Science education (N = 21) | Media studies (N = 100) | First year chemistryᵇ (N = 116) | Electronic materialᵇ (N = 44) | Landscape and visual arts (N = 49) | Marketing researchᵇ (N = 99) |
|---|---|---|---|---|---|---|---|
| AIRᵃ | 4.9 (0.30) | 4.88 (0.33) | 3.34 (0.95) | 3.73 (1.23) | 3.43 (1.01) | 3.96 (0.71) | 4.35 (0.82) |
| SAMᵃ | 4.9 (0.31) | 4.76 (0.54) | 3.70 (1.02) | 3.49 (1.07) | 3.46 (1.16) | 3.94 (0.83) | 4.26 (0.79) |
| PACᵃ | 5.0 (0.19) | 4.76 (0.56) | 3.85 (0.90) | 3.69 (1.32) | 3.57 (1.00) | 3.71 (1.15) | 4.11 (0.85) |
| ASFᵃ | 4.8 (0.46) | 4.65 (0.87) | 3.59 (0.93) | 3.67 (1.05) | 3.18 (1.27) | 3.58 (1.05) | 4.07 (1.01) |
| REFᵃ | 4.7 (0.45) | 4.58 (0.59) | 3.32 (0.94) | 3.34 (1.14) | – | 3.61 (1.08) | 3.83 (0.78) |
| KNCᵃ | – | – | 3.84 (0.97) | 3.40 (1.22) | 2.82 (1.09) | 3.18 (0.79) | 3.09 (1.07) |
| INDP | 4.85 (0.31) | 4.73 (0.46) | 3.59 (0.92) | 4.14 (1.05) | 3.20 (1.05) | 3.77 (0.88) | 4.09 (0.80) |
| DTEC | 3.65 (0.30) | 3.86 (0.96) | 4.07 (0.70) | 3.36 (1.38) | 3.91 (0.84) | 4.24 (0.78) | 3.77 (0.96) |

Values are mean (SD).
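The per-scale summaries in Table 6 are straightforward descriptive statistics over Likert responses. As a minimal sketch of that computation (the response data below are hypothetical, assuming a 5-point scale; the study's raw data are not reproduced here):

```python
from statistics import mean, stdev

# Hypothetical 5-point Likert responses per MISL scale (illustrative only).
responses = {
    "AIR": [5, 4, 5, 4, 5],
    "SAM": [4, 4, 5, 3, 4],
}

# Report mean and sample standard deviation per scale, rounded to two
# decimal places as in Table 6.
summary = {
    scale: (round(mean(scores), 2), round(stdev(scores), 2))
    for scale, scores in responses.items()
}

for scale, (m, sd) in summary.items():
    print(f"{scale}: mean = {m}, SD = {sd}")
```

In practice each scale's score per respondent would first be averaged across that scale's questionnaire items; the per-scale means and SDs are then taken across respondents.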
The aspects of the learning experience that students were most dissatisfied with were access to information and resources, support and motivation, and knowledge construction. Other areas of concern were technical difficulties with some digital resources or websites, digital learning resources of an inferior standard, and a lack of alignment between the course learning outcomes and its learning activities and assessment. Negative feedback from a small number of individual students (normally one or two) appeared demoralising to some course coordinators, who appeared to have given their best. In our view, course coordinators took a holistic view of the responses, considering the entire class, the effectiveness of their teaching philosophies, and how the learning outcomes aligned with the assessments and learning activities. In two of the courses (Biology and Education), it was established that, by clearly separating and describing the learning resources that helped students directly or indirectly achieve the learning outcomes, those that partially supported the learning outcomes, and those that provided additional information, students expressed high satisfaction with their experience.

Multiple choice questions are good for understanding the content, but to test their understanding of the content it's important that students get exposed to exam level questions. (SE-5)

I feel as if this may not be necessarily relevant to students' future careers. While games are fun and there are game mechanics and functions of play within all forms of life, this doesn't necessarily constitute a reason to base a whole university unit on them. (SC-15)
7 Discussion
7.1 Effect of New Designs on Student Learning Experience
8 Implications
8.1 Course Design Process
- an association between course redesign by course coordinators and their earlier participation in learning design workshops
- a broad range of initiatives, including online activities (quizzes, videos) that enable a flipped approach, and the use of social media for collaboration and online peer feedback
- a high level of satisfaction by course coordinators with changes made to their learning designs
- reservations by course coordinators about the capacity of some students to engage in a digital learning environment, and about the level of student attendance at face-to-face classes
- design elements targeted by course coordinators for improvement in the next iteration of their courses: improved communication with students, constructive alignment, and better use of digital media
- a positive response by students to their learning experience, particularly for improved access to information and resources, support and motivation, and assessment and feedback
- lower levels of satisfaction expressed by students in relation to knowledge construction and to participation and collaboration
- workload issues arising from effective online teaching, which will require careful management