Open Access 01-12-2021 | Research article

What are the key themes associated with the positive learning experience in MOOCs? An empirical investigation of learners’ ratings and reviews

Authors: Ruiqi Deng, Pierre Benckendorff

Published in: International Journal of Educational Technology in Higher Education | Issue 1/2021

Abstract

Past MOOC research has tended to focus on learning outcomes valued in traditional higher education settings, such as achievement and retention. This study recognises that student ratings are an important alternative outcome measure in MOOCs. This paper adopted a semiautomatic text mining approach to collect and analyse 8475 ratings and reviews submitted for 1794 MOOCs. The analysis revealed six important themes that contributed to positive ratings: ‘learning’, ‘understanding’, ‘interesting’, ‘videos’, ‘recommend’, and ‘questions’. The paper then investigated the characteristics of each identified theme based on the proximity of themes, distribution of concepts within themes, and important connections. Based on research findings, the paper presents the following propositions to assist educators and providers to enhance the learning experience in MOOCs: (1) provide realistic learning contexts and instructional conditions in MOOCs to facilitate the acquisition of knowledge that transfers more readily to real-world practices; (2) carefully design the instructional conditions so that some mental challenge and stimulation is required for learners to achieve a full understanding of the content, rather than making MOOCs too simple or effortless to complete; (3) design the course content, materials, and communications to generate interest; (4) allocate sufficient resources to create high-quality video lectures; (5) employ video lectures to elicit positive emotions from MOOC learners and simplify complex, difficult concepts; and (6) incorporate discussion boards in MOOCs and invest in human and digital resources to address learners’ queries.
Notes

Supplementary Information

The online version contains supplementary material available at https://doi.org/10.1186/s41239-021-00244-3.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Introduction

The disruptive potential of MOOCs has generated increasing scholarly interest in MOOC learning outcomes, particularly learners’ academic performance (Meek, Blakemore, & Marks, 2016; Phan, McNeil, & Robin, 2016) and completion (Engle, Mankoff, & Carbrey, 2015; Hone & El Said, 2016). While academic performance and completion are important measures of learning outcomes in traditional higher education institution (HEI) settings (Entwistle, 2018), it could be argued that these outcome measures are less relevant to the MOOC teaching and learning context. MOOC learners display a combination of intrinsic, extrinsic, and social motivations when registering for a MOOC (Deng, Benckendorff, & Gannaway, 2017, 2019). Only a small percentage of enrolled learners intend to complete a MOOC and earn a certificate of accomplishment (Evans & Baker, 2016). The majority of MOOC learners are not motivated by educational advancement, such as gaining entry to a postsecondary degree programme (Stich & Reeves, 2017). They tend to be more concerned with achieving personal learning goals than academic achievement (Jung, Kim, Yoon, Park, & Oakley, 2019). Furthermore, learner engagement in MOOCs represents a relatively simple structure when compared with engagement in traditional, credit-bearing university courses (Deng, Benckendorff, & Gannaway, 2020a). These differences draw attention to other measures for evaluating the performance of MOOCs.
Student ratings, or students’ perceived quality of instruction, have been employed by HEIs for decades to provide feedback to enhance the quality of teaching, help students to make enrolment decisions, and facilitate administrative evaluations for promotion and tenure decisions (Cohen, 1980). Similarly, student ratings of MOOCs provide feedback to construct or revamp a MOOC, help prospective learners to select courses for study, and assist university leaders and managers to make investment decisions about future MOOC production. Despite its importance, limited effort has been devoted to analysing and interpreting student ratings and accompanying written feedback on MOOCs. This apparent knowledge gap provides the impetus for the current study. An empirical investigation of student ratings and reviews will contribute to an improved understanding of the study experience in MOOCs. This study limits the scope to the positive study experience and seeks to achieve two research objectives. First, this study aims to identify the key themes contributing to positive learning experiences in MOOCs. Second, the study aims to explore the characteristics of each identified theme.

Literature review

The role of students in constructing positive learning experience

From a ‘students-as-customers’ perspective, Higher Education (HE) services are viewed as commodities which are purchased and consumed by students, and students expect HEIs to provide superior service across the student lifecycle (Bunce, Baird, & Jones, 2017). While supporters of such an approach claim that students are beneficiaries and end users of the experiential service offered by HEIs, opponents contend that this approach contradicts the notion of professionalism (Olssen & Peters, 2005) and leads to lower academic standards (Hassel & Lourey, 2005; Johnson, 2003). Despite the polarised debate, empirical research reveals that university students often see themselves as customers when it comes to student feedback and classroom study, but do not expect to be treated as customers in terms of curriculum design, rigour, classroom behaviour, and graduation (Riina & Petri, 2015).
A more recent paradigm in HE is to view students as ‘co-creators’ (Bovill, Cook‐Sather, & Felten, 2011), ‘co-producers’ (McCulloch, 2009), or ‘partners’ (Bovill, 2020) in constructing positive educational experiences. Under the partnership model, the purpose of student feedback and evaluation is gradually shifting away from rewarding and punishing academics to improving teaching practices (Golding & Adam, 2016) and establishing and recalibrating benchmarks for quality teaching and learning (Chen & Hoshower, 2003; Clayson & Haley, 2005). Student ratings have long been recognised as a useful benchmark for evaluating the performance of HEIs at both the subject/unit level (Stewart, Speldewinde, & Ford, 2018; Walker & Palmer, 2011) and the programme/institution level (Betz, Klingensmith, & Menne, 1970; Landrum, Hood, & McAdams, 2001). The scope of discussion in this article is limited to student ratings at the subject level. This is because traditional educational attributes such as campus climate and physical infrastructure are irrelevant to the MOOC context. This study endorses the ‘students-as-cocreators’ perspective and recognises learners’ voices in constructing positive MOOC study experiences. This perspective implies that student ratings and reviews are valuable sources that can be used to improve the quality of learning experience in MOOCs.

Higher education student ratings and reviews

There is little consensus about how student rating scales are best conceptualised and constructed in the HE context. A review of student rating instruments shows that some HE researchers have treated student ratings as a multidimensional concept (Harnash-Glezer & Meyer, 1991; Hendry, 1983; Krehbiel, McClure, & Pratsini, 1997). In studies adopting a multi-attribute model, scales are divided into a range of distinct characteristics that are assessed using Likert-type or rating scales. Some researchers have attempted to simplify student ratings by adopting a more holistic approach. In these studies, a student rating is defined as the overall perception of the student experience, and the multidimensionality of the construct is ignored in the measurement process (Han, Kiatkawsin, Kim, & Hong, 2018; Zafra-Gómez, Román-Martínez, & Gómez-Miranda, 2015). This simplification provides a fast and frugal way to obtain student feedback, especially when student ratings are not the sole focus of a research study (e.g. Alves & Raposo, 2007; Denson, Loveday, & Dalton, 2010).
Some researchers argue that the single measure fails to acknowledge the multiple attributes of a learning experience and varying degrees of student perceptions about each individual attribute (Elliott & Shin, 2002). Others have shown that the unidimensional evaluation approach can be as informative as the multidimensional method. This is because some dimensions are more central and account for a substantial amount of the variance in student ratings, while the more peripheral dimensions only play a marginal role (Apodaca & Grad, 2005; Cashin & Downey, 1992). Higher student ratings are often associated with good teaching skills (Ginns, Prosser, & Barrie, 2007), sufficient support and advice (Richardson, Slater, & Wilson, 2007), appropriate assessment and workload (James & Casidy, 2018), clear goals, standards, and evaluation criteria (Mikulić, Dužević, & Baković, 2015), and the development of higher order cognitive capabilities (Lizzio, Wilson, & Simons, 2002).
Student ratings can be affected by a variety of personal and environmental factors. Examples of personal characteristics and attributes that affect student ratings are age (Spooren, 2010), ethnicity (Worthington, 2002), gender (Nasser-Abu Alhija & Fresko, 2018), personal interest (Nasser & Hagtvet, 2006), and attitude towards evaluation (Carlozzi, 2018). At the same time, student ratings can be influenced by configurations of the teaching environment, such as course delivery mode (Beran & Violato, 2005), class size (Gannaway, Green, & Mertova, 2018), and the teaching experience of the instructor (Berbegal-Mirabent, Mas-Machuca, & Marimon, 2018). Certain student-based and teaching context-based factors, such as academic discipline (Berbegal-Mirabent et al., 2018; Nasser-Abu Alhija & Fresko, 2018) and the instructor’s gender (Berbegal-Mirabent et al., 2018; MacNell, Driscoll, & Hunt, 2015), are not related to criteria of good teaching and are often beyond the control of the instructor and the HEI. These factors will contaminate and bias the findings if they are not controlled for (Bedggood & Donovan, 2012). While some scholars attempt to control for these extraneous factors (Denson et al., 2010), critics contend that numerous factors can affect student ratings and that it is not clear which factors should be considered meaningful indicators or biasing factors (Spooren, Brockx, & Mortelmans, 2013). Given these potential limitations, it is important to consider adopting other forms of student evaluation.
Qualitative responses are often collected alongside quantitative scales to assist with the interpretation of student ratings. However, text-based, free-form student reviews can also be viewed as a useful evaluation technique in their own right. This is because qualitative feedback supports the development of learner identities and encourages students to become co-producers of knowledge (Erikson, Erikson, & Punzi, 2016), provides insight into teaching and learning issues that are not normally captured by quantitative methods (Stupans, McGuren, & Babey, 2016), and elicits a richer, learner-centred perspective of the student experience (Steyn, Davies, & Sambo, 2019). Research shows that student reviews are positively correlated with quantitative ratings data (Alhija & Fresko, 2009; Braskamp, Ory, & Pieper, 1981; Burdsal & Harrison, 2008; Ory, Braskamp, & Pieper, 1980). To gain a full understanding of learners’ experience, it is critical that student reviews are systematically analysed and interpreted in conjunction with student ratings data, sometimes with the aid of software tools (e.g. Grebennikov & Shah, 2013).

MOOC student ratings and reviews

Student ratings are presented and used in different ways in MOOC research. MOOC research on student ratings can be divided into three categories. The first category includes descriptive studies that explore MOOC learners’ attitudes towards one or more course components. Studies in this category adopt a quantitative approach and conduct descriptive analyses (e.g. mean, standard deviation). For example, Khalil and Ebner (2013) reported that 65% of MOOC learners were satisfied with online interaction. Yousef, Wahid, Chatti, Schroeder, and Wosnitza (2015) found that learners were highly satisfied with the peer assessment module in a MOOC. Studies in this category are sometimes presented in the form of case studies and contain qualitative insights that complement quantitative results (e.g. Gutiérrez-Santiuste, Gámiz-Sánchez, & Gutiérrez-Pérez, 2015; Moskal, Thompson, & Futch, 2015). These studies tend to have a narrow focus and rarely investigate the relationship between student ratings and other teaching and learning factors. Nevertheless, these early studies collectively create a snapshot of MOOC learners’ satisfaction with course design features and provide ideas and inspiration for subsequent MOOC research.
The second category treats student ratings as a mediating or outcome variable. Studies in this category are quantitative in nature and apply multivariate data analysis to determine the relationship between student ratings and other learning-related variables. For instance, Joo, So, and Kim (2018) revealed that higher MOOC ratings had a positive influence on learners’ continuance intention to use MOOCs, and that MOOC ratings can be predicted by perceived ease of use and perceived usefulness. These studies tend to have a relatively small sample size and typically focus on individual MOOCs. Therefore, caution should be exercised when generalising findings from these studies to broader learner profiles and MOOC settings. Scholars may need to verify research findings from these studies in alternative teaching and learning environments.
The third category of research adopts qualitative approaches to explore student ratings in MOOCs. Compared with MOOC studies in the other two categories, very few scholars have taken a qualitative approach to analysing and interpreting MOOC learners’ ratings and reviews. An exception is a case study of three highly ranked MOOCs, which identified five factors that were important for promoting learner engagement in MOOCs: problem-centric learning, instructor accessibility, active learning, peer interaction, and resources (Hew, 2015). Despite its usefulness, the case study focused exclusively on engagement and explored factors that can potentially contribute to learner engagement in MOOCs. There is scope for researchers to move beyond ‘learner engagement’ and identify additional themes associated with positive learning experiences in MOOCs. In addition, past research almost exclusively focused on a few hand-picked, highly rated MOOCs. There is scope to incorporate a broader spectrum of MOOCs and a higher volume of review data to enhance the transferability of research findings.

Research opportunities

A long history of research in traditional HE settings has established that student ratings provide a valuable data source for optimising the teaching and learning environment for university students (Cohen, 1980). Student ratings and reviews are potentially important information sources for assisting MOOC learners to make enrolment decisions, instructors to revamp MOOCs, university leaders and managers to make investment decisions about the production of future MOOCs, and researchers to better understand what makes MOOCs successful.
Many quantitative analyses of student ratings have tended to adopt a convenience sampling method and have focussed on individual MOOCs or single learner cohorts. These studies are often bounded by specific teaching contexts and may not be generalisable to the broader MOOC setting. On the other hand, qualitative studies are often constrained by the small number of cases analysed and the potential bias of coders and investigators. The current study aims to overcome these limitations by employing a semiautomatic text mining solution and evaluating MOOC learners’ positive ratings and reviews for 1794 MOOCs. This study is guided by two important research questions: (1) what are the key themes contributing to positive learning experiences in MOOCs, and (2) what are the characteristics of each identified key theme? For the purpose of clarity, key themes are defined as themes that most frequently occur in MOOC reviews, and theme characteristics are conceptualised to comprise the theme scope, key connections within the theme, and relationships with other themes.

Research method

Data collection

MOOC ratings and reviews were collected from a third-party review site—Class Central. This website allows MOOC learners to rate courses they have taken and to read other learners’ ratings and reviews. The rating system on Class Central has five categories, ranging from one star to five stars. MOOC learners are requested to provide written comments when they rate a MOOC, and the comments must be at least 20 words long.
A third-party website was selected over independent MOOC platforms (e.g. Coursera, edX) for three important reasons. First, not all leading MOOC platforms have a review system. Users of FutureLearn and edX, for example, are unable to leave a review because these MOOC platforms do not provide such affordances. Second, the use of different rating systems makes it difficult to compare ratings across MOOCs. The adoption of a five-point rating system on Coursera means that MOOC participants have options to leave neutral responses. In contrast, Open2Study employs a four-point rating system, which means indicating a neutral position is not possible. Third, a third-party website aggregates MOOCs from multiple platforms. The course catalogue of Class Central shows diversity in difficulty, duration, field of study, syllabus, and instructional approaches. Ratings and reviews collected from this website would provide a more diverse range of opinions than a single, independent MOOC platform.
The authors created a programme in Octoparse, a data extraction tool, to randomly collect MOOC learners’ ratings and reviews from Class Central. The data extraction tool helped to transform raw Hypertext Markup Language (HTML) into structured data. Kozinets (2009) indicates that collecting existing data from the Internet does not strictly qualify as human subjects research, and informed consent is required only when interventions occur. In this study, all data collected were already publicly available online. No interventions were applied during the data collection process. Therefore, ethics approval was not required. The metadata were stored in an Excel worksheet, documenting information such as MOOC name, provider (e.g. Edinburgh, Duke), platform (e.g. FutureLearn, Coursera), language of instruction, personal rating (1 to 5), and text-based review. A total of 17,612 entries were collected from learners who participated in 2717 courses delivered by 506 HEIs and organisations. The pool of entries was inspected, and duplicate entries and entries that were not meaningful were deleted. Only entries meeting the predetermined criteria (Table 1) were retained.
Table 1
Selection criteria for entries

| Attribute | Selection criterion | Rationale |
| --- | --- | --- |
| Cost | Free | The ‘open’ nature of MOOCs implies that the course materials should be freely available to the public |
| Discipline or field of study | All inclusive | This study is set to investigate positive learning experiences in MOOCs of all disciplines and fields of study |
| Rating of the learner | 4 stars and above | This study defines positive learning experiences as 4 stars and above on a five-star scale |
| Language of the learner feedback | English | Leximancer cannot process texts in certain languages (e.g. Japanese). The entries written in languages other than English account for about 5% of all entries |
| Date | Published online from 2013 to 2019 | Both old and new entries are useful for understanding the positive learning experience in MOOCs |
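For readers who wish to replicate the screening step, the criteria in Table 1 can be expressed as a simple programmatic filter. The sketch below is a minimal illustration in Python with pandas; the file name and column names are assumptions introduced for this example, since the authors' actual workflow relied on Octoparse exports managed in Excel and Word.

```python
# A minimal sketch of applying the Table 1 selection criteria.
# The CSV schema (columns 'course', 'review', 'cost', 'rating',
# 'language', 'year') is hypothetical; the study itself used
# Octoparse exports inspected in Excel and Word.
import pandas as pd

entries = pd.read_csv("class_central_entries.csv")  # hypothetical file name

# Remove duplicate and empty entries before screening
entries = entries.drop_duplicates(subset=["course", "review"])
entries = entries.dropna(subset=["review"])

selected = entries[
    (entries["cost"] == "Free")            # 'open' courses only
    & (entries["rating"] >= 4)             # positive experiences: 4-5 stars
    & (entries["language"] == "English")   # Leximancer language constraint
    & entries["year"].between(2013, 2019)  # publication window
]

# Simple corpus statistics of the kind reported in the following paragraph
lengths = selected["review"].str.len()
print(f"entries retained: {len(selected)}")
print(f"shortest: {lengths.min()} chars; longest: {lengths.max()} chars")
print(f"average: {lengths.mean():.0f} chars; total: {lengths.sum()}")
```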
The selected entries were edited to a standard format in Microsoft Word for further processing. The final text document used for data analysis contains 8475 reviews from individuals who participated in 1794 MOOCs delivered by 382 HEIs and organisations across more than 20 platforms. In the text document, the shortest entry contained 47 characters and the longest entry contained 12,316 characters. The average length of the selected entries was 389 characters. The total number of characters analysed in this study was 2,745,543. More information about the selected entries is displayed in Table 2.
Table 2
Information about the selected entries

| Category | Frequency | Percentage |
| --- | --- | --- |
| MOOC discipline or field of study | | |
|   Art and design | 377 | 4.4 |
|   Business and management | 898 | 10.6 |
|   Computer science | 1398 | 16.5 |
|   Data science and mathematics | 750 | 8.8 |
|   Education | 250 | 2.9 |
|   Engineering | 455 | 5.4 |
|   Health science | 736 | 8.7 |
|   Humanities | 1264 | 14.9 |
|   Professional development | 600 | 7.1 |
|   Science | 895 | 10.6 |
|   Social science | 852 | 10.1 |
| Language of instruction | | |
|   English | 8058 | 95.1 |
|   Non-English | 417 | 4.9 |
| MOOC platform | | |
|   Coursera | 3836 | 45.3 |
|   edX | 1778 | 21.0 |
|   FutureLearn | 1349 | 15.9 |
|   Other MOOC platforms | 1512 | 17.8 |
| Learner’s MOOC completion status | | |
|   Learners who had completed the MOOC | 7424 | 87.6 |
|   Learners who were taking the MOOC | 1051 | 12.4 |
| Learner’s rating of MOOCs | | |
|   4 stars | 1696 | 20.0 |
|   5 stars | 6779 | 80.0 |

Data analysis tool

The authors employed Leximancer Desktop 4.5 to explore and analyse the cleaned data. Leximancer is a data-mining tool that extracts key concepts from collections of textual documents based on the frequency with which concepts occur and co-occur, thereby providing an impartial view of open codes (e.g. themes, categories) and avoiding coder bias (Harwood, Gapp, & Stewart, 2015). Leximancer is widely acknowledged as a useful tool for data mining and has been previously applied to technology-enhanced learning research (e.g. Rodrigues, Almeida, Figueiredo, & Lopes, 2019). In Leximancer, concepts are defined as ‘collections of correlated words that encompass a central theme’ (Leximancer, 2018, p. 65). The extracted information is then visually displayed on an interactive thematic map. The map allows researchers to inspect the conceptual structure of a body of text, to examine the relationship between key concepts, and to explore the association between key concepts and original texts (Leximancer, 2018).
On the thematic map, concepts that co-occur frequently within the text tend to settle near one another and are clustered into themes. Themes are shown as coloured circles, with the most important themes appearing in warm colours such as red and beige. These latent themes are important topics extracted from learners’ reviews and carry the same meaning as the word ‘themes’ in Research Question 1. Therefore, each theme retrieved in Leximancer was inspected and interpreted in detail. The size of a concept dot reflects the connectivity of the concept: the larger the concept dot, the more frequently the concept co-occurred with other concepts on the map. When inspecting the thematic map, four components representing clues for interpretation can help researchers gain new insights: colour of themes, density of concepts within each theme, proximity of concepts and themes, and paths between concepts (Caspersz & Olaru, 2014). These four aspects were given priority in the interpretation of findings. Indicative quotes from MOOC participants were extracted from each theme to enrich the study results.
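Leximancer performs the concept extraction and co-occurrence computation internally, but the underlying idea can be illustrated with a short sketch. In the simplified version below, each concept is reduced to a single keyword and each coding unit to a plain string; Leximancer's actual algorithm instead learns a thesaurus of correlated words for every concept, so this is an approximation of the logic rather than the tool's implementation.

```python
# Illustrative concept co-occurrence counting over coding units.
# Simplification: each concept is a single keyword here, whereas
# Leximancer learns a thesaurus of correlated words per concept.
from collections import Counter
from itertools import combinations

CONCEPTS = {"learning", "basic", "life", "videos", "reading", "questions"}

def cooccurrence_counts(text_blocks):
    counts = Counter()
    for block in text_blocks:
        tokens = {t.strip(".,!?;:'\"") for t in block.lower().split()}
        present = tokens & CONCEPTS
        for pair in combinations(sorted(present), 2):
            counts[pair] += 1
    return counts

blocks = [
    "The videos and the reading materials complement each other.",
    "Learning the basic concepts changed my daily life.",
]
print(cooccurrence_counts(blocks))
# e.g. Counter({('reading', 'videos'): 1, ('basic', 'learning'): 1, ...})
```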

Processing model

In this study, four steps were taken to define and build a processing model that generated the final project output for analysis (Fig. 1). Step 1 was to upload the selected entries to Leximancer. An unsupervised machine learning approach was applied to allow key concepts and themes to emerge naturally from the data. In other words, the authors instructed Leximancer to automatically identify concept seeds instead of manually providing seed terms. Nevertheless, text processing rules and concept seed settings were manually adjusted to produce more relevant and meaningful results. The authors adopted two-sentence text blocks as coding units and paragraphs as context units. The authors also removed seed terms with weak semantic information (e.g. by, thus) from the text data. This configuration ensured that high-frequency, low-semantic words would not affect the seed selection and the subsequent machine learning of thesaurus concepts. The authors set the percentage of name-like concepts below 10% because this study aims to discover general concepts rather than specific terms, such as the name of a professor. The authors then proceeded to generate concept seeds.
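As an illustration of this configuration, the sketch below splits a review into two-sentence coding units and screens out low-semantic tokens when counting candidate seed terms. The regex sentence splitter and the stoplist are simplified stand-ins for Leximancer's internal preprocessing, not its actual implementation.

```python
# Sketch of the coding-unit configuration described above: two-sentence
# text blocks as coding units and removal of weak-semantics seed terms.
import re
from collections import Counter

LOW_SEMANTIC = {"by", "thus", "the", "a", "an", "of", "and", "is", "to"}

def two_sentence_blocks(review: str):
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", review.strip()) if s]
    return [" ".join(sentences[i:i + 2]) for i in range(0, len(sentences), 2)]

def candidate_seeds(blocks, top_n=10):
    tokens = Counter()
    for block in blocks:
        for tok in re.findall(r"[a-z']+", block.lower()):
            if tok not in LOW_SEMANTIC:
                tokens[tok] += 1
    return tokens.most_common(top_n)

review = "Great course. The videos are clear. I learned a lot. Highly recommended!"
blocks = two_sentence_blocks(review)
print(blocks)            # two coding units of two sentences each
print(candidate_seeds(blocks))
```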
After a list of concept seeds was extracted by Leximancer (Step 2), the authors inspected the results and made further adjustments to the seeds. As some of the seeds were similar to each other (e.g. topic and topics), the authors merged two or more similar concepts together (Appendix A). For example, the concept ‘professor’ represented the two seed terms ‘professor’ and ‘teacher’, which comprised the initial thesaurus for this concept. In addition, the authors removed three name-like concepts (e.g. Python) that are linked to specific academic disciplines or fields of study. The authors then proceeded to generate a thesaurus of terms associated with each concept identified in this phase.
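Conceptually, the merging and removal steps amount to a small mapping from seed terms onto canonical concepts, as sketched below. Only the topic/topics, professor/teacher, and Python examples come from the text; any further entries would be hypothetical and follow the same pattern.

```python
# Sketch of the seed-merging step: map variant seed terms onto one
# canonical concept. Only 'topics' -> 'topic' and 'teacher' -> 'professor'
# come from the paper; other entries would follow the same pattern.
SEED_MERGES = {
    "topics": "topic",
    "teacher": "professor",
}

# Name-like concepts tied to specific disciplines were removed entirely;
# 'Python' is the one example named in the paper.
REMOVED_SEEDS = {"python"}

def canonical_concept(seed: str):
    seed = seed.lower()
    if seed in REMOVED_SEEDS:
        return None  # excluded from the concept list
    return SEED_MERGES.get(seed, seed)

for s in ["Topics", "teacher", "Python", "videos"]:
    print(s, "->", canonical_concept(s))
```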
After the thesaurus generation was completed (Step 3), the authors proceeded to configure concepts for display on the thematic map. Upon close inspection, all the available concepts were meaningful and interpretable. Therefore, no concepts were suppressed during this process. Leximancer then tagged each text block with the names of the concepts it contained. The relationships between the surviving concepts were quantified and represented in the thematic map (Step 4). The authors adopted an iterative process of inspecting the results and modifying the processing model until a meaningful, stable thematic map was produced. At the same time, Leximancer also produced detailed logs for each concept. These logs displayed actual text excerpts with relevant concepts highlighted and provided valuable information that guided the authors through the text. However, Leximancer only provides a starting point for the analysis of a large amount of text data and cannot replace researchers’ judgement, inference, and interpretation (Haynes et al., 2019).

Data analysis

Leximancer was used to uncover important structures, clusters, and patterns in the textual data. The analysis was then interpreted to provide more meaningful insights. Key findings are presented based on the investigation of (1) the proximity of themes, (2) distribution of concepts within themes, and (3) important connections.
The proximity of themes was investigated by visually inspecting the distance between themes on the thematic map. If two themes were spatially close on the map and overlapped with each other (e.g. ‘video’ and ‘interesting’), they would be considered to have a close relationship. If two themes were apart from each other and did not overlap (e.g. ‘questions’ and ‘recommend’), their relationship would be considered relatively weak.
The distribution of concepts within themes was investigated by inspecting (1) the distance between concepts on the thematic map and (2) text co-occurrence counts of a concept with every other concept within the same theme. This analysis identified the scope and nature of each theme. Leximancer generated a synopsis for each theme containing illustrative examples. Two illustrative examples of excerpts for each theme were incorporated into the presentation of results to help readers better understand each theme.
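Although this study inspected proximity visually, the same reasoning could be approximated numerically if concept coordinates were exported from the thematic map. The sketch below computes Euclidean distances between theme centroids; the coordinate values are invented for illustration and do not come from the study's map.

```python
# Numerical approximation of the visual proximity check: distance
# between theme centroids on the thematic map. The coordinates below
# are hypothetical; the study itself inspected the map visually.
import math

# theme -> (x, y) positions of its concepts (illustrative values only)
theme_concepts = {
    "videos": [(0.2, 0.3), (0.25, 0.35), (0.3, 0.28)],
    "interesting": [(0.28, 0.4), (0.33, 0.38)],
    "recommend": [(0.8, 0.7), (0.85, 0.75)],
}

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def theme_distance(a, b):
    (ax, ay), (bx, by) = centroid(theme_concepts[a]), centroid(theme_concepts[b])
    return math.hypot(ax - bx, ay - by)

print(theme_distance("videos", "interesting"))  # small -> close relationship
print(theme_distance("videos", "recommend"))    # large -> weak relationship
```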
Research studies adopting Leximancer for text mining normally report the proximity of themes and the distribution of concepts within themes (e.g. Penela & Serrasqueiro, 2019). In this paper, however, important connections were also explored to provide further insights. Important connections were identified as pairs of concepts that (1) have a high likelihood of co-occurrence (equal to or greater than 10%), (2) are close to each other on the thematic map, and (3) provide insight into the positive learning experience. If a connection between two concepts did not meet one of these criteria, it was not considered an important connection. The 10% cut-off point was determined by the researchers upon inspection of the co-occurrence likelihood gaps for the concepts in all six themes. A more generous cut-off point would yield too many results, while stringent criteria would yield too few pairs of concepts, leaving the researchers little room for interpretation. Table 3 provides examples explaining why some pairs did not qualify as important connections.
Table 3
Examples of pairs of concepts not qualified as important connections

| Pair of concepts | High likelihood of co-occurrence (≥ 10%)? | Close distance on the thematic map? | Provides insight into the positive learning experience? |
| --- | --- | --- | --- |
| ‘interesting’, ‘makes’ | Yes, high likelihood (18%) | No, not close | Yes. Although this pair of concepts is a common expression (e.g. ‘makes economics interesting’, ‘makes this course so interesting’), the reasons that make a topic or course interesting are relevant to the content of this study |
| ‘interesting’, ‘useful’ | Yes, high likelihood (16%) | Yes, close | No. This pair of concepts is often combined to provide a general, positive course evaluation (e.g. ‘a very useful and interesting course’, ‘really interesting and I know that it will be really useful for me in the future’) |
| ‘interesting’, ‘excellent’ | No, low likelihood (0.9%) | No, not close | No. This pair of concepts is often combined to provide a general, positive course evaluation (e.g. ‘an excellent and very interesting course’, ‘the lectures are excellent, informative and interesting’) |
To identify important connections, the co-occurrence likelihood data generated by Leximancer were inspected and pairs of concepts showing a low likelihood of co-occurrence (< 10%) were eliminated. Following this, the thematic map was visually inspected and pairs of concepts that were not close on the thematic map were disregarded. Finally, logs containing text excerpts with pairs of concepts were reviewed to identify eight important connections that provided insight into the positive learning experience in MOOCs. To improve the transparency of the study, all the entries selected for data analysis will be deposited on the journal publisher’s website as a supplementary file for download and inspection. The following section presents the results; the implications of these results are discussed in a separate section.
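For concreteness, the first two screening criteria, which are mechanical, can be expressed as a simple filter, as sketched below; the third criterion, whether a pair provides insight into the positive learning experience, remains a manual reading of the logs. The likelihood values echo those reported in this paper, while the map distances and the distance cut-off are illustrative assumptions.

```python
# Sketch of the two mechanical screening criteria for important
# connections. The third criterion (interpretive insight) is a manual
# step. Likelihood values for the example pairs follow the paper;
# the map distances and DISTANCE_CUTOFF are illustrative assumptions.
LIKELIHOOD_CUTOFF = 0.10   # >= 10% co-occurrence likelihood
DISTANCE_CUTOFF = 0.15     # 'close' on the thematic map (assumed scale)

candidate_pairs = [
    # (concept_a, concept_b, co-occurrence likelihood, map distance)
    ("learning", "basic", 0.24, 0.08),
    ("interesting", "makes", 0.18, 0.40),       # fails the distance criterion
    ("interesting", "excellent", 0.009, 0.35),  # fails both criteria
]

shortlist = [
    (a, b) for a, b, likelihood, distance in candidate_pairs
    if likelihood >= LIKELIHOOD_CUTOFF and distance <= DISTANCE_CUTOFF
]
print(shortlist)  # pairs passed on to the manual insight check
```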

Results

The analysis of the text data yielded 67 concepts. Appendix B provides insights into the connectivity of these concepts and displays the text co-occurrence counts of each concept with every other concept. The 67 concepts are represented by grey nodes and are grouped into six important themes: ‘learning’, ‘understanding’, ‘interesting’, ‘videos’, ‘recommend’, and ‘questions’ (Fig. 2). Each theme comprises a cluster of concepts that share commonality, which can be observed from their proximity on the thematic map. Each theme is examined in greater detail in the remainder of this section.
The latent themes with the highest importance and number of hits are displayed in Table 4. Hits represent the number of text segments in the document associated with a theme, and significance indicates the importance of a theme relative to all other themes. The importance of a theme in the text segments is also indicated by the colour of the circles (Fig. 2). The analysis indicates that ‘learning’, ‘understanding’, ‘interesting’, and ‘videos’ are relatively more important themes, while ‘recommend’ and ‘questions’ are less important ones.
Table 4
Themes in the order of importance

| Theme | Significance | Hits | Coloured circle |
| --- | --- | --- | --- |
| Learning | 1 | 8004 | Red |
| Understanding | 2 | 7720 | Beige |
| Interesting | 3 | 6553 | Dark green |
| Videos | 4 | 6259 | Blue |
| Recommend | 5 | 5660 | Light green |
| Questions | 6 | 1264 | Purple |
‘Learning’ was identified as the most important theme, affirming that MOOCs are designed to enable and facilitate learning. This theme was mostly associated with concepts that describe the characteristics of knowledge learned in MOOCs (e.g. ‘basic’, ‘life’). The proximity of ‘learning’ to the theme ‘recommend’ indicates that individuals who reported the accumulation of knowledge in a MOOC were likely to recommend the MOOC. The excerpts below provide two illustrative examples related to this theme:
‘This course gave me a new experience. I had a great time taking it: learning the language formally from the basics and using the simple basics to build on something fun - the projects - they are games I used to play in childhood!’
‘I highly recommend this course…It was fun, it was not time-consuming, yet I still feel as though I learned a lot that will be applicable to my life.’
There are two important connections within the ‘learning’ theme that merit attention: ‘learning-basic’ and ‘learning-life’. The first connection is the learning-basic path. The likelihood of ‘learning’ being linked to ‘basic’ was 24%. This connection indicates that MOOCs afforded people the opportunity to learn and revisit basic knowledge. Specifically, learners who left positive reviews reported that MOOCs had helped them gain fundamental knowledge in a new subject area: ‘Where I didn't know a word of Spanish before, I was conversing with my colleagues from Mexico after three months of the class!’.
Learners also reported that MOOCs refreshed the basic knowledge learned previously: ‘I'm a pretty bad cook and took it for inspiration in learning to cook but I also found myself interested in relearning some basic chemistry that faded from my memory long ago.’ MOOCs that teach fundamental knowledge were not only useful to beginners but also advanced learners and experienced practitioners: ‘Although I consider myself as an experienced developer, I learned a lot of basic stuff I didn't know and I didn't learn at school’.
The second connection is the learning-life path. The likelihood of co-occurrence between ‘learning’ and ‘life’ was 20%. This connection highlights the importance of reinforcing the relationship between MOOC content and everyday life. Learners who gave positive reviews indicated that the MOOCs they were studying were designed in a way that showed direct and immediate relevance to daily life. For example: ‘The instructors have done a laudable job of teaching how to integrate learning into your everyday life, and the importance of doing so’, and ‘The course drew on examples from real life and provided a basis to apply it in real life’.
Learners also reported positive impacts from studying MOOCs. For example: ‘This course…reduced the amount of stress I was experiencing and confirmed that I should include some form of mindful meditation in my daily life’, and ‘It helped me in my personal, family, and work life to become a more centred person and more capable of thoroughly caring for my patients’.
The second important theme is ‘understanding’. The theme ‘understanding’ contains two layers of meaning: (a) instructors made the course content easier to understand (e.g. ‘understand’, ‘concepts’), and (b) learners made efforts to understand the complex, difficult course content (e.g. ‘time’, ‘week’). Two illustrative examples of excerpts connected to this theme are shown below:
‘The book which he provides for free is very useful to understand the concepts. Along with the assignments by Coursera, I also in parallel solved the problem sets from the book which led to a better understanding.’
‘Great course. Was a little more difficult than I was expecting. Had to re-watch videos multiple times to understand certain concepts.’
Two important connections within this theme, ‘understand-concepts’ and ‘time-week’, are explained in greater detail below. The first connection was the understand-concepts path. The likelihood of co-occurrence between ‘understand’ and ‘concepts’ was 22%. Learners highlighted how MOOC instructors had made the course content easier to understand. Examples of the strategies employed by MOOC instructors included ‘explaining concepts in reasonable length articles’, ‘explaining concepts with visual aids’, ‘revisiting concepts repeatedly throughout the course’, ‘providing real-world examples’, ‘providing additional examples and sample responses’, and ‘constantly being tested on concepts’.
The second prominent connection within the theme ‘understanding’ was the time-week path. The likelihood of co-occurrence between ‘time’ and ‘week’ was 17%. This connection shows the importance of investing plenty of time when studying MOOCs: ‘I spent closer to 18 weeks completing, but I took notes, reviewed before starting each section, and spent extra time on any quiz question that was difficult’.
However, the connection ‘time-week’ should not just be taken literally. This connection emphasises the importance of investing mental effort in MOOCs to understand more complex, difficult course content. Examples of strategies used by learners included ‘re-watching videos multiple times’, ‘pausing videos frequently to take longhand notes’, ‘going back to the resources’, and ‘working with difficult concepts often’. The following excerpt illustrates this point: ‘I spent a lot of time each week of the course…the amount of time I spent on a course also includes taking my own notes, making them digitally available online, doing revision to be sure I understood everything, looking up and learning about formulas and the why behind them, etc.’.
The third important theme was ‘interesting’. This theme was associated with concepts indicating positive affective responses towards a MOOC (e.g. ‘interesting’, ‘engaging’) and the sources of these responses (e.g. ‘professor’, ‘content’). The proximity of ‘interesting’ to the theme ‘recommend’ suggests that learners were likely to recommend a MOOC to other people if they perceived the MOOC to be interesting. Two illustrative examples of excerpts relevant to this theme are displayed below:
‘I enjoyed everything about this class. The professor made the subject fun to learn with wit and humor and always brought up interesting case studies.’
‘I enjoyed taking this class and looked forward to completing the exercises at the end of each chapter. I thought Dr. Chuck was a great teacher, and I found his personality really fun and engagingly silly, which I personally enjoy.’
The interesting-professor path within this theme is an important connection. The likelihood of co-occurrence between ‘interesting’ and ‘professor’ was 14%. This connection illustrates how instructors made a MOOC more interesting. Learners who left positive reviews highlighted the central role instructors played in making MOOCs interesting: ‘The teacher was very thorough in his explanations and made many good analogies and was really pleasant to listen to. He really made the course a lot more fun to follow than I had originally expected’.
Learners commented that MOOCs were entertaining because instructors designed the course content and materials to be interesting. Course content and materials frequently mentioned by participants include but are not limited to ‘choice of topics’, ‘examples’, ‘case studies’, ‘lecture videos’, ‘dialogues between the professor and students in the videos’, ‘assignments’, ‘games’, and ‘supplementary information and links’.
In addition, learners responded positively to communication styles and techniques adopted by MOOC instructors to add interest to the content using expressions such as ‘humorous’, ‘energetic’, ‘enthusiastic’, and ‘passionate’. The following excerpts illustrate how the use of humour made MOOCs interesting: ‘Professor Bates's occasional humour really helps liven up the dull, and sometimes dry material’, and ‘His quirky sense of humour and his abstractionist drawings often made me chuckle and refreshed my attention’.
‘Videos’ were identified as a fourth important theme. This theme incorporates concepts that describe the quality of video lectures (e.g. ‘clear’, ‘easy’) and course components relevant to video lectures (e.g. ‘reading’, ‘examples’). The proximity of the ‘videos’ theme to ‘understanding’, ‘interesting’, and ‘questions’ suggests that videos played an important role in helping learners understand difficult concepts, making MOOCs interesting, and addressing learners’ questions. Two illustrative examples of excerpts relevant to this theme are shown below:
‘Absolutely amazing course. The teachers are clear and easy to understand, the videos are high quality and contain great (and very useful) visual material to supplement the content.’
‘I appreciate the “flipped learning” style of the course, especially as the lecture videos are well filmed and content is explained clearly.’
Within this theme, the videos-reading path stood out as an important connection. The relative distance between ‘videos’ and ‘reading’ was close, and the likelihood of ‘reading’ being linked to ‘videos’ was 31%. This connection shows that video lectures and reading materials were valuable resources and often complemented each other to create a positive learning experience in MOOCs: ‘It progressively gets more difficult but allowed a lot of reading and watching videos around the subject which creates a sense of becoming an expert at it’, and ‘There is no fluff here. Every video and each reading add something of interest and value to the topics’.
Learners commented on how videos complemented and clarified the reading materials, and vice versa. Some learners also commented that the videos provided a useful summary of the reading materials: ‘The lectures are a re-cap of the readings, so serve as a kind of curation and reminder and helped to cement the learning’, and ‘I frequently felt lost when reading the articles, but the videos helped to some degree.’
The fifth important theme was ‘recommend’. This theme incorporates concepts indicating that learners would recommend a MOOC to other people (e.g. ‘recommend’, ‘courses’). The agglomeration of concepts in the ‘recommend’ theme was lower than for ‘learning’, ‘understanding’, ‘interesting’, and ‘videos’, indicating that ‘recommend’ was less complex and multifaceted than the four previously discussed themes. Two illustrative examples of excerpts related to this theme are shown below:
‘Great course and great teacher! I didn't know there were so many distinct perspectives on justice and the good life. I highly recommend it!’
‘The course is well structured; the videos are very good. I can really recommend this course.’
An important connection within this theme was the ‘recommend-anyone’ path. The proximity of these two concepts and the 30% likelihood of co-occurrence indicate a strong relationship in the text document. This connection explains why learners recommended a MOOC. Learners who recommended a MOOC often described their overall learning experiences as ‘good’, ‘great’, ‘excellent’, ‘outstanding’, and ‘perfect’.
Learners also provided specific, detailed reasons when recommending a MOOC. Some learners recommended a MOOC because the MOOC was ‘educational’, ‘informative’ and ‘insightful’. This type of recommendation often came from the learners who reported that they had accumulated new knowledge, mastered new skills, or gained new perspectives: ‘This course is very insightful…I practically learned to manage myself, set goals and follow them…I recommend every smart person to take it.’
Other learners recommended a MOOC because it was well-organised or because the selection of topics and materials was appropriate in scope. These learners described MOOCs as ‘structured’, ‘well-constructed’, ‘well-put together’, ‘comprehensive’, and ‘wide-ranging’. The following excerpt illustrates this point: ‘This course was really well-structured, with theory immediately supported by examples and personal witness reports. It tackled broad ideas with understandable examples drawn from world-wide situations…Without doubt the best MOOC, possibly the best course I have ever done.’
In addition, some learners recommended a MOOC because communities in the MOOC were described as ‘supportive’ and ‘helpful’: ‘Discussion groups were helpful and supportive with interaction between students and lecturers… I thoroughly enjoyed this course and would recommend it to anyone interested in permaculture.’ Another learner wrote: ‘I was greatly aided by the comments section in the course where you could bounce ideas off each other. I was honoured to exchange ideas with other much more experienced and helpful online students which made the whole experience really uplifting. I highly recommend this course to the layman and professional alike.’
The sixth theme was labelled ‘questions’. This theme incorporated a concept that indicates getting answers to questions (i.e. ‘questions’). The agglomeration of concepts in ‘questions’ is lower compared to the previously discussed themes, indicating that ‘questions’ represents a relatively simple structure. Two illustrative examples of excerpts pertinent to this theme are provided below:
‘The course is tough, but the instructor explains everything very clearly, and the course staff provides stellar help with questions on the discussion forums.’
‘I think taking questions and putting up videos of answers to questions from learners was an excellent idea and a useful learning tool.’
Within this theme, the questions-discussion path is an important connection. The likelihood of co-occurrence between ‘questions’ and ‘discussion’ was 11%, which is higher than the likelihood of the concept ‘questions’ being linked to any other concept on the thematic map. This connection shows that discussion boards are an important space to ask questions and get help from teaching staff: ‘The mentors…answered as many questions as they were able in the discussion forum and their comments were significant to the course.’
Learners also highlighted that teaching staff addressed their questions in a timely manner: ‘Questions on the discussion board are answered very quickly’, and ‘The educators…were ready to take part in the forum with further explanation and responded promptly to correct occasional inaccuracies.’
In summary, Table 5 displays the six important themes expressed by MOOC learners in their positive ratings and reviews. The table also highlights the characteristics of each theme, including the theme scope, key connections within the theme, and relationships with other themes.
Table 5
Six key themes and their characteristics

Theme 1: Learning
- Theme scope: ‘Learning’ incorporated the characteristics of knowledge learned in MOOCs (e.g. ‘basic’, ‘life’)
- Important connections: ‘Learning-basic’ (24% co-occurrence likelihood): MOOCs afforded people the opportunity to learn basic knowledge and revisit the basic knowledge they learned previously. ‘Learning-life’ (20% co-occurrence likelihood): MOOCs were designed in a way that showed direct and immediate relevance for life
- Relationships with other themes: ‘Learning’ is in proximity with the theme ‘recommend’

Theme 2: Understanding
- Theme scope: ‘Understanding’ incorporated strategies adopted to make course concepts easier to understand (e.g. ‘understand’, ‘concepts’) and mental efforts invested to understand the complex, difficult course content (e.g. ‘time’, ‘week’)
- Important connections: ‘Understand-concepts’ (22% co-occurrence likelihood): instructors made concepts more understandable and less abstract, and learners made an effort to understand complex, difficult concepts. ‘Time-week’ (17% co-occurrence likelihood): learners emphasised the importance of investing time and effort when studying MOOCs
- Relationships with other themes: ‘Understanding’ is in proximity with the themes ‘learning’ and ‘videos’

Theme 3: Interesting
- Theme scope: ‘Interesting’ incorporated learners’ emotional reactions towards a MOOC (e.g. ‘interesting’, ‘engaging’) and the sources that elicit positive emotions from learners (e.g. ‘professor’, ‘content’)
- Important connections: ‘Interesting-professor’ (14% co-occurrence likelihood): instructors played a central role in making MOOCs interesting
- Relationships with other themes: ‘Interesting’ is in proximity with the theme ‘videos’

Theme 4: Videos
- Theme scope: ‘Videos’ incorporated the quality of video lectures (e.g. ‘clear’, ‘easy’) and course components relevant to video lectures (e.g. ‘reading’, ‘examples’)
- Important connections: ‘Videos-reading’ (31% co-occurrence likelihood): video lectures and reading materials were valuable resources and often complemented each other to provide a positive learning experience in MOOCs
- Relationships with other themes: ‘Videos’ is in proximity with the themes ‘understanding’, ‘interesting’, and ‘questions’

Theme 5: Recommend
- Theme scope: ‘Recommend’ incorporated learners’ intentions to recommend a MOOC to other people (e.g. ‘recommend’, ‘courses’)
- Important connections: ‘Recommend-anyone’ (30% co-occurrence likelihood): this connection explains why learners recommended a MOOC
- Relationships with other themes: ‘Recommend’ is in proximity with the theme ‘learning’

Theme 6: Questions
- Theme scope: ‘Questions’ incorporated getting answers to questions (i.e. ‘questions’)
- Important connections: ‘Questions-discussion’ (11% co-occurrence likelihood): discussion boards were an important space to ask questions and get help from teaching staff
- Relationships with other themes: ‘Questions’ is in proximity with the theme ‘videos’

Discussion

This study has two research objectives. First, the study aims to identify the key themes contributing to the positive learning experience in MOOCs. Second, the study aims to investigate the characteristics of each identified theme. To achieve these objectives, this study employed a semiautomatic text mining solution and processed 8475 reviews submitted by learners who participated in 1794 courses delivered by 382 HEIs and organisations across more than 20 MOOC platforms. The major findings of this study are reported in this section and discussed in conjunction with the MOOC literature and the broader education literature. This study moves the field forward by making five important contributions.
Firstly, this study identified six important, frequently occurring themes related to the positive learning experience in MOOCs. The themes provide new insights into how the learning experience in MOOCs can be enhanced by addressing these six areas. ‘Learning’ emerged as the most salient theme. A closer inspection of the theme ‘learning’ shows that learners placed high importance on making connections between MOOC content and their everyday lives. In view of this, it is recommended that MOOC content should focus on more realistic learning contexts, and instructional conditions in MOOCs should be engineered to facilitate the acquisition of knowledge that can be readily transferred to real-world practices. For example, MOOC instructors could explicitly explain in videos how theories and findings of scientific research are relevant and can be directly applied to daily routines. This finding substantiates Kovanović et al.’s (2018) observation that the ability to describe applications of knowledge and apply learning in real settings is a critical dimension of the online learning experience. The findings also align with the pedagogical approach of authentic learning, which situates learning activities in real-world contexts and provides learning opportunities by ‘allowing students to experience the same challenges in the curriculum as they do in their daily endeavors’ (Herrington, Reeves, & Oliver, 2014, p. 402). Authentic learning has been operationalised in the higher education context (e.g. Herrington & Herrington, 2006) but its application has not yet been well researched in the MOOC context. An important research avenue is to identify and understand the strategies instructors use to make MOOC content more relevant to learners’ everyday lives. Future research could also adopt a grounded theory approach to provide insight into how MOOC learning can positively affect people’s lives. This line of research is important as the learning-life path emerged as a distinct, meaningful connection within the theme ‘learning’, which is one of the core themes contributing to the positive learning experience in MOOCs.
Secondly, this study identified that ‘understanding’ was a prominent theme linked to positive learning experiences in MOOCs. The analysis showed that instructors implemented a variety of strategies to assist MOOC learners with comprehension. Useful strategies reported by MOOC learners included providing short, concise supplementary materials, explaining concepts with visual aids and real-world examples, and repeatedly testing learners on concepts throughout a course. This study provides empirical evidence that improved comprehension is an important contributor to positive learning experiences in MOOCs. This is consistent with the notion that an individual’s appraisal of an event’s comprehensibility elicits interest (Silvia, 2008), and that interest in turn predicts the quality of learning experience (Schiefele & Csikszentmihalyi, 1994). As MOOCs evolve, researchers have proposed several guidelines for designing MOOCs. These guidelines are useful sources for reflecting on the complexity of the MOOC design. However, many of these design principles are based on descriptive and conceptual research (e.g. Spyropoulou, Pierrakeas, & Kameas, 2014) or personal experiences and opinions (e.g. Manallack & Yuriev, 2016). Future research could establish a clear relationship between specific MOOC design principles (e.g. assessments, pedagogies, technologies) and learner comprehension. This is a promising research direction as the current study empirically reveals that learner comprehension is an important contributing factor to the positive learning experience in MOOCs.
The ‘understanding’ theme also had a second layer of meaning that merits further discussion. The analysis demonstrated that learners exerted mental effort to comprehend complex, difficult course content. This included strategies such as re-watching the same video lecture and spending extra time to study MOOCs. This finding confirms the importance of the presence of cognitive engagement in the MOOC learning process. Cognitive engagement is interpreted as individuals’ mental investment in learning to comprehend complex ideas, master difficult skills, and strengthen performance (Deng et al., 2020a). The measurement of cognitive engagement in MOOCs is achieved through either platform-generated log files (Li & Baker, 2018) or self-administered surveys (Deng, Benckendorff, & Gannaway, 2020b). Li and Baker (2018), for example, found that the learner cohort with higher levels of cognitive engagement achieved higher grades in a MOOC, particularly for individuals who followed the learning path intended by instructors. More recently, Deng et al. (2020b) discovered that learners with higher levels of cognitive engagement were more likely to have completed a MOOC and earned a certificate. These empirical studies show that higher levels of cognitive engagement are associated with favourable MOOC learning outcomes, such as better academic performance and higher completion rates. The results of this study add to past findings that cognitive engagement is also associated with learners’ perceived quality of instruction in MOOCs. An important research direction is to explore learning designs (e.g. educational multimedia design, design of assessments) that bolster cognitive engagement without creating extraneous cognitive load for MOOC participants.
Thirdly, this study identified that ‘interesting’ is a prominent theme linked to positive learning experiences in MOOCs. This finding draws attention to emotional engagement in MOOCs. Deng et al. (2020a) conceptualised emotional engagement as individuals’ affective reactions to peers, instructors, and course content. Empirical research reveals that interest is a key affective component that underlies positive engagement within MOOCs (Deng et al., 2020a). In the MOOC literature, emotional engagement is often investigated by analysing participation in discussion boards (e.g. Comer, Baker, & Wang, 2015). However, only a small percentage of MOOC learners choose to participate in forum discussions. The emotional engagement of individuals who do not post on MOOC discussion boards should not be overlooked or misrepresented. This study adds to the discussion about emotional engagement beyond MOOC discussion boards. The findings showed that instructors played a central role in making MOOCs interesting to learn through (1) designing the course content and materials to be interesting and (2) employing communication styles and techniques to add interest. A promising avenue for future research is to explore instructional conditions that trigger and maintain MOOC learners’ interest. MOOC platforms can be designed to capture learners’ emotions at different points in time and store the longitudinal trajectories in databases for further analysis. These data would also provide valuable empirical evidence for developing just-in-time support and scaffolding strategies for potential at-risk learners and emotional disengagers.
Fourthly, this study identified that video lectures played a significant part in positive MOOC learning experiences. In this study, ‘videos’ is the only theme in the thematic map that represents instructional strategies; other instructional strategies, such as reading and discussion, appeared as concepts in the thematic map but did not evolve into larger themes. This finding confirms other observations that videos are perceived to be an important instructional tool for learning in MOOCs (Stöhr, Stathakarou, Mueller, Nifakos, & McGrath, 2019; Walji, Deacon, Small, & Czerniewicz, 2016; Watson, Kim, & Watson, 2016). A recent study showed that video lectures generated more mouse click events than any other learning activity on MOOC platforms (Hu, Zhang, Gao, & Wang, 2020). The importance of video lectures is also indicated by the proximity of the ‘videos’ theme to three other themes: ‘understanding’, ‘interesting’, and ‘questions’. The proximity of ‘videos’ to ‘understanding’ suggests that video lectures enhanced MOOC learners’ understanding of the course. This result is consistent with findings from broader education research showing that educational videos help students understand difficult concepts when compared with sessions in which videos are not used (Zubair & Laibinis, 2015).
The proximity of ‘videos’ to ‘interesting’ suggests that video lectures can amplify emotional engagement in MOOCs. This result substantiates the findings of previous studies. Chen, Wong, Teh, and Chuah (2017) showed that five types of MOOC videos were able to induce positive emotions: picture-in-picture, text overlay, Khan-style tablet capture, screencast, and animation. However, the study undertaken by Chen et al. (2017) had a relatively small sample size (n = 50) and did not consider learner-based factors (e.g. prior MOOC experience) or teaching context-based factors (e.g. disciplinary focus). Failure to control for these factors may misrepresent the effectiveness of certain video types and lead to flawed assumptions about the use of MOOC videos. Future research should take personal and environmental factors into consideration when investigating the impact of video and multimedia design on learners’ emotional engagement. Furthermore, emotional engagement in MOOCs is often captured by log files (Deng & Benckendorff, 2017) and self-reported measures such as surveys (Deng et al., 2020b). Physiological measures have the advantage of capturing emotional engagement at the activity level without disrupting students from learning (Henrie, Halverson, & Graham, 2015). An important research avenue is to analyse biometrics such as galvanic skin response, heart rate, and facial emotion (e.g. Ninaus et al., 2019). The results can be used to guide the design of MOOC videos that elicit positive emotions from learners.
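As a sketch of how such biometric data might feed back into video design, the example below aggregates facial-emotion detections into a positive-emotion rate per video; videos with low rates would be candidates for redesign. The detection records and the positive/negative label split are assumptions for illustration, not data from this study; real inputs would come from a facial-emotion detection model such as those discussed by Ninaus et al. (2019).

```python
# Aggregate per-frame emotion detections into a positive-emotion rate per
# video. The (video_id, emotion) records below are invented examples.
from collections import defaultdict

detections = [
    ('week1-intro', 'happy'), ('week1-intro', 'neutral'),
    ('week2-proofs', 'frustrated'), ('week2-proofs', 'confused'),
    ('week2-proofs', 'happy'),
]
POSITIVE = {'happy', 'surprised'}  # assumed mapping of labels to valence

totals = defaultdict(int)
positives = defaultdict(int)
for video, emotion in detections:
    totals[video] += 1
    positives[video] += emotion in POSITIVE  # bool counts as 0/1

for video in totals:
    print(f'{video}: {positives[video] / totals[video]:.0%} positive')
```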
Lastly, this study identified that providing a social space for people to ask questions and seek support contributed to positive learning experiences in MOOCs. In this study, the theme ‘questions’ is relevant to social engagement, a construct used to describe learner-instructor and learner-learner interactions (Deng et al., 2020a). Past MOOC research shows that responding to other learners’ questions and contributing to course discussions are indicative of social engagement (Deng et al., 2020a), and that the presence of teaching staff in discussion boards facilitates social engagement (Poquet, Jovanovic, & Dawson, 2020). The majority of MOOCs follow a cognitive-behaviourist pedagogical approach (Deng et al., 2020b), which has been criticised for limited social engagement and an excessive focus on content delivery (Agonács & Matos, 2017). As MOOC learners tend to be well-educated and highly motivated (Deng et al., 2019), some scholars claim that teaching and learning in MOOCs needs to shift from instructor-directed pedagogy to self-directed andragogy or self-determined heutagogy. Ntourmas, Avouris, Daskalaki, and Dimitriadis (2019), for example, claimed that teaching assistants (TAs) acted more as ‘omniscient interlocutors’ (p. 247), and that teaching staff providing direct, instant, and comprehensive answers did not promote problem-based learning. Connected to this point, the broader education literature also shows that two-way, dialogic interactions are more effective than unilateral instructor feedback for promoting higher-order learning outcomes (Tan, Whipp, Gagné, & Van Quaquebeke, 2019).
This study provides empirical evidence that MOOC learners valued the answers provided by teaching staff (e.g. TAs, forum moderators) and the immediacy of such support. In other words, learners seemed to be comfortable with taking a consumptive role in forum communication and perceived the teaching staff as the primary source of knowledge and information. This observation raises a series of important questions. Should MOOCs provide a more flexible structure to promote self-directedness in learning? If the goal is to advance self-determined learning and achieve higher levels of learner agency, how can instructors reconfigure the teaching and learning space? Kizilcec, Pérez-Sanagustín, and Maldonado (2016) have shown in their experimental study that simply embedding self-regulated learning study tips in MOOCs did not improve persistence. Moreover, most teaching staff moderate MOOC discussion boards on a pro bono basis. Although some people ‘felt uneasy about making postings that might seem too…directive’ (Beaven, Hauck, Comas-Quinn, Lewis, & de los Arcos, 2014, p. 35) and ‘[tried] to help learners reach the solution themselves’ (Ntourmas et al., 2019, p. 245), it is not clear whether all teaching staff have the capacity (e.g. knowledge, time) to scaffold the learning process and guide a massive number of learners along a path of inquiry. Future research should establish a working set of best practices for cultivating self-directed and self-determined skills in MOOCs.

Conclusions

This study has made two important theoretical contributions to the literature. Firstly, this study demonstrated the importance of MOOC learners in co-creating a positive learning experience. Past MOOC research has typically conceived of desirable learning outcomes as achieving higher grades, persisting longer, or obtaining a certificate of accomplishment. This study argues that there are underlying differences between MOOCs and credit-bearing university courses, and that learning outcomes valued in traditional HEIs (e.g. achievement, persistence) may not be the best indicators of MOOC learning outcomes. It further argues that perceived quality of instruction is an important alternative outcome indicator in MOOCs, and shows that the key themes contributing to positive learning experiences can be identified through an analysis of learners’ ratings and reviews. Secondly, this study identified six important themes associated with the positive learning experience in MOOCs: ‘learning’, ‘understanding’, ‘interesting’, ‘videos’, ‘recommend’, and ‘questions’. The study reported the characteristics of each identified theme based on an investigation of the proximity of themes, the distribution of concepts within themes, and important connections. The study discussed the findings in conjunction with prior theories and pedagogical practices to provide new insights into the conceptualisation of the positive learning experience in MOOCs. A number of future research avenues have been proposed based on the discussion of research findings and the existing literature.
A number of practical recommendations are evident for MOOC designers, administrators, and policy makers. The study demonstrates that positive MOOC learning experiences are complex and multifaceted. Practitioners should take into account the prominent themes identified in this study and the characteristics of each theme when designing or revamping a MOOC. Based on research findings, this study presents the following propositions for enhancing the learning experience in MOOCs: (1) provide realistic learning contexts and instructional conditions in MOOCs to facilitate the acquisition of knowledge that transfers more readily to real-world practices; (2) carefully design the instructional conditions so that some mental challenge and stimulation is required for learners to achieve a full understanding of the content, rather than making MOOCs too simple or effortless to complete; (3) design the course content, materials, and communications to generate interest; (4) allocate sufficient resources to create high-quality video lectures; (5) employ video lectures to elicit positive emotions from MOOC learners and simplify complex, difficult concepts; and (6) incorporate discussion boards in MOOCs and invest in human and digital resources to address learners’ queries.

Limitations

Three potential limitations should be kept in mind when interpreting the research findings. First, the importance of a theme is indicated by the connectedness of concepts within that theme. Some researchers may disagree with how ‘importance’ is defined in this study, and there are alternative ways to conceptualise it. It is possible that less connected concepts and themes also provide clues to the construction of the positive learning experience in MOOCs. Future research could employ a more qualitative approach to identify other latent themes that may contribute to this topic of interest.
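To make the ‘importance as connectedness’ definition concrete, the sketch below ranks themes by summing the co-occurrence counts of their concepts, using a subset of the Appendix B figures. The summing rule illustrates the idea only; it is not necessarily the exact algorithm of the text-mining software used in this study.

```python
# Rank themes by total concept co-occurrence (subset of Appendix B counts).
theme_concept_counts = {
    'learning':      {'learning': 3125, 'take': 1782, 'work': 1025},
    'understanding': {'understand': 1556, 'time': 1227, 'use': 1177},
    'videos':        {'videos': 1890, 'easy': 903, 'material': 887},
    'questions':     {'questions': 424},
}

connectedness = {theme: sum(counts.values())
                 for theme, counts in theme_concept_counts.items()}

for theme, score in sorted(connectedness.items(), key=lambda kv: -kv[1]):
    print(f'{theme:15s} {score}')
```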
Second, non-English reviews and negative reviews were excluded from this study. An analysis of non-English reviews may provide additional insight into learning experiences idiosyncratic to these learners (e.g. individuals who use non-English MOOC platforms, individuals whose primary language is not English). Future research could overcome this limitation by analysing non-English reviews to provide a more comprehensive understanding of the positive learning experience in MOOCs. While the scope of this study was focussed on positive reviews, it would be equally interesting to conduct a similar analysis of negative reviews to identify attributes that cause dissatisfaction amongst learners.
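As a starting point for such follow-up work, the sketch below separates reviews by detected language. It assumes the third-party langdetect package (pip install langdetect); the review texts are invented examples.

```python
# Split a review corpus into English and non-English subsets.
from langdetect import detect, DetectorFactory

DetectorFactory.seed = 0  # make language detection deterministic

reviews = [
    "Excellent course, the videos were very clear.",
    "Curso excelente, los videos fueron muy claros.",
    "!!!",  # too short to classify: detection raises an exception
]

english, other = [], []
for text in reviews:
    try:
        (english if detect(text) == 'en' else other).append(text)
    except Exception:  # langdetect fails on texts with no letters
        other.append(text)

print(len(english), 'English;', len(other), 'other or undetected')
```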
Third, learner characteristics and environmental factors were not controlled in this study. It is possible that learners belonging to different cohorts (e.g. ethnic groups, generational cohorts) perceive the positive learning experience differently. It is also possible that individuals enrolled in social science MOOCs perceive the positive learning experience differently from people studying STEM MOOCs. Controlling for discipline may therefore provide additional insight into the design of pedagogy and assessment for MOOCs in a given discipline. Future research could control for these factors to provide a more nuanced understanding of the positive MOOC learning experience.

Supplementary Information

The online version contains supplementary material available at https://doi.org/10.1186/s41239-021-00244-3.

Acknowledgements

The authors would like to thank the anonymous reviewers for their constructive feedback.

Competing interests

The authors declare that they have no competing interests.
Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Appendix

Appendix A

See Table 6
Table 6 Retained and merged concepts

Retained concept | Merged concepts
courses | Course/courses/MOOC/MOOCs
learning | Learn/learned/learning
makes | Makes/making
students | Student/students
professor | Professor/teacher
take | Take/taken/taking
topics | Topic/topics
understand | Understand/understanding
use | Use/used/using
videos | Video/lectures/videos
week | Week/weeks
work | Work/working
interesting | Fun/interesting/interested
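To make the merging rule in Table 6 concrete, the sketch below collapses word variants onto their retained concept before counting. The mapping mirrors a few rows of Table 6; the regex tokeniser and lowercase normalisation are simplifications of what dedicated text-mining software actually does.

```python
# Collapse word variants onto retained concepts (a few rows of Table 6).
import re

MERGE_MAP = {
    'course': 'courses', 'mooc': 'courses', 'moocs': 'courses',
    'learn': 'learning', 'learned': 'learning',
    'video': 'videos', 'lectures': 'videos',
    'fun': 'interesting', 'interested': 'interesting',
    'teacher': 'professor',
}

def normalise(token: str) -> str:
    token = token.lower()
    return MERGE_MAP.get(token, token)

review = "I learned a lot from the video lectures - a fun MOOC!"
tokens = [normalise(t) for t in re.findall(r'[A-Za-z]+', review)]
print(tokens)
# ['i', 'learning', 'a', 'lot', 'from', 'the', 'videos', 'videos',
#  'a', 'interesting', 'courses']
```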

Appendix B

See Table 7
Table 7 Text co-occurrence counts of a concept with every other concept on the thematic map

Theme 1 (learning): learning (3125), take (1782), work (1025), knowledge (883), people (757), experience (691), life (591), better (574), things (538), world (497), online (468), basic (467), job (353), data (346), real (342), research (330), science (317), doing (294), team (280)
Theme 2 (understanding): understand (1556), time (1227), use (1177), students (1033), week (810), concepts (771), assignments (610), makes (561), different (518), need (448), feel (393), language (377), helped (363), problems (302), simple (256), English (171)
Theme 3 (interesting): interesting (1782), excellent (1099), topics (915), professor (739), useful (734), information (709), enjoyed (680), content (618), subject (611), introduction (605), level (397), engaging (394), thought (385), teaching (363), important (347), background (343)
Theme 4 (videos): videos (1890), easy (903), material (887), clear (559), examples (502), presented (445), reading (349), discussion (307)
Theme 5 (recommend): courses (1708), recommend (1614), class (1055), best (772), anyone (580), interested (548), approach (255)
Theme 6 (questions): questions (424)
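For readers who wish to reproduce counts of this kind on their own data, the sketch below tallies how often two concepts appear in the same review. Counting within whole reviews is a simplification; dedicated text-mining software typically counts co-occurrence within smaller sliding text blocks, so published figures will differ. The review texts are invented examples.

```python
# Count pairwise concept co-occurrence within reviews.
from itertools import combinations
from collections import Counter
import re

reviews = [
    "Great videos, very clear examples.",
    "The videos made difficult material easy to understand.",
    "Clear videos and easy assignments.",
]

cooccurrence = Counter()
for review in reviews:
    concepts = sorted(set(re.findall(r'[a-z]+', review.lower())))
    for a, b in combinations(concepts, 2):
        cooccurrence[(a, b)] += 1

for (a, b), n in cooccurrence.most_common(5):
    print(f'{a} + {b}: {n}')
```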


Literature
Agonács, N., & Matos, J. F. (2017). Towards a heutagogy-based MOOC design framework. Paper presented at the European MOOCs Stakeholders Summit, Madrid.
Alhija, F. N.-A., & Fresko, B. (2009). Student evaluation of instruction: What can be learned from students’ written comments? Studies in Educational Evaluation, 35(1), 37–44.
Alves, H., & Raposo, M. (2007). Conceptual model of student satisfaction in higher education. Total Quality Management & Business Excellence, 18(5), 571–588.
Apodaca, P., & Grad, H. (2005). The dimensionality of student ratings of teaching: Integration of uni- and multidimensional models. Studies in Higher Education, 30(6), 723–748.
Beaven, T., Hauck, M., Comas-Quinn, A., Lewis, T., & de los Arcos, B. (2014). MOOCs: Striking the right balance between facilitation and self-determination. MERLOT Journal of Online Learning and Teaching, 10(1), 31–43.
Bedggood, R. E., & Donovan, J. D. (2012). University performance evaluations: What are we really measuring? Studies in Higher Education, 37(7), 825–842.
Beran, T., & Violato, C. (2005). Ratings of university teacher instruction: How much do student and course characteristics really matter? Assessment & Evaluation in Higher Education, 30(6), 593–601.
Berbegal-Mirabent, J., Mas-Machuca, M., & Marimon, F. (2018). Is research mediating the relationship between teaching experience and student satisfaction? Studies in Higher Education, 43(6), 973–988.
Betz, E. L., Klingensmith, J. E., & Menne, J. W. (1970). The measurement and analysis of college student satisfaction. Measurement and Evaluation in Guidance, 3(2), 110–118.
Bovill, C. (2020). Co-creation in learning and teaching: The case for a whole-class approach in higher education. Higher Education, 79(6), 1023–1037.
Bovill, C., Cook-Sather, A., & Felten, P. (2011). Students as co-creators of teaching approaches, course design, and curricula: Implications for academic developers. International Journal for Academic Development, 16(2), 133–145.
Braskamp, L. A., Ory, J. C., & Pieper, D. M. (1981). Student written comments: Dimensions of instructional quality. Journal of Educational Psychology, 73(1), 65–70.
Bunce, L., Baird, A., & Jones, S. E. (2017). The student-as-consumer approach in higher education and its effects on academic performance. Studies in Higher Education, 42(11), 1958–1978.
Burdsal, C. A., & Harrison, P. D. (2008). Further evidence supporting the validity of both a multidimensional profile and an overall evaluation of teaching effectiveness. Assessment & Evaluation in Higher Education, 33(5), 567–576.
Carlozzi, M. (2018). Rate my attitude: Research agendas and RateMyProfessor scores. Assessment & Evaluation in Higher Education, 43(3), 359–368.
Cashin, W. E., & Downey, R. G. (1992). Using global student rating items for summative evaluation. Journal of Educational Psychology, 84(4), 563–572.
Caspersz, D., & Olaru, D. (2014). Developing ‘emancipatory interest’: Learning to create social change. Higher Education Research & Development, 33(2), 226–241.
Chen, C. J., Wong, V. S., Teh, C. S., & Chuah, K. M. (2017). MOOC videos-derived emotions. Journal of Telecommunication, Electronic and Computer Engineering, 9(2–9), 137–140.
Chen, Y., & Hoshower, L. B. (2003). Student evaluation of teaching effectiveness: An assessment of student perception and motivation. Assessment & Evaluation in Higher Education, 28(1), 71–88.
Clayson, D. E., & Haley, D. A. (2005). Marketing models in education: Students as customers, products, or partners. Marketing Education Review, 15(1), 1–10.
Cohen, P. A. (1980). Effectiveness of student-rating feedback for improving college instruction: A meta-analysis of findings. Research in Higher Education, 13(4), 321–341.
Comer, D. K., Baker, R., & Wang, Y. (2015). Negativity in Massive Online Open Courses: Impacts on learning and teaching and how instructional teams may be able to address it. InSight: A Journal of Scholarly Teaching, 10, 92–113.
Deng, R., & Benckendorff, P. (2017). A contemporary review of research methods adopted to understand students' and instructors' use of Massive Open Online Courses (MOOCs). International Journal of Information and Education Technology, 7(8), 601–607.
Deng, R., Benckendorff, P., & Gannaway, D. (2017). Understanding learning and teaching in MOOCs from the perspectives of students and instructors: A review of literature from 2014 to 2016. Lecture Notes in Computer Science, 10254, 176–181.
Deng, R., Benckendorff, P., & Gannaway, D. (2019). Progress and new directions for teaching and learning in MOOCs. Computers & Education, 129, 48–60.
Deng, R., Benckendorff, P., & Gannaway, D. (2020a). Learner engagement in MOOCs: Scale development and validation. British Journal of Educational Technology, 51(1), 245–262.
Deng, R., Benckendorff, P., & Gannaway, D. (2020b). Linking learner factors, teaching context, and engagement patterns with MOOC learning outcomes. Journal of Computer Assisted Learning, 36(5), 688–708.
Denson, N., Loveday, T., & Dalton, H. (2010). Student evaluation of courses: What predicts satisfaction? Higher Education Research & Development, 29(4), 339–356.
Elliott, K. M., & Shin, D. (2002). Student satisfaction: An alternative approach to assessing this important concept. Journal of Higher Education Policy and Management, 24(2), 197–209.
Engle, D., Mankoff, C., & Carbrey, J. (2015). Coursera’s introductory human physiology course: Factors that characterize successful completion of a MOOC. International Review of Research in Open and Distance Learning, 16(2), 46–68.
Entwistle, N. (2018). Predicting academic performance at university. In N. Entwistle (Ed.), Student learning and academic understanding (pp. 29–43). San Diego: Academic Press.
Erikson, M., Erikson, M. G., & Punzi, E. (2016). Student responses to a reflexive course evaluation. Reflective Practice, 17(6), 663–675.
Evans, B., & Baker, R. (2016). MOOCs and persistence: Definitions and predictors. New Directions for Institutional Research, 2015(167), 69–85.
Gannaway, D., Green, T., & Mertova, P. (2018). So how big is big? Investigating the impact of class size on ratings in student evaluation. Assessment & Evaluation in Higher Education, 43(2), 175–184.
Ginns, P., Prosser, M., & Barrie, S. (2007). Students’ perceptions of teaching quality in higher education: The perspective of currently enrolled students. Studies in Higher Education, 32(5), 603–615.
Golding, C., & Adam, L. (2016). Evaluate to improve: Useful approaches to student evaluation. Assessment & Evaluation in Higher Education, 41(1), 1–14.
Grebennikov, L., & Shah, M. (2013). Student voice: Using qualitative feedback from students to enhance their university experience. Teaching in Higher Education, 18(6), 606–618.
Gutiérrez-Santiuste, E., Gámiz-Sánchez, V.-M., & Gutiérrez-Pérez, J. (2015). MOOC & b-learning: Students’ barriers and satisfaction in formal and non-formal learning environments. Journal of Interactive Online Learning, 13(3), 88–111.
Han, H., Kiatkawsin, K., Kim, W., & Hong, J. H. (2018). Physical classroom environment and student satisfaction with courses. Assessment & Evaluation in Higher Education, 43(1), 110–125.
Harnash-Glezer, M., & Meyer, J. (1991). Dimensions of satisfaction with college education. Assessment & Evaluation in Higher Education, 16(2), 95–107.
Harwood, I. A., Gapp, R. P., & Stewart, H. J. (2015). Cross-check for completeness: Exploring a novel use of Leximancer in a grounded theory study. The Qualitative Report, 20(7), 1029–1045.
Hassel, H., & Lourey, J. (2005). The dea(r)th of student responsibility. College Teaching, 53(1), 2–13.
Haynes, E., Garside, R., Green, J., Kelly, M. P., Thomas, J., & Guell, C. (2019). Semiautomated text analytics for qualitative data synthesis. Research Synthesis Methods, 10(3), 452–464.
Hendry, A. M. (1983). Measuring adult student satisfaction: A model. Canadian Vocational Journal, 19(1), 47–50.
Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education, 90, 36–53.
Herrington, A., & Herrington, J. (2006). Authentic learning environments in higher education. Hershey, PA: Information Science Publishing.
Herrington, J., Reeves, T. C., & Oliver, R. (2014). Authentic learning environments. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 401–412). New York, NY: Springer.
Hew, K. F. (2015). Promoting engagement in online courses: What strategies can we learn from three highly rated MOOCS. British Journal of Educational Technology, 47(2), 320–341.
Hone, K. S., & El Said, G. R. (2016). Exploring the factors affecting MOOC retention: A survey study. Computers & Education, 98, 157–168.
Hu, H., Zhang, G., Gao, W., & Wang, M. (2020). Big data analytics for MOOC video watching behavior based on Spark. Neural Computing and Applications, 32(11), 6481–6489.
James, L. T., & Casidy, R. (2018). Authentic assessment in business education: Its effects on student satisfaction and promoting behaviour. Studies in Higher Education, 43(3), 401–415.
Johnson, V. E. (2003). Grade inflation: A crisis in college education. New York, NY: Springer.
Joo, Y. J., So, H.-J., & Kim, N. H. (2018). Examination of relationships among students’ self-determination, technology acceptance, satisfaction, and continuance intention to use K-MOOCs. Computers & Education, 122, 260–272.
Jung, E., Kim, D., Yoon, M., Park, S., & Oakley, B. (2019). The influence of instructional design on learner control, sense of achievement, and perceived effectiveness in a supersize MOOC course. Computers & Education, 128, 377–388.
Khalil, H., & Ebner, M. (2013). “How satisfied are you with your MOOC?” - A research study on interaction in huge online courses. In J. Herrington, A. Couros, & V. Irvine (Eds.), Proceedings of EdMedia 2013 - World Conference on Educational Media and Technology (pp. 830–839). Victoria, Canada: Association for the Advancement of Computing in Education.
Kizilcec, R. F., Pérez-Sanagustín, M., & Maldonado, J. J. (2016). Recommending self-regulated learning strategies does not improve performance in a MOOC. Paper presented at the Proceedings of the Third (2016) ACM Conference on Learning @ Scale, Edinburgh, Scotland, UK.
Kovanović, V., Joksimović, S., Poquet, O., Hennis, T., Čukić, I., de Vries, P., & Gašević, D. (2018). Exploring communities of inquiry in Massive Open Online Courses. Computers & Education, 119, 44–58.
Kozinets, R. V. (2009). Netnography: Doing ethnographic research online. London: SAGE.
Krehbiel, T. C., McClure, R. H., & Pratsini, E. (1997). Using student disconfirmation as a measure of classroom effectiveness. Journal of Education for Business, 72(4), 224–229.
Landrum, R. E., Hood, J. T. A., & McAdams, J. M. (2001). Satisfaction with college by traditional and nontraditional college students. Psychological Reports, 89(3), 740–746.
Li, Q., & Baker, R. (2018). The different relationships between engagement and outcomes across participant subgroups in Massive Open Online Courses. Computers & Education, 127, 41–65.
Lizzio, A., Wilson, K., & Simons, R. (2002). University students’ perceptions of the learning environment and academic outcomes: Implications for theory and practice. Studies in Higher Education, 27(1), 27–52.
MacNell, L., Driscoll, A., & Hunt, A. N. (2015). What’s in a name: Exposing gender bias in student ratings of teaching. Innovative Higher Education, 40(4), 291–303.
Manallack, D. T., & Yuriev, E. (2016). Ten simple rules for developing a MOOC. PLOS Computational Biology, 12(10).
McCulloch, A. (2009). The student as co-producer: Learning from public administration about the student-university relationship. Studies in Higher Education, 34(2), 171–183.
Meek, S. E. M., Blakemore, L., & Marks, L. (2016). Is peer review an appropriate form of assessment in a MOOC? Student participation and performance in formative peer review. Assessment & Evaluation in Higher Education, 12, 1–14.
Mikulić, J., Dužević, I., & Baković, T. (2015). Exploring drivers of student satisfaction and dissatisfaction: An assessment of impact-asymmetry and impact-range. Total Quality Management & Business Excellence, 26(11–12), 1213–1225.
Moskal, P., Thompson, K., & Futch, L. (2015). Enrollment, engagement, and satisfaction in the BlendKit faculty development open, online course. Online Learning, 19(4), 1–12.
Nasser, F., & Hagtvet, K. A. (2006). Multilevel analysis of the effects of student and instructor/course characteristics on student ratings. Research in Higher Education, 47(5), 559–590.
Nasser-Abu Alhija, F., & Fresko, B. (2018). Graduate teaching assistants: How well do their students think they do? Assessment & Evaluation in Higher Education, 25, 1–12.
Ninaus, M., Greipl, S., Kiili, K., Lindstedt, A., Huber, S., Klein, E., & Moeller, K. (2019). Increased emotional engagement in game-based learning—a machine learning approach on facial emotion detection data. Computers & Education, 142, 1–10.
Ntourmas, A., Avouris, N., Daskalaki, S., & Dimitriadis, Y. (2019). Teaching assistants in MOOCs forums: Omnipresent interlocutors or knowledge facilitators. Paper presented at the 14th European Conference on Technology Enhanced Learning, Cham.
Olssen, M., & Peters, M. A. (2005). Neoliberalism, higher education and the knowledge economy: From the free market to knowledge capitalism. Journal of Education Policy, 20(3), 313–345.
Ory, J. C., Braskamp, L. A., & Pieper, D. M. (1980). Congruency of student evaluative information collected by three methods. Journal of Educational Psychology, 72(2), 181–185.
Penela, D., & Serrasqueiro, R. M. (2019). Identification of risk factors in the hospitality industry: Evidence from risk factor disclosure. Tourism Management Perspectives, 32, 1–10.
Phan, T., McNeil, S. G., & Robin, B. R. (2016). Students’ patterns of engagement and course performance in a Massive Open Online Course. Computers & Education, 95, 36–44.
Poquet, O., Jovanovic, J., & Dawson, S. (2020). Differences in forum communication of residents and visitors in MOOCS. Computers & Education, 103937.
Richardson, J. T. E., Slater, J. B., & Wilson, J. (2007). The National Student Survey: Development, findings and implications. Studies in Higher Education, 32(5), 557–580.
Riina, K., & Petri, N. (2015). The student-customer orientation questionnaire (SCOQ): Application of customer metaphor to higher education. International Journal of Educational Management, 29(1), 115–138.
Rodrigues, H., Almeida, F., Figueiredo, V., & Lopes, S. L. (2019). Tracking e-learning through published papers: A systematic review. Computers & Education, 136, 87–98.
Schiefele, U., & Csikszentmihalyi, M. (1994). Interest and the quality of experience in classrooms. European Journal of Psychology of Education, 9(3), 251–269.
Silvia, P. J. (2008). Interest—The curious emotion. Current Directions in Psychological Science, 17(1), 57–60.
Spooren, P. (2010). On the credibility of the judge: A cross-classified multilevel analysis on students’ evaluation of teaching. Studies in Educational Evaluation, 36(4), 121–131.
Spooren, P., Brockx, B., & Mortelmans, D. (2013). On the validity of student evaluation of teaching: The state of the art. Review of Educational Research, 83(4), 598–642.
Spyropoulou, N., Pierrakeas, C., & Kameas, A. (2014). Creating MOOC guidelines based on best practices. Paper presented at the 6th International Conference on Education and New Learning Technologies, Barcelona.
Stewart, B., Speldewinde, P., & Ford, B. (2018). Influence of improved teaching practices on student satisfaction ratings for two undergraduate units at an Australian university. Assessment & Evaluation in Higher Education, 43(4), 598–611.
Steyn, C., Davies, C., & Sambo, A. (2019). Eliciting student feedback for course development: The application of a qualitative course evaluation tool among business research students. Assessment & Evaluation in Higher Education, 44(1), 11–24.
Stich, A. E., & Reeves, T. D. (2017). Massive open online courses and underserved students in the United States. The Internet and Higher Education, 32, 58–71.
Stöhr, C., Stathakarou, N., Mueller, F., Nifakos, S., & McGrath, C. (2019). Videos as learning objects in MOOCs: A study of specialist and non-specialist participants’ video activity in MOOCs. British Journal of Educational Technology, 50(1), 166–176.
Stupans, I., McGuren, T., & Babey, A. M. (2016). Student evaluation of teaching: A study exploring student rating instrument free-form text comments. Innovative Higher Education, 41(1), 33–42.
Tan, F. D. H., Whipp, P. R., Gagné, M., & Van Quaquebeke, N. (2019). Students’ perception of teachers’ two-way feedback interactions that impact learning. Social Psychology of Education, 22(1), 169–187.
Walji, S., Deacon, A., Small, J., & Czerniewicz, L. (2016). Learning through engagement: MOOCs as an emergent form of provision. Distance Education, 37(2), 208–223.
Walker, D. J., & Palmer, E. (2011). The relationship between student understanding, satisfaction and performance in an Australian engineering programme. Assessment & Evaluation in Higher Education, 36(2), 157–170.
Watson, W. R., Kim, W., & Watson, S. L. (2016). Learning outcomes of a MOOC designed for attitudinal change: A case study of an Animal Behavior and Welfare MOOC. Computers & Education, 96, 83–93.
Worthington, A. C. (2002). The impact of student perceptions and characteristics on teaching evaluations: A case study in finance education. Assessment & Evaluation in Higher Education, 27(1), 49–64.
Yousef, A., Wahid, U., Chatti, M., Schroeder, U., & Wosnitza, M. (2015). The effect of peer assessment rubrics on learners' satisfaction and performance within a blended MOOC environment. Proceedings of the 7th International Conference on Computer Supported Education, 2, 148–159.
Zafra-Gómez, J. L., Román-Martínez, I., & Gómez-Miranda, M. E. (2015). Measuring the impact of inquiry-based learning on outcomes and student satisfaction. Assessment & Evaluation in Higher Education, 40(8), 1050–1069.