
Open Access 01-12-2019 | Research article

Online module login data as a proxy measure of student engagement: the case of myUnisa, MoyaMA, Flipgrid, and Gephi at an ODeL institution in South Africa

Authors: Chaka Chaka, Tlatso Nkhobo

Published in: International Journal of Educational Technology in Higher Education | Issue 1/2019


Abstract

The current study employed online module login data harvested from three tools, myUnisa, MoyaMA and Flipgrid, to determine how such data served as a proxy measure of student engagement. The first tool is the legacy learning management system (LMS) used for online learning at the University of South Africa (UNISA), while the other two are a mobile messaging application and an educational video discussion platform, respectively. In this regard, the study set out to investigate the manner in which module login data of undergraduate students (n = 3475 & n = 2954) and of a cohort of Mathew Goniwe students (n = 27) enrolled for a second-level module, ENG2601, as extracted from myUnisa, MoyaMA, and Flipgrid, served as a proxy measure of student engagement. Collectively, these students were registered for this second-level module at UNISA at the time the study was conducted. The online login data comprised myUnisa module login file access frequencies. In addition, they consisted of the frequencies of instant messages (IMs) posted on MoyaMA by both the facilitator and Mathew Goniwe students, and of video clips posted on, and video clip view frequencies captured by, Flipgrid in respect of the afore-cited module. One finding of this study is that student engagement as measured by login file access frequencies was disproportionately skewed toward one module file relative to other module files. The other finding is that the overall module file access metrics of the Mathew Goniwe group were disproportionately concentrated in a sub-cohort of highly active users (HAU).

Introduction

There has been a growing number of research studies on student engagement as it applies to online and blended learning, especially at higher education institutions (HEIs), by scholars from different regions across the globe. Among scholars who have conducted studies in this area are Beer, Clark, and Jones (2010); Bodily, Graham, and Bush (2017); Boulton, Kent, and Williams (2018); Chen, Chang, Ouyang, and Zhou (2018); Henrie, Bodily, Manwaring, and Graham (2015); Henrie, Halverson, and Graham (2015); Henrie, Bodily, Larsen, and Graham (2018); Holmes (2018); Hussain, Zhu, Zhang, and Abidi (2018); Liu, Froissard, Richards, and Atif (2015); Shelton, Hung, and Lowenthal (2017); Sheringham, Lyon, Jones, Strobl, and Barratt (2016); Sinclair, Butler, Morgan, and Kalvala (2015); and Vogt (2016). Some of these studies have been conducted in online learning environments (OLEs) (Boulton et al., 2018; Chen, Lambert, & Guidry, 2010; Dixson, 2010; Henrie et al., 2018; Herrington, Oliver, & Reeves, 2003; Kahn, Everington, Kelm, Reid, & Watkins, 2017; Martin & Bolliger, 2018; Robinson & Hullinger, 2008; Shelton et al., 2017; Vogt, 2016). By contrast, others have been carried out in blended learning contexts (Conijn, Snijders, Kleingeld, & Matzat, 2017; Delialioğlu, 2012; Halverson, 2016; Henrie, Bodily, et al., 2015; Manwaring, Larsen, Graham, Henrie, & Halverson, 2017; Vaughan, 2014).
In the case of online and blended learning environments, learning management systems (LMSs) have served as anchor points around which to determine the presence or lack of student engagement. In this case, student LMS log data traces are some of the variables employed to measure student engagement (Agudo-Peregrina, Iglesias-Pradas, Conde-González, & Hernández-García, 2014; Beer et al., 2010; Berman & Artino Jr., 2018; Chen et al., 2018; Cocea & Weibelzahl, 2011; Dixson, 2015; Henrie et al., 2018; Henrie, Bodily, et al., 2015; Henrie, Halverson, & Graham, 2015; Holmes, 2018; Liu et al., 2015; Macfadyen & Dawson, 2010; Mogus, Djurdjevic, & Suvak, 2012; Park, 2015; Strang, 2016; Vogt, 2016). In certain instances, student self-reports in the form of surveys are utilised as instruments to measure student engagement metrics. Examples of such student engagement surveys are the National Survey of Student Engagement (NSSE), the Student Course Engagement Questionnaire (SCEQ), and the Online Student Engagement Scale (OSE) (Dixson, 2015; see Bodily et al., 2017; Boulton et al., 2018; Henrie, Bodily, et al., 2015; Henrie et al., 2018; Lu et al., 2017). It is worth mentioning that elsewhere Fredricks et al. (2011) list 21 student engagement instruments used in respect of upper elementary to high schools.
Currently, there seems to be no consensus as to the common definition of student engagement (see Azvedo, 2015; Beer et al., 2010; Bodily et al., 2017; Christenson, Reschly, & Wylie, 2012; Dixson, 2015; Henrie et al., 2018; Henrie, Bodily, et al., 2015; Reschly & Christenson, 2012). Nonetheless, it is often accepted that in OLEs, student engagement with online courses is one of the critical indicators of student success (Agudo-Peregrina et al., 2014; Beer et al., 2010; Ramos & Yudko, 2008; Zepke, 2013) and academic achievement (Atherton et al., 2017; Sinatra, Heddy, & Lombardi, 2015). It is also argued that student engagement (with online courses) is positively correlated to students’ higher final grades (Beer et al., 2010; Boulton et al., 2018; Mogus et al., 2012), and to satisfaction and persistence (Chen, Gonyea, & Kuhn, 2008; Martin & Bolliger, 2018; Morris, Finnegan, & Wu, 2005; Sinatra et al., 2015). In contrast, other researchers have found diverse differentials or little correlation between student engagement and the foregoing variables (Conijn et al., 2017; cf. Mwalumbwe & Mtebe, 2017).
In other instances, student engagement as reflected by LMS log data has been represented through network visualisations in order to work out its visual patterns. Some of these network representations take the form of social networks as mediated by external, third-party, open-source software programmes such as Gephi, SNAPP and GraphFES. Examples of scholars involved in this area of focus include Badge, Saunders, and Cann (2012); Conde, Hernández-García, García-Peñalvo, and Séin-Echaluce (2015); Hernández-García and Suárez-Navas (2017); Hernández-García, González-González, Jiménez Zarco, and Chaparro-Peláez (2016); Hernández-García, Acquila-Natale, and Chaparro-Peláez (2017); Poon, Kong, Yau, Wong, and Ling (2017); and Rabbany, Takaffoli, and Zaïane (2011). In this context, some of the benefits of visualising student engagement login data recorded and stored on LMSs in terms of social network representations have been reported. These include allowing one to apply social network analysis (SNA) to the log data at hand (Hernández-García et al., 2016; Hernández-García & Suárez-Navas, 2017) and helping one make sense of how students engage with the content, with the teacher and with each other (Barría, Scheihing, & Parra, 2014; Hernández-García et al., 2016; Liu et al., 2015). Others include predicting student outcomes such as increased student engagement with course content and with peers (Atherton et al., 2017).
Against this background, the current paper reports on a study that explored and visualised module-level login data of the module, Applied English Language Studies: Further Explorations (ENG2601) as a proxy measure of student engagement. This module is offered as a level-two undergraduate module at an open and distance e-learning (ODeL) institution, in South Africa. The study visually represented student engagement log data from three applications, myUnisa, Moya Messenger App (MoyaMA), and Flipgrid, by using Gephi. myUnisa is UNISA’s legacy learning management system, while MoyaMA and Flipgrid are an instant messenger and a video discussion tool, respectively. Thus, these three applications offer different affordances in line with the technological functions they serve. To this effect, the main purpose of the study was to establish the extent to which student module-level login data visualisations as generated by Gephi served as a proxy measure of student engagement.

Framing student engagement

As a construct, student engagement has been framed and conceptualised in multiple ways. These multiple framings have been an integral part of the evolutionary trajectory of student engagement. Typologically, it was initially configured and represented as a two-dimensional construct (Astin, 1999; Finn, 1989; Hu & Li, 2017; Mosher & MacGowan, 1985; Reschly & Christenson, 2012). It has since been represented as a three-dimensional construct (Hu & Li, 2017; Reschly & Christenson, 2012; Yazzie-Mintz, 2007) and as a four-dimensional construct (Appleton, Christenson, & Furlong, 2008; Hu & Li, 2017; Reschly & Christenson, 2012). Its three-dimensional model often comprises emotional, behavioural and cognitive engagement, while its four-dimensional framing consists of these three variables plus academic engagement (Reschly & Christenson, 2006, 2012).
As Reschly and Christenson (2012) point out, there is an ongoing debate, first, about defining engagement as a construct and, second, about defining it in terms of the specific dimensions and dimensional sub-types of which it consists. Allied to this definitional minefield is the fact that student engagement is variously referred to as academic engagement, school engagement, student engaged learning, engaged time, academic responding, engagement in course work, and engagement in class (Fredricks et al., 2011; Williams & Whiting, 2016). All of this adds to a jingle-jangle scenario: a jingle in that student engagement is used to refer to different things simultaneously, and a jangle in that different terms are used to refer to it (Reschly & Christenson, 2012). Moreover, there are different tools and instruments used to measure student engagement. In fact, Fredricks et al. (2011) identify 21 such instruments (14 student self-report instruments, 4 observation instruments, and 3 teacher report instruments) employed to measure student engagement from upper elementary to high school. They further point out that of the 14 student self-report instruments, 5 measured three dimensions (emotional, behavioural and cognitive), 5 assessed two dimensions, while 4 measured one dimension.
But overall, a dominant view is that student engagement is multi-dimensional, comprising essential variables such as emotion, behaviour and cognition. This is a tripartite typology of student engagement that theorises engagement as a meta-construct. Some of the leading proponents of this dominant model of student engagement are Fredricks, Blumenfeld, and Paris (2004) and Fredricks et al. (2011). Notwithstanding this dominant view, there is a lack of consensus concerning the definition of the sub-types comprising engagement, something that parallels the definitional debate around student engagement itself as a construct. Another view that is gaining currency, though not dominant, is the four-dimensional model of student engagement mentioned earlier. Some of its advocates are Reschly and Christenson (2006, 2012). This view theorises student engagement as a meta-construct that subsumes sub-constructs or sub-types - in addition to the four classical ones (i.e., emotional, behavioural, cognitive and academic engagement) - such as motivational, psychological, contextual, social and agentive dimensions (Bodily et al., 2017; Henrie et al., 2018; Henrie, Halverson, & Graham, 2015; Reschly & Christenson, 2006, 2012; Sinatra et al., 2015). To this effect, Reschly and Christenson (2012) distinguish between indicators and facilitators of student engagement. Indicators pertain to the targeted students themselves, while facilitators - also called contextual facilitators of engagement - are the various contexts in which student engagement takes place. Moreover, Dixson (2015) differentiates between two main types of student behaviours: observational learning behaviours and application learning behaviours. The former entail reading e-mails, reading discussion forum posts, and viewing content and documents; the latter involve writing e-mails, posting to discussion forums, and responding to quizzes.
The current paper espouses the four-dimensional model of student engagement proposed by Reschly and Christenson (2006, 2012). It does so within an OLE at an open and distance e-learning (ODeL) institution in South Africa whose module offerings are delivered on a learning management system (LMS), myUnisa, even though the institution also makes room for a blended mode. In this context, the paper investigated one dimension of student engagement: the behavioural dimension.

Student LMS login data as a proxy measure of student engagement

Recently, there have been studies investigating student login data captured by and stored on LMSs, and treating these data as a proxy measure of student engagement in the higher education (HE) sector (Beer et al., 2010; Henrie et al., 2018; Zepke, 2013). Of course, there are scholars espousing a contrary view, such as Williams and Whiting (2016) and You (2016). Among the former studies – with which the current study aligns itself – are those undertaken at an online course level with a focus on activity-level data. Two such studies, which have relevance for the current study, are worth briefly delineating. The first is Pazzaglia, Clements, Lavigne, and Stafford's (2016) study. Even though conducted at a virtual school level, this study examined student LMS data and student information system data for all online courses offered by Wisconsin Virtual School in the United States of America. It focused on: student engagement patterns in these online courses and the ensuing student enrolment percentage; differentials among student engagement groups in course type taken, grade level, or gender; and the link between student engagement patterns in online learning and online course outcomes. For this study, engagement referred to behavioural engagement and was conceptualised as the amount of time students were logged into an online course per week. Course outcomes were determined by the percentage of possible points scored in each course and the percentage of completed course activities. Among the engagement patterns identified were steady engagement, consistent engagement, and high but variable engagement. Overall, students whose online course engagement lasted at least 1.5 h per week scored a sufficient percentage of possible points to pass the course, notwithstanding their varied total login time in each week initially and across the semester (Pazzaglia et al., 2016).
The second study is Beer et al.'s (2010), which investigated student engagement data captured by a Blackboard LMS at CQUniversity in Australia. It focused on 2674 courses offered to undergraduate online students. Staff and student activity metrics were summarised and aggregated into two relational LMS databases through a series of scripts. One database comprised, among other things, student numbers, staff activity counts and staff discussion forum activity counts. The other database consisted of each student's course, activity counts, grade, age, gender, and discussion forum activity counts. The aggregation of student data allowed for the data to be analysed and compared with student grade information in order to determine the value of certain LMS activity patterns and relationships pertaining to student success. It also allowed data to be analysed across time to establish how staff and student usage of the LMS changed over time. Despite the inherent limitations of using student activity patterns derived from archival LMS data – one of which is that such patterns only serve as indicators of what is occurring, as opposed to gauging the actual learning itself – there are advantages to utilising archival student LMS data. One advantage, as reported by Beer et al. (2010), is that student LMS activity data serve as an indicator of student engagement, or of student participation. In this regard, Beer et al. (2010) further argue that even though a causative correlation between LMS clicks and student engagement has not been established, there is some relationship between the number of student course clicks on an LMS and the resultant student grade.

Social network visual representations of student LMS login data as a proxy measure of student engagement

Student login and activity data as curated on LMSs, and treated as a proxy indicator of student engagement, are increasingly being represented through social network visualisations. In this way, social network tools are transitioning from their traditional uses to being utilised to visualise student engagement data that LMSs otherwise present in a static form. Jimoyiannis, Tsiotakis, and Roussinos (2013) contend that social network analysis (SNA) is one of the effective tools for exploring student engagement patterns in OLEs. They also maintain that, through SNA, engagement patterns can be presented in visual graphs.
Two studies that utilised social network tools to visually represent student LMS engagement data are of relevance to the current paper. The first, by Badge et al. (2012), used what it calls student academic contributions to a public social network as a proxy for student engagement. It took place at one university in the United Kingdom and involved a cohort of undergraduate students in the School of Biological Sciences in the 2009/2010 academic year. Employing the Friendfeed social network (Friendfeed.com) linked to the institutional VLE, the study focused on student contributions in the form of status update entries, likes, comments on others' updates, or shared links. Module activities were allotted marks. All relevant activity data from Friendfeed were extracted using the Friendfeed application programming interface and exported to Gephi, an open-source tool, to visualise network graphs and to statistically analyse the dataset. Three types of network - subscriptions, comments and likes - were visually represented and used to depict student network interactions. In addition, three student network communities (relationships) emerged from student interactions: a very active group (where a lot of interaction took place); a smaller group (who preferred to talk among themselves); and a mid-sized group (who chose to remain on the periphery). One of the findings of this study is that students established connections to one another as subscribers and by commenting on each other's contributions to the network. As such, the crucial aspect of this study that is relevant to the current paper is its conclusion that the network data captured by Friendfeed, and the visualisation of such data in Gephi, serve as a simple but powerful proxy for student engagement (Badge et al., 2012).
The second study is Hernández-García et al.'s (2016), which was undertaken at the Open University of Catalonia in Spain. It employed student login data sourced from undergraduate students enrolled for a semester-long course, Introduction to financial information, between September 2013 and January 2014. These data were extracted from a legacy LMS' activity log. Three datasets were generated from the original data extracted from the system log: data on visible student interactions showing active and non-active participants; data on invisible student interactions displaying learning witnesses, passive learners and most-read learners; and popular topics and discussions instantiating student engagement. These datasets were visualised through social network learning analytics using Gephi. The first dataset visualised send/reply message networks between students and their teacher; the second visualised networks of read messages; and the third depicted networks of posted messages. These datasets were visualised through the ForceAtlas2, Fruchterman-Reingold and Radial Axis layouts. Additionally, social network analysis (SNA) parameters were calculated for each dataset in respect of both the complete network and each class. Moreover, students' final continuous assessment grades were retrieved and integrated into node attributes. One instructive observation from this study is that a significant but low relationship was found between the number of times students read or replied to messages and academic performance. Among other things, this tends to emphasise one of the benefits of Gephi as a social network tool that can be used to visualise not only student participation but also student engagement within OLEs (Hernández-García et al., 2016).

Case study: Department of English Studies at UNISA

According to the information posted on its website, the Department of English Studies (DEST) avers that it is “one of the largest departments in Africa” with its academic staff members specialising in various sub-fields related to the composite discipline of English Studies (Department of English Studies, 2017). This department is based at the University of South Africa’s (UNISA) main campus, in Muckleneuk, Pretoria, and offers open and distance e-learning to mostly part-time students in a blended mode under the aegis of UNISA as an ODeL institution. It falls under the School of Arts in the College of Human Sciences (CHS). One of its differentiating features is that it does not have its own undergraduate academic programme. As a result, it has several undergraduate module offerings that mainly cater for students enrolled for degree programmes offered by some of UNISA’s colleges (faculties). This means that at an undergraduate level, DEST primarily services certain departments and certain colleges at UNISA. However, the bulk of undergraduate students it services are those enrolled for a Bachelor of Education (B. Ed) degree. The latter is an undergraduate degree programme offered by the College of Education (CEDU).
Another differentiating feature of DEST is that it has large numbers of undergraduate students enrolled for its undergraduate modules, the majority of whom, as mentioned in the preceding paragraph, are from CEDU. One of the modules that DEST offers to CEDU's undergraduate students, which is part of their B. Ed degree programme (Foundation Phase), is Applied English Language Studies: Further Explorations (ENG2601). This is a two-semester, level-two (second-year) module, one of whose outcomes is to analyse and interpret language structure as displayed in given texts selected from different genres (Chaka C, Nkhobo T, Lephalala M: Leveraging MoyaMA, WhatsApp and online discussion forum to support students at an open and distance e-learning university, submitted). The module is largely offered online, even though at times some of its students are offered face-to-face learning support by certain staff members from DEST. Most of the students offered such face-to-face learning support are part of a learning support project called Mathew Goniwe. However, this support programme runs only for a short period on designated days (especially Saturdays) in each semester. The main online delivery platform used for ENG2601, as is the case with DEST's other undergraduate modules (and with other modules offered at UNISA), is myUnisa, UNISA's legacy LMS.
In view of the above, the current study reports on the social network visualisation of student engagement based on the login activity data of both ENG2601 students and a cohort of Mathew Goniwe students (drawn from ENG2601) in the first and second semesters of 2018. The data relate to these students’ use of three applications: myUnisa, Moya Messenger App (MoyaMA), and Flipgrid. Against this background, the study’s two purposes were to:
  • investigate module login data of undergraduate students and a cohort of Mathew Goniwe students enrolled for the module, Applied English Language Studies: Further Explorations (ENG2601), as extracted from myUnisa, MoyaMA, and Flipgrid, and as visualised by Gephi; and
  • establish the extent to which these visualised online login data serve as a proxy measure of student engagement.
Based on the above, the study had the following three research questions (RQs):
  • RQ1 = What do ENG2601 and Mathew Goniwe students’ myUnisa module file access frequency counts as represented in social network visualisations by Gephi tell about these students’ engagement with files uploaded on myUnisa?
  • RQ2 = What do frequency counts of Mathew Goniwe students' and their facilitator's instant messages (IMs) about module-level learning support activities posted on MoyaMA, together with video clip creation and video clip view frequency counts captured on Flipgrid, as represented in social network visualisations by Gephi, tell about these students' engagement with the learning support activities?
  • RQ3 = To what extent do the three sets of frequency metrics, taken together as module login data, serve as a proxy measure of student engagement?

Methodology

Most studies conducted on both online student module data and online student module engagement employ mainly quantitative approaches such as surveys and quantitative metrics (e.g., login data and grades) as part of their overall methodology (see, for instance, Bahati, Fors, & Tedre, 2017; Dixson, 2015; Henrie, Bodily, et al., 2015; Henrie et al., 2018; McGarrigle, 2013; Vogt, 2016). For example, Henrie, Bodily, et al.’s (2015) review study of 113 student engagement studies reports that 39.8% of these studies used qualitative measures. In contrast, Hu and Li’s (2017) review of student engagement points out that although quantitative methods and mixed methods are employed in such studies, qualitative methods, especially self-reports and questionnaires, are used most frequently. The current study used a mixed-methods research (MMR) approach (Chen et al., 2018; Johnson, Onwuegbuzie, & Turner, 2007; Strudsholm, Meadows, Vollman, Thurston, & Henderson, 2016). This approach can blend methods from different paradigms or can utilise multiple methods that belong to the same paradigm, or different strategies within given methods. In this particular approach, methodological congruence can be achieved by ensuring that all aspects of the research design such as research questions, sampling, data collection, data analysis and rigour strategies fit together properly (Johnson et al., 2007; Strudsholm et al., 2016).
The discordant debates over what essentially constitutes MMR notwithstanding, this approach was employed in the present study to extract diverse sets of data from multiple sources (Osuna-Acedo & Gil-Quintana, 2017; Osuna-Acedo, Quintana, & Valero, 2017; Strudsholm et al., 2016) on the same phenomenon. In the case of this study, the phenomenon was online student login data related to an undergraduate English Studies module at an open and distance e-learning (ODeL) university in South Africa. In this regard, datasets were sourced from three tools (myUnisa, MoyaMA, and Flipgrid), and comprised student engagement frequencies embodied in module login data as represented by a social network visualisation tool, Gephi. Within this MMR approach, the study utilised a case study design to examine student login data (real-life phenomena) that occurred in virtual contexts (within the three applications). One of the objectives of a case study research design is to investigate current issues as embedded in real-life situations. Like an MMR approach, a case study research design requires data to be collected from multiple data sources in order to ensure data triangulation (Yin, 2018). As pointed out above, the present study extracted data from different sources with a view to triangulating its data.

Participants, sampling techniques and data collection

Participants for this study were selected in three stages. In the first stage, 27 ENG2601 students were selected in June 2018, from the 61 students participating in the Mathew Goniwe project, according to the frequencies with which they had accessed uploaded module files (resources) on myUnisa. In this case, these students' myUnisa ENG2601 module-level login activity data, as captured by and stored on myUnisa in the first semester of 2018, were used as a guide to selecting them. The login activity details in question were extracted from a login database of 3475 ENG2601 students who had a myUnisa presence in this semester. Three categories were considered in selecting these students: highly active users (HAU sub-cohort); lowly active users (LAU sub-cohort); and non-participating (non-active) users (NP sub-cohort). Nine students were chosen in each category, and their myUnisa login module file access frequencies were extracted accordingly (the sketch after this paragraph illustrates the binning step). All of this was done after participants had been informed about the study so as to secure their consent, and after ethical clearance had been obtained. To this end, the study used the ethical clearance granted to the Mathew Goniwe project by the College of Education Research Ethics Review Committee at the University of South Africa. In addition, students' consent to participate in the study was secured. In the second stage, again 27 ENG2601 students from the 61 students participating in the Mathew Goniwe project were selected from myUnisa in August 2018, in the second semester. Their selection followed the same procedure as stipulated above. However, in this semester, there were 2954 ENG2601 students who had login activity details on myUnisa.
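To make the selection step concrete, the sketch below illustrates, in Python, how students might be binned into the three sub-cohorts from their myUnisa file access counts. This is a minimal sketch, not the authors' code: the HAU cut-off of 50 accesses is a hypothetical assumption (the study does not report its exact thresholds), and the sample counts are drawn from Table 1.

```python
# Illustrative only: binning students into the study's three sub-cohorts by
# myUnisa file access frequency. The threshold of 50 is a hypothetical
# assumption; the study does not report its exact cut-offs.
access_counts = {"M235": 106, "S507": 93, "M623": 1, "A543": 0}  # from Table 1

def classify(count: int, hau_threshold: int = 50) -> str:
    """Assign a student to a sub-cohort by access frequency."""
    if count == 0:
        return "NP"   # non-participating (non-active) user
    if count < hau_threshold:
        return "LAU"  # lowly active user
    return "HAU"      # highly active user

cohorts = {student: classify(n) for student, n in access_counts.items()}
# {'M235': 'HAU', 'S507': 'HAU', 'M623': 'LAU', 'A543': 'NP'}
```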
In the third stage, the 27 students who participated in the Mathew Goniwe project in the second semester (2018) were requested to take part in 2 learning support activities (2 short excerpts based on compare-and-contrast essay writing) that were presented to them through MoyaMA and Flipgrid (see Fig. 1) between 27 August 2018 and 01 September 2018. MoyaMA is a data-free, South African-based mobile messaging application, while Flipgrid is an online educational video discussion platform owned by Microsoft (Chaka et al., n.d.).
In all, the data collected for this study comprised six datasets. The first dataset consisted of student login data collected from myUnisa in the first stage of the data collection. These student login data were made up of the files accessed by 3475 ENG2601 students and the frequencies with which these students had accessed these files as stored on myUnisa in the first semester of 2018. The accessed files consisted of documents such as May/June Exam Paper 2016 (MJP16); October/November Exam Paper 2016 (ONP16); ENG2601 2018 Tutorial Letter 202 (ATL202); Study at Unisa2018pdf (SAUN); Announcements: Read this before calling_April 2018 (ARTBC); ENG2601 Activities Feedback May/June 2016 (AFB); and Feedback Assignment 2 Semester 1 (FBLTA2). All these documents had been uploaded on myUnisa during the course of the 2018 first semester. The second dataset was made up of the myUnisa login activities of the 27 Mathew Goniwe students in relation to the same files.
The third dataset contained student login data collected from myUnisa during the second stage of the data collection. These student login data comprised one uploaded file accessed by 2954 ENG2601 students from myUnisa in the second semester of 2018 and the number of times these students had accessed it. This file was referred to as MAF (most accessed file). The fourth dataset was composed of the myUnisa login activities of the 27 Mathew Goniwe students who had accessed the same file.
In contrast, the fifth dataset was sourced from 12 of the 27 Mathew Goniwe students (and from their facilitator) who were offered 2 learning support activities (2 short extracts on compare-and-contrast essay writing) on MoyaMA. It was made up of the instant messages (IMs) exchanged between students and their facilitator on MoyaMA. Students had to comment on the two extracts by sending IMs to the facilitator and to each other. Lastly, the sixth dataset was sourced from the same 12 students, who were provided learning support on Flipgrid. It consisted of the number of video clips created and the number of video views extracted from this video discussion platform between 27 August 2018 and 01 September 2018. Here, students had to create a one-minute video clip in which they provided a short oral commentary on their personal experiences of having participated in the MoyaMA activities, and post it on Flipgrid for other students to view.

Data analysis

All six datasets - the four extracted from myUnisa, and the one each extracted from MoyaMA and Flipgrid - were subjected to a social network analysis (SNA) on Gephi. Gephi is an open-source software programme that plots visual representations of social networks through maps and graphs. It can also calculate SNA parameters and rank, partition and filter nodes and ties (Bozkurt et al., 2016; Hernández-García et al., 2016).
In SNA, each actor is regarded as a node or a vertex in the overall network, and the various relationships occurring between actors are the links, ties or edges connecting actors. SNA provides individual centrality measures used to determine patterns of interaction in a given network. These include in-degree and out-degree centrality, degree centrality, betweenness centrality, and hubs and authorities. The first two measures signal, respectively, the number of incoming and outgoing links attached to a node. Degree centrality is the total number of nodes connected to a given node within the network, and signals the influence a node has vis-à-vis other nodes. Betweenness centrality relates to nodes linking groups of nodes, or bridging various nodes. Hubs are nodes whose outgoing links connect them to nodes having a large number of incoming links, whereas nodes are authorities when their incoming ties connect them to nodes with a large number of outgoing ties. Two other SNA measures are network density and network diameter. The former, also known as connectedness, refers to the ratio of the actual number of ties to the maximum possible number of ties relative to the number of nodes in a network; a network can be well connected (dense) or sparsely connected (less dense). The latter relates to the largest number of nodes that must be crossed in order to move from one node to another (Bayat, 2013; Garcia-Saiz, Palazuelos, & Zorrilla, 2013; Hernández-García et al., 2016; Liu, Chen, & Tai, 2017; Rosen, Miagkikh, & Suthers, 2011). It represents the distance that has to be traversed between the two farthest nodes, as opposed to the geodesic - the shortest path between two nodes (Bayat, 2013).
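The measures just described can be reproduced outside Gephi. The following sketch computes them with the networkx Python library on a toy directed network; the edge list is invented for illustration, and Gephi reports equivalent statistics through its own statistics panel.

```python
# Toy illustration of the SNA measures defined above, computed with networkx
# (Gephi calculates equivalent statistics). The edge list is hypothetical.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([("SFAC", "R"), ("R", "SFAC"), ("SFAC", "B"),
                  ("B", "R"), ("AZW4", "B")])

in_deg = nx.in_degree_centrality(G)         # incoming links per node
out_deg = nx.out_degree_centrality(G)       # outgoing links per node
betweenness = nx.betweenness_centrality(G)  # nodes bridging groups of nodes
hubs, authorities = nx.hits(G)              # hub and authority scores
density = nx.density(G)                     # actual ties / maximum possible ties
# Diameter is defined on a connected graph, hence the undirected view here.
diameter = nx.diameter(G.to_undirected())
```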

myUnisa module-level login data analysis

Student module-level login activities contained in the four datasets collected in the first and second semesters of 2018 (as mentioned above) were manually extracted from myUnisa, slightly formatted, and imported into an MS Excel spreadsheet file. They were coded and arranged into columns representing student activities on myUnisa. The names of the uploaded files that students accessed in the first semester were abbreviated arbitrarily for differentiation purposes, as indicated above (e.g., May/June Exam Paper 2016 (MJP16)). The following user or student identification categories for the Mathew Goniwe students were created and abbreviated accordingly: highly active users (HAU sub-cohort); lowly active users (LAU sub-cohort); and non-participating users (NP sub-cohort). Students as nodes were identified by an alphabetical letter followed by three numeric digits. The MS Excel file containing all the reformatted and coded login data, together with the abbreviated labels, was saved and imported into Gephi. Thereafter, social network visualisations were generated from Gephi in the form of static sociograms using customised layouts. For the second semester login activities of the file students accessed during that period, the same procedure was followed, except that only one file had been accessed, as the semester had just started.
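As an indication of what this preparation step yields, the sketch below builds a Gephi-importable edge list in which students are source nodes, module files are target nodes, and access frequencies are edge weights. The Source/Target/Weight column headers follow Gephi's spreadsheet import conventions; the per-file frequencies shown are invented for illustration, as Table 1 reports only aggregate counts.

```python
# Hypothetical sketch of the spreadsheet-to-Gephi step: one weighted edge per
# student-file pair. The frequencies below are invented for illustration.
import pandas as pd

records = [
    ("M235", "AFB", 40),    # student M235 accessed file AFB 40 times (invented)
    ("M235", "MJP16", 25),
    ("S507", "AFB", 31),
    ("M623", "ONP16", 1),
]
edges = pd.DataFrame(records, columns=["Source", "Target", "Weight"])
edges.to_csv("eng2601_edges.csv", index=False)  # File > Import spreadsheet in Gephi
```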

Moya MA and Flipgrid module-level login data analysis

The same data analytic procedure outlined above was followed in respect of student module-level login activities for the MoyaMA and Flipgrid datasets. For the MoyaMA dataset, the facilitator was arbitrarily identified as SFAC. For both datasets, students as participants were arbitrarily identified as R, AZW4, B, TWO, M, P, S, F, T2, T3, T4, and T5 (see Figs. 4 and 5). The contents of each dataset were manually extracted from each tool, slightly formatted, and imported into and saved in two separate MS Excel spreadsheet files. These files were then imported into Gephi, from which their respective social network visualisations were generated in the form of static sociograms (see Figs. 4 and 5).
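For readers without Gephi, a comparable static sociogram can be drawn directly in Python. The sketch below lays out a small directed IM network with a force-directed algorithm akin to the layouts Gephi offers; the edges are hypothetical and reuse the participant labels above.

```python
# Illustrative alternative to Gephi for producing a static sociogram of the
# MoyaMA IM exchanges; the edge list is hypothetical.
import matplotlib.pyplot as plt
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([("R", "SFAC"), ("SFAC", "R"), ("AZW4", "SFAC"),
                  ("B", "SFAC"), ("TWO", "SFAC"), ("M", "SFAC")])
pos = nx.spring_layout(G, seed=42)  # force-directed layout, as in ForceAtlas2
nx.draw_networkx(G, pos, node_color="lightsteelblue", arrows=True)
plt.axis("off")
plt.savefig("moyama_sociogram.png", dpi=300)
```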

Findings

The findings presented in this section are depicted through static visual social network sociograms of login activities contained in the six datasets described above.

Dataset#1 and dataset#2: first semester myUnisa login file access frequencies

As depicted in Fig. 2 (also see Table 1), there were seven accessed files in this semester. Of these, the most accessed file (MAF) by ENG2601 students who had a presence on myUnisa during this semester was Activities Feedback (AFB), which had been accessed 1676 times. The second and third most accessed files were May/June Exam Paper 2016 (MJP16) and October/November Exam Paper 2016 (ONP16), with access frequency counts of 1123 and 780, respectively. The two least accessed files were Announcements: Read this before calling_April 2018 (ARTBC) and Feedback Assignment 2 Semester 1 (FBLTA2), which had been accessed 38 and 19 times, respectively.
Table 1
ENG2601 student module-level file access frequencies on myUnisa and 27 participants’ module-level file access frequencies in the 1st semester of 2018
MAF      FREQA   HAU    FREQB   LAU    FREQC   NP     FREQD
AFB      1676    M235   106     M623   1       A543   0
MJP16    1123    S507   93      K471   1       A079   0
ONP16    780     M387   90      S372   1       A486   0
ATL202   86      G872   84      Z549   1       A307   0
SAUN     86      M571   76      H292   1       A365   0
ARTBC    38      N475   62      S972   1       A109   0
FBLTA2   19      N595   58      M303   1       A343   0
                 E019   58      B280   1       A351   0
                 M910   57      B751   1       A306   0
In the same academic semester, and again as shown by both Fig. 2 and Table 1, the 27 Mathew Goniwe students, classified into the three categories of highly active users (HAU sub-cohort), lowly active users (LAU sub-cohort), and non-participating (non-active) users (NP sub-cohort), were tracked in terms of the frequencies with which they had accessed these seven files. For example, the total frequency of access to the seven files by the HAU and LAU sub-cohorts was 635. Of the HAU sub-cohort, three users (M235, S507 and M387) produced the three highest frequency counts of 106, 93 and 90, respectively. In contrast, three other users from the same category (N475, N595 and M910) generated some of the lowest frequency counts: 62, 58 and 57, respectively. The file access frequency counts attributable to the remaining users in this sub-cohort fell between the highest and lowest counts, as displayed in this figure. As regards the other two categories of users, each user in the LAU sub-cohort generated one file access, and each user in the NP sub-cohort generated none, during the academic semester under consideration.

Dataset#3 and dataset#4: second semester myUnisa login file access frequencies

In the second semester of 2018, there was one uploaded file (AFB) that all ENG2601 students could access from myUnisa at the time the study was conducted. The access frequencies of this file are displayed in Fig. 3.
At this point, the file itself had been accessed 327 times. Of the 27 Mathew Goniwe students, the total frequency recorded by the HAU sub-cohort was 79. In this case, two users (M742 and M842) generated the two highest frequency counts of 22 and 15, respectively, while two other users had frequency counts of 11 and 10. By contrast, two pairs of users, W366 and D409 on the one hand, and N227 and V921 on the other, shared the two lowest frequency counts of 6 and 7, respectively. The remaining user produced a frequency count of 8. With respect to the other two categories of users, each user in the LAU sub-cohort recorded one access of the uploaded file, and each user in the NP sub-cohort recorded none, during this period.

Dataset#5 and dataset#6: instant messages and video clip views

With reference to the MoyaMA login metrics (see Fig. 4), the 12 participants and their facilitator (SFAC) exchanged 172 instant messages (IMs) during the learning support period. Of these IMs, 72 came from SFAC, while the rest were sent by participants. The participant with the most sent IMs was R (26 IMs), whereas AZW4 and B were tied at 23 IMs each. Two participants (TWO and M) also had a tie of 13 IMs apiece. The same applies to participants P and S: each recorded one IM. The remaining 5 participants did not send any IMs.
Finally, as depicted in Fig. 5, concerning the Flipgrid activity log, there was one video clip created by one participant (B). The video clip had 6 views, all of which were attributable to one participant (AZW4).

Discussion

This section discusses the findings presented in the preceding section. It is divided into two sub-sections: visualising myUnisa login file access metrics, and visualising metrics of both MoyaMA learning support instant messages and Flipgrid video clip views.

Visualising myUnisa login file access metrics

As mentioned earlier and as illustrated by Fig. 2, the three ENG2601 files with the highest access metrics on the myUnisa LMS in the first semester of 2018 were AFB, MJP16 and ONP16, respectively. The first file contained feedback on activities (tasks) based on the module, while the other two files were copies of the 2016 first and second semester examination question papers. These were, according to students' judgement as reflected in the access metrics, the critical files: the files that students preferred to access, view and read over other uploaded files such as ENG2601 2018 Tutorial Letter 202 (ATL202), Announcements: Read this before calling_April 2018 (ARTBC) and Feedback Assignment 2 Semester 1 (FBLTA2). The last three files were also the least accessed by students. While ARTBC was for announcement purposes, it is surprising that the other two files, which carry almost the same significance as AFB (the file that attracted the highest access metrics), were among the least accessed. This is even more so given that for UNISA's open and distance e-learning (ODeL) students, such as ENG2601 students, feedback, especially module content feedback, plays a crucial role in their daily learning endeavours. Given the huge file access frequency differential between the most and least accessed files, it becomes clear that students use their discretion in determining which files they deem more important and which less important. To this end, Rust (2002) contends that students are strategic in approaching what they learn and tend to consciously engage with learning content tasks that have relevance for their own assessment grades.
Visualising such myUnisa file access frequencies through sociograms, as is the case here, offers an enriched insight into ENG2601 students' file access behaviour. ENG2601 students' myUnisa module-level login file access metrics as visualised by Gephi represent their engagement - especially behavioural engagement as a component of a multi-dimensional view of student engagement - with their module, Applied English Language Studies: Further Explorations (ENG2601). That is, these metrics serve as a proxy measure of these students' engagement with this module (Badge et al., 2012; Beer et al., 2010; Boulton et al., 2018; Hernández-García et al., 2016; Hussain et al., 2018). In this regard, Badge et al. (2012) point out that in their study that used Friendfeed, network data extracted and visually represented using Gephi offered an uncomplicated but effective proxy for student engagement. Similarly, in this study, student module-level login file access frequencies tend to point to student engagement with specific files related to ENG2601. This type of student engagement is what Dixson (2015) characterises as observational learning behaviour. Conceptualised in this way, student engagement was higher in the three module files specified above and lower in the other module files. Most importantly, student engagement was disproportionately skewed toward one module file (AFB) vis-à-vis the other module files (see Fig. 2). In this sense, this module file appears to have had a stronger pull factor than the other module files in this semester.
As illustrated by Fig. 3 and again as outlined earlier, in the second academic semester there was one file (AFB) that had been uploaded on myUnisa when the study was undertaken. At this time, the file had been accessed 327 times. This is the same file that students accessed the most in the first semester. This file together with its metric, in turn, serves as a proxy measure through which student engagement for ENG2601 students can be visualised and mapped. At this early time in the second semester, this module file appears to be already displaying its pull factor.
Pertaining to the 27 Mathew Goniwe students whose first semester module file access frequencies were extracted from myUnisa, students in the HAU sub-cohort recorded the highest file frequency counts compared to the other two sub-cohorts (see Fig. 2). Not only were the module file access frequency variations between this sub-cohort and the other two huge, but the overall module file access metrics were also disproportionately concentrated in this sub-cohort. The disproportionate module file access metrics, as visualised in the sociogram in Fig. 2 and as represented in Table 1, are all the more telling considering that each participant in the LAU sub-cohort generated one file access, and each participant in the NP sub-cohort generated none.
As the sociogram in Fig. 2 represents each participant's interaction with the uploaded ENG2601 module files, and not interaction with each other or with a facilitator, the visualised nodes and their attendant links depict only out-degree links and no other centrality measures. As a proxy measure of student engagement, this visual mapping of the three sub-cohorts' module file access metrics shows that the HAU sub-cohort engaged with the ENG2601 module-level files the most. With the three highest file access frequency counts shared by three participants in the HAU sub-cohort, these three participants also accounted for almost half the student engagement metrics compared with the other participants in this sub-cohort. In relation to using access to LMS activities - in this case, module-level file access frequencies - as a measurement of student engagement, Hussain et al. (2018) contend that in online learning environments (OLEs), it is not easy to capture students' engagement through conventional methodologies like class attendance, as such methodologies are not directly available on LMSs. In a different but related context, Beer et al. (2010) and Holmes (2018) point out that monitoring and capturing student interaction with learning materials on LMSs is one mode of tracing student engagement. The same can be said about the current study, as it utilised students' access to learning materials (module-level files) as a proxy indicator of student engagement. In this vein, Vogt (2016) maintains that activities occurring in an LMS may arguably be used as a measure of student engagement.

Visualising MoyaMA module-level IM and Flipgrid video clip view metrics

As highlighted under the data analysis section, the 12 participants who were offered learning support on MoyaMA by their facilitator altogether generated, in conjunction with their facilitator, 172 IMs. There are basic centrality measures discernible from the visualisation of the IM exchanges as depicted in Fig. 4. For example, the facilitator (SFAC) posted a high number of the IMs generated, most of which (67) were stand-alone IMs on some aspects of the module content. These were IMs not intended to respond to any particular participant's IM and, as such, they had no links or ties associated with any node; nor did they have any in-degree or out-degree centralities. This illustrates the dominance the facilitator had in this module content interaction vis-à-vis the 12 participants. In this case, as a node, the facilitator operated as both an authority node and a hub node (Garcia-Saiz et al., 2013; Hernández-García et al., 2016). That is, the ties linked to the facilitator's IMs represent the level of influence he exerted over the flow of information and the position of prestige he held (Skrypnyk, Joksimović, Kovanović, Gašević, & Dawson, 2015) in this interaction. Of the participants who posted IMs, participants R, AZW4 and B posted the most. Not only did these participants share an IM, but three other participants also shared the same IM with them. Among the 12 participants, these 3 participants' nodes served as authority and hub nodes, besides the facilitator's node. By contrast, 5 participants did not post any IMs. Thus, the IMs posted on MoyaMA by active participants, as visualised through nodes in the sociogram displayed in Fig. 4, serve as instances of student participation. As such, they can be regarded as a proxy measure of these participants' engagement with ENG2601 learning support activities offered on MoyaMA as an instant mobile messaging platform. In the same vein, the non-participation of the other 5 students in these learning support activities can be seen as a lack of student engagement on their part.
In connection with Flipgrid, one participant created a video clip related to the ENG2601 module content. This video clip generated 6 views by one participant. In this sense, this participant's viewing of the video clip represents her being engrossed in its content. This act represents her engagement with the video clip content and, in a way, serves as a proxy indicator of her student engagement in this regard. Elsewhere, Green and Green (2017) argue that, as a tool, Flipgrid enables students to engage with learning content through aural-visual means. Conversely, for participants who did not view the video clip, their non-viewing can be equated with their non-engagement with it.

Conclusion and future research

This study had two related purposes. Firstly, it set out to investigate module login data of undergraduate students and a cohort of Mathew Goniwe students enrolled for the module, Applied English Language Studies: Further Explorations (ENG2601), as extracted from myUnisa, MoyaMA, and Flipgrid, and as visualised by Gephi. Secondly, it sought to establish the extent to which these visualised online login data serve as a proxy measure of student engagement. These two purposes were underscored by three research questions, as spelt out earlier. On the one hand, the findings of this study provide a visual social network representation of the myUnisa login file access frequencies generated by ENG2601 students as a consolidated cohort, and by Mathew Goniwe students as a designated cohort, in the first semester of 2018 and for a portion of the second semester of 2018. In the first semester, there were three files which all ENG2601 students had accessed the most among those uploaded on myUnisa. While the first of these files was a module activities feedback file, the other two were copies of two previous examination question papers.
This login action highlights these students’ proclivity for accessing module files that have the potential to help them achieve a desired module grade (Agudo-Peregrina et al., 2014; Holmes, 2018) as opposed to files that simply contain the required module content. This disposition is also manifest in the module activities feedback file – albeit the only uploaded one at that stage - that students had already accessed 327 times when the second semester had just started. Based on this, it is plausible to argue that these students’ access frequency patterns of the said files are a proxy measure of their engagement with the given module files (Beer et al., 2010; Holmes, 2018; Vogt, 2016). All this implies that academics or lecturers need to know in advance which uploaded module-level files students are likely to gravitate towards in terms of accessing (and viewing and reading). This prior knowledge of students’ preferences should inform the choice and the types of module-level files to be uploaded on myUnisa (or on any other related LMSs) for students. Additionally, these students’ module-level file access frequency behaviour reflects their observational learning behaviour (Dixson, 2015).
On the other hand, the findings of this study also offer a visual social network representation of the ENG2601 login activities of Mathew Goniwe students on both MoyaMA and Flipgrid for a specified stint in the second semester of 2018. In the former instance, such login activities included those of the facilitator (SFAC). The overall social network visualisation of the IMs generated on MoyaMA as a mobile instant messaging platform revealed disproportionate IM frequencies between the IMs posted by the facilitator and those posted by participants. The former posted more module-level IMs and thus dominated both the participation in and the engagement with the information related to the learning support module activities presented on this platform. Consequently, his node operated as both an authority node and a hub node. In respect of participants, 3 of them posted the most IMs, with their nodes emerging as authority and hub nodes, while 5 participants did not post any IMs. This means that these IMs, as configured through a social network visualisation tool such as Gephi, served as a proxy indicator of facilitator engagement and student engagement with ENG2601 learning support activities mediated through MoyaMA. In contrast to the myUnisa login activities, this type of student engagement instantiates an application learning behaviour, as it entailed students posting IMs on MoyaMA (Dixson, 2015). Pertaining to the 5 students who did not post any IMs, their non-participation can be viewed as a proxy for their non-engagement with these learning support activities, and as the absence of application learning behaviour on their part.
The implication of all this is that learning facilitators, especially ENG2601 module facilitators, should know the extent to which they can participate in a learning process, so as to decrease or increase their module-level participation and engagement accordingly. They also need to know that there are students who are likely to engage more with module-level activities, students who are likely to engage less, and students who are likely not to engage with any given module-level activities at all.
With reference to Flipgrid, the sole video clip created by one participant and viewed 6 times by one other participant goes beyond just serving as a proxy measure of student engagement in this particular instance. It implies, particularly in the context of this study, that student engagement varies with, and is determined by, both given module-level activities and given delivery platforms. In this case, the module-level activity that students were required to engage in (creating a one-minute video clip in which they reflected on the activities presented on MoyaMA) could have been too daunting a task for the majority of them to carry out. In addition, Flipgrid as a video discussion platform could have been too unfamiliar a learning space for the bulk of students to participate on. This, therefore, underscores the prudence module facilitators need to exercise in selecting and determining the nature of the module-level activities in which students are expected to participate, and the types of teaching and learning delivery platforms on which they are required to participate. Overall, this study has demonstrated how the myUnisa login data (especially module-level file access frequencies) of ENG2601 students and of a cohort of ENG2601 Mathew Goniwe students can serve as a proxy measure of student engagement. In the same vein, it has shown how the MoyaMA and Flipgrid login data (in the form of IMs and a video clip together with video clip views) of a cohort of ENG2601 Mathew Goniwe students can serve as a proxy measure of student engagement. Of course, this does not suggest that module facilitators or academics in general should not trial different platforms or tools meant for teaching and learning purposes.
Finally, the study has limitations. Firstly, it was undertaken as a case study. As with most case studies, its findings are neither generalisable nor reproducible. However, some aspects of it, such as combining legacy LMSs (e.g., myUnisa) with stand-alone tools like MoyaMA and Flipgrid in investigating student engagement, have an element of transferability that may be relevant to other studies with a similar focus. In this regard, the study's focal point, gauging student engagement through module-level file access frequencies, IM frequencies, and video clip creation and view frequencies, can be taken up by other studies to establish whether or not such module-level metrics serve as a proxy measure of student engagement.
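For studies wishing to take up this focal point, the sketch below shows one way such module-level metrics might be aggregated from raw login records. It is a hedged illustration only: the file name, the column names, and the quartile cutoff for highly active users (HAU) are assumptions for demonstration, not myUnisa's actual export schema or any cutoff used in this study.

```python
# Minimal sketch, assuming a CSV export with one row per file access;
# "module_logins.csv" and the column names are hypothetical, not
# myUnisa's actual schema.
import pandas as pd

logs = pd.read_csv("module_logins.csv")

# Module-level file access frequencies: the proxy measure of student
# engagement discussed above.
file_access = logs["module_file"].value_counts()
print(file_access)

# Per-student totals, to distinguish highly active users (HAU) from
# less active or inactive ones; the 75th-percentile cutoff is illustrative.
per_student = logs.groupby("student_id").size().sort_values(ascending=False)
hau = per_student[per_student >= per_student.quantile(0.75)]
print(f"{len(hau)} highly active users out of {len(per_student)} students")
```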
Secondly, the study employed file access metrics related to one module to determine module-level student engagement. Future studies could consider using more than one module for this purpose. Thirdly, the study employed simplified social network visualisations to represent its data. Nonetheless, students' file access frequencies in the module investigated were accurately captured by these visual representations and, as such, required no more complex ones. Fourthly, and to conclude, future studies may consider using student login activities other than the ones utilised in the current study to determine online student engagement.

Acknowledgements

The authors would like to thank a graduate student who worked with one of the authors to compute and visualise datasets on Gephi. Moreover, they would like to express their gratitude to the ENG2601 team members who gave them access to the module information.

Competing interests

The authors declare that they have no competing interests.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Literature
Agudo-Peregrina, Á. F., Iglesias-Pradas, S., Conde-González, M. Á., & Hernández-García, Á. (2014). Can we predict success from log data in VLEs? Classification of interactions for learning analytics and their relation with performance in VLE-supported F2F and online learning. Computers in Human Behavior, 31, 542–550. https://doi.org/10.1016/j.chb.2013.05.031
Bozkurt, A., Honeychurch, S., Caines, A., Bali, M., Koutropoulos, A., & Cormier, D. (2016). Community tracking in a cMOOC and nomadic learner behavior identification on a connectivist rhizomatic learning network. Turkish Online Journal of Distance Education, 17(4), 4–30. https://doi.org/10.17718/tojde.09231
Chaka, C., & Nkhobo, T. (2019). Online module login data as a proxy measure of student engagement: the case of myUnisa, MoyaMA, Flipgrid, and Gephi at an ODeL institution in South Africa. International Journal of Educational Technology in Higher Education. https://doi.org/10.1186/s41239-019-0167-9
Christenson, S. L., Reschly, A. L., & Wylie, C. (2012). Preface. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. v–ix). New York: Springer.
Conde, M. Á., Hernández-García, Á., García-Peñalvo, F. J., & Séin-Echaluce, M. L. (2015). Exploring student interactions: Learning analytics tools for student tracking. In P. Zaphiris & A. Ioannou (Eds.), Learning and collaboration technologies (pp. 50–61). Cham: Springer.
Hernández-García, Á., & Suárez-Navas, I. (2017). GraphFES: A web service and application for Moodle message board social graph extraction. In B. K. Daniel (Ed.), Big data and learning analytics in higher education: Current theory and practice (pp. 167–194). Cham: Springer.
Osuna-Acedo, S., Quintana, J. G., & Valero, C. C. (2017). Open, mobile and collaborative educational experience. Case study: The European ECO project. Journal of Universal Computer Science, 23(12), 1215–1237.
Poon, L. K. M., Kong, S.-C., Yau, T. S. H., Wong, M., & Ling, M. H. (2017). Learning analytics for monitoring students participation online: Visualizing navigational patterns on learning management system. In S. Cheung, L. Kwok, W. Ma, L. K. Lee, & H. Yang (Eds.), Blended learning: New challenges and innovative practices (Lecture Notes in Computer Science, vol. 10309). Cham: Springer. https://doi.org/10.1007/978-3-319-59360-9_15
Rosen, D., Miagkikh, V., & Suthers, D. (2011). Social and semantic network analysis of chat logs. In LAK '11: Proceedings of the 1st international conference on learning analytics and knowledge (pp. 134–139). New York: Association for Computing Machinery (ACM). https://doi.org/10.1145/2090116.2090137
Yin, R. K. (2018). Case study research and applications: Design and methods. Thousand Oaks: Sage.