2017 | Book

Learning Analytics: Fundaments, Applications, and Trends

A View of the Current State of the Art to Enhance e-Learning

About this Book

This book provides a conceptual and empirical perspective on learning analytics, its goal being to disseminate the core concepts, research, and outcomes of this emergent field. Divided into nine chapters, it offers reviews oriented on selected topics, recent advances, and innovative applications. It presents the broad learning analytics landscape and in-depth studies on higher education, adaptive assessment, teaching and learning. In addition, it discusses valuable approaches to coping with personalization and huge data, as well as conceptual topics and specialized applications that have shaped the current state of the art.

By identifying fundamentals, highlighting applications, and pointing out current trends, the book offers an essential overview of learning analytics to enhance learning achievement in diverse educational settings. As such, it represents a valuable resource for researchers, practitioners, and students interested in updating their knowledge and finding inspirations for their future work.

Table of Contents

Frontmatter
Chapter 1. Learning Analytics in Higher Education—A Literature Review
Abstract
This chapter examines research studies from the last five years and presents the state of the art of Learning Analytics (LA) in the Higher Education (HE) arena. Using a mixed-method analysis, we searched three popular libraries: the Learning Analytics and Knowledge (LAK) conference proceedings, SpringerLink, and the Web of Science (WOS) databases. We examined a total of 101 papers in depth and present an overview of the different techniques used by the studies and their associated projects. To gain insight into the direction of the different projects, we clustered the publications by their stakeholders. Finally, we address the limitations of those studies and discuss the most promising future lines of research and challenges. We believe the results of this review may assist universities in launching their own LA projects or improving existing ones.
Philipp Leitner, Mohammad Khalil, Martin Ebner
Chapter 2. Teaching and Learning Analytics to Support Teacher Inquiry: A Systematic Literature Review
Abstract
Teacher inquiry is identified as a key global need for driving the continuous improvement of teaching and learning conditions for learners. However, specific barriers (mainly related to teachers’ data literacy competences) can deter teachers from engaging in inquiry to improve their teaching practice. To alleviate these barriers and support teacher inquiry, the concept of Teaching and Learning Analytics (TLA) has been proposed as a complementary synergy between Teaching Analytics and Learning Analytics. Teaching and Learning Analytics aims to provide a framework in which the insights generated by Learning Analytics methods and tools can be meaningfully translated to drive teachers’ inquiry into improving their teaching practice, as captured through Teaching Analytics methods and tools. In this context, TLA has been identified as a research challenge with significant potential for practical impact. This chapter contributes the first systematic literature review in the emerging research field of Teaching and Learning Analytics. The insights gained from the systematic literature review aim to (a) transparently outline the existing state of the art following a structured analysis methodology, and (b) elicit insights and shortcomings that can inform future work in the Teaching and Learning Analytics research field.
Stylianos Sergis, Demetrios G. Sampson
Chapter 3. A Landscape of Learning Analytics: An Exercise to Highlight the Nature of an Emergent Field
Abstract
Given the increasing efforts to understand, predict, and enhance students’ learning in educational settings, learning analytics (LA) has emerged as a candidate research area to tackle such issues. Several lines of work have been pursued, and diverse conceptual and theoretical perspectives have arisen. Moreover, quite interesting and useful outcomes have been produced during LA’s short lifetime. However, several questions still await clear answers: What does learning analytics mean? What are its backgrounds, related domains, and underlying elements? What are the objects of its applications? And what about the trends and challenges to be considered? This chapter therefore aims to respond to those concerns by sketching a conceptual landscape that explains the LA background, its underlying domains and nature, including a survey of recent and relevant approaches and an account of risks and opportunities.
Alejandro Peña-Ayala, Leonor Adriana Cárdenas-Robledo, Humberto Sossa
Chapter 4. A Review of Recent Advances in Adaptive Assessment
Abstract
Computerized assessments are an increasingly popular way to evaluate students. They need to be optimized so that students can receive an accurate evaluation in as little time as possible. Such optimization is possible through learning analytics and computerized adaptive tests (CATs): the next question is then chosen according to the previous responses of the student, thereby making assessment more efficient. Using the data collected from previous students in non-adaptive tests, it is thus possible to provide formative adaptive tests to new students by telling them what to do next. This chapter reviews several models of CATs found in various fields, together with their main characteristics. We then compare these models empirically on real data. We conclude with a discussion of future research directions for computerized assessments.
Jill-Jênn Vie, Fabrice Popineau, Éric Bruillard, Yolaine Bourda
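The abstract does not name the specific CAT models the chapter compares, but a common baseline for adaptive assessment is a Rasch (1PL) item response model with maximum-information item selection. The following is a minimal sketch of that idea under assumed item difficulties and a grid-based ability estimate; it illustrates the general technique, not the chapter's implementation.

```python
import numpy as np

def p_correct(theta, b):
    """Rasch (1PL) probability of a correct response given ability theta and difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def next_item(theta, difficulties, asked):
    """Pick the unasked item with the largest Fisher information at the current ability."""
    p = p_correct(theta, difficulties)
    info = p * (1 - p)                 # Rasch item information
    info[list(asked)] = -np.inf        # never re-ask an item
    return int(np.argmax(info))

def estimate_ability(responses, difficulties, grid=np.linspace(-4, 4, 161)):
    """Grid-search maximum-likelihood ability estimate from (item, correct) pairs."""
    loglik = np.zeros_like(grid)
    for item, correct in responses:
        p = p_correct(grid, difficulties[item])
        loglik += np.log(p if correct else 1 - p)
    return grid[np.argmax(loglik)]

# Toy adaptive session with a simulated student of true ability 0.8.
rng = np.random.default_rng(0)
difficulties = rng.normal(size=30)     # hypothetical item bank
true_theta, theta = 0.8, 0.0
asked, responses = set(), []
for _ in range(10):
    item = next_item(theta, difficulties, asked)
    asked.add(item)
    correct = rng.random() < p_correct(true_theta, difficulties[item])
    responses.append((item, bool(correct)))
    theta = estimate_ability(responses, difficulties)
print(f"Estimated ability after 10 items: {theta:.2f}")
```

In a real CAT, the item difficulties would be calibrated from data collected in previous non-adaptive tests, as the abstract notes, rather than drawn at random.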
Chapter 5. Data-Driven Personalization of Student Learning Support in Higher Education
Abstract
Despite the explosion of interest in big data in higher education and the ensuing rush for catch-all predictive algorithms, there has been relatively little focus on the pedagogical and pastoral contexts of learning. The provision of personalized feedback and support to students is often generalized and decontextualized, and examples of systems that enable contextualized support are notably absent from the learning analytics landscape. In this chapter we discuss the design and deployment of the Student Relationship Engagement System (SRES), a learning analytics system that is grounded primarily within the unique contexts of individual courses. The SRES, currently in use by teachers from 19 departments, takes a holistic and more human-centric view of data—one that puts the relationship between teacher and student at the center. Our approach means that teachers’ pedagogical expertise in recognizing meaningful data, identifying subgroups of students for a range of support actions, and designing and deploying these actions, is facilitated by a customizable technology platform. We describe a case study of the application of this human-centric approach to learning analytics, including its impacts on improving student engagement and outcomes, and debate the cultural, pedagogical, and technical aspects of learning analytics implementation.
Danny Yen-Ting Liu, Kathryn Bartimote-Aufflick, Abelardo Pardo, Adam J. Bridgeman
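The SRES itself is a full web platform, but the core teacher workflow the abstract describes (selecting meaningful data, filtering a subgroup of students, and attaching a support action) can be sketched in a few lines. The column names, thresholds, and message below are hypothetical and are not taken from the SRES.

```python
import pandas as pd

# Hypothetical course data a teacher might assemble (not actual SRES fields).
students = pd.DataFrame({
    "name":           ["Alice", "Bob", "Chen", "Dana"],
    "email":          ["a@uni.edu", "b@uni.edu", "c@uni.edu", "d@uni.edu"],
    "quiz_1":         [85, 42, 58, 90],
    "lab_attendance": [5, 2, 4, 5],   # labs attended out of 5
})

# Teacher-defined rule: a low quiz score or poor attendance flags a student for outreach.
at_risk = students[(students["quiz_1"] < 50) | (students["lab_attendance"] <= 2)]

# A personalised support action: a templated message using the student's own data.
for _, row in at_risk.iterrows():
    print(f"To: {row['email']}\n"
          f"Hi {row['name']}, your quiz 1 result was {row['quiz_1']}% and you attended "
          f"{row['lab_attendance']}/5 labs. Drop-in help runs Thursdays 2-4pm.\n")
```

The design point made in the chapter is that the rule and the message remain in the teacher's hands; the platform only automates the filtering and delivery.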
Chapter 6. Overcoming the MOOC Data Deluge with Learning Analytic Dashboards
Abstract
With the proliferation of MOOCs and the large amount of data collected, many questions have been asked about their value and effectiveness. One of the key issues emerging is the difficulty of making sense of the available data. The use of analytic dashboards has been suggested to provide quick insights and distil the large volume of learner interaction data generated. These dashboards hold the promise of providing a contextualized view of data and facilitating useful research exploration. However, little has been done to define how these dashboards should be created, often resulting in a proliferation of systems for each new research agenda. We present our experience of building MOOC dashboards for two different platforms, Coursera and FutureLearn, motivated by a set of design goals with input from a diverse set of stakeholders. We demonstrate the features of the system and how it has served to make data accessible and usable. We report on problems faced, drawing on analyses of think-aloud sessions conducted with real educators, which have informed our dashboard design process.
Lorenzo Vigentini, Andrew Clayphan, Xia Zhang, Mahsa Chitsaz
Chapter 7. A Priori Knowledge in Learning Analytics
Abstract
Learning Analytics (LA) can be data driven: the process is oriented essentially by the data rather than by a theoretical background. In this case, the results can sometimes be difficult to exploit. This is why some LA processes are theory driven, that is, based on A Priori Knowledge (APK) from a theoretical background. Here, we investigate the relationship between APK and LA. We propose a “2-level framework” that considers LA as a level-2 learning process and includes five components: stakeholders, goals, data, technical approaches, and feedback. Based on this framework, a sample of LA-related works is analyzed to show how such works relate LA with APK. We show that most of the time the APK used for LA is the learning theory underpinning the student’s learning. However, it can be otherwise, and depending on the goal of the LA, it is sometimes fruitful to use another theory.
Jean Simon
Chapter 8. Knowledge Discovery from the Programme for International Student Assessment
Abstract
The Programme for International Student Assessment (PISA) is a worldwide study that assesses the proficiencies of 15-year-old students in reading, mathematics, and science every three years. Despite the high quality and open availability of the PISA data sets, which call for big data learning analytics, academic research using this rich and carefully collected data is surprisingly sparse. Our research contributes to reducing this deficit by discovering novel knowledge from the PISA data through the development and use of appropriate methods. Since Finland has been the country of most international interest in the PISA assessment, a relevant review of the Finnish educational system is provided. This chapter also gives a background on learning analytics and presents findings from a novel case study. Similar to the existing literature on learning analytics, the empirical part is based on a student model; however, unlike in the previous literature, our model represents a profile of a national student population. We compare Finland to other countries by hierarchically clustering these student profiles from all the countries that participated in the latest assessment and validating the results through statistical testing. Finally, an evaluation and interpretation of the variables that explain the differences between the students in Finland and those of the remaining PISA countries is presented. Based on our analysis, we conclude that, in global terms, learning time and good student-teacher relations are not as important as collaborative skills and humility in explaining students’ success in the PISA test.
Mirka Saarela, Tommi Kärkkäinen
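The chapter's student model aggregates PISA variables into one profile per country and clusters those profiles hierarchically. The sketch below shows the generic recipe with SciPy on made-up data; the actual profile variables, distance metric, and linkage used in the chapter are not specified here and are assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical country profiles: rows are countries, columns are aggregated
# PISA-derived features (e.g., mean learning time, collaboration index, ...).
countries = ["FIN", "SWE", "DEU", "JPN", "KOR", "USA"]
profiles = np.array([
    [0.2, 0.9, 0.8],
    [0.3, 0.8, 0.7],
    [0.5, 0.6, 0.6],
    [0.9, 0.5, 0.9],
    [0.8, 0.4, 0.9],
    [0.6, 0.5, 0.5],
])

# Standardise features so no single variable dominates the distances.
z = (profiles - profiles.mean(axis=0)) / profiles.std(axis=0)

# Agglomerative clustering with Ward linkage on the standardised profiles.
Z = linkage(z, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")   # cut the tree into two clusters

for country, label in zip(countries, labels):
    print(country, "-> cluster", label)
```

Cluster differences could then be examined variable by variable, which is the kind of comparison the chapter validates through statistical testing.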
Chapter 9. A Learning Analytics Approach for Job Scheduling on Cloud Servers
Abstract
Learning analytics improves teaching and learning procedures by using educational data. It uses analysis tools to carry out statistical evaluation of rich data and pattern recognition within the data. This chapter first describes four learning analytics methods used in educational institutions. Second, it proposes a learning analytics approach for job scheduling on cloud servers, called LAJOS. This approach applies a learning-based mechanism to prioritise users’ jobs in scheduling queues. It uses three basic attributes of jobs on cloud servers: “importance level”, “waiting time”, and “deadline time”. The cloud broker acts as a teacher and the local schedulers of cloud sites act as students. The broker teaches the local schedulers how to prioritise users’ jobs according to the values of their attributes. In the deployment phase, the effect of each attribute on system throughput is studied separately to select the best attribute. In the service phase, users’ jobs are prioritised by the computer systems according to the selected attribute. Simulation results show that the LAJOS approach is more efficient than some existing job scheduling methods in terms of schedule length and system throughput.
Mohammad Samadi Gharajeh
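LAJOS selects one of the three job attributes during the deployment phase and then prioritises incoming jobs by that attribute in the service phase. The sketch below mimics only the service-phase queue ordering; the attribute names come from the abstract, while the scoring direction and data structures are assumptions rather than the chapter's algorithm.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Job:
    priority: float
    name: str = field(compare=False)

def enqueue(queue, name, importance, waiting, deadline, selected="deadline time"):
    """Push a job with a priority key derived from the attribute chosen in deployment.

    Assumed scoring: higher importance, longer waiting, and nearer deadlines
    should all be served earlier (heapq pops the smallest key first)."""
    score = {
        "importance level": -importance,   # bigger importance -> smaller key
        "waiting time":     -waiting,      # longer wait -> smaller key
        "deadline time":    deadline,      # earlier deadline -> smaller key
    }[selected]
    heapq.heappush(queue, Job(score, name))

queue = []
enqueue(queue, "render-video", importance=2, waiting=30, deadline=120)
enqueue(queue, "backup-db",    importance=5, waiting=10, deadline=60)
enqueue(queue, "send-report",  importance=3, waiting=50, deadline=300)

while queue:
    print(heapq.heappop(queue).name)   # backup-db, render-video, send-report
```

Switching the selected attribute changes only the key computation, which matches the abstract's description of choosing the best-performing attribute in deployment and applying it in service.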
Backmatter
Metadata
Title
Learning Analytics: Fundaments, Applications, and Trends
Edited by
Alejandro Peña-Ayala
Copyright Year
2017
Electronic ISBN
978-3-319-52977-6
Print ISBN
978-3-319-52976-9
DOI
https://doi.org/10.1007/978-3-319-52977-6