A supervised learning framework: using assessment to identify students at risk of dropping out of a MOOC

David Monllaó Olivé, Du Q. Huynh, Mark Reynolds, Martin Dougiamas, Damyon Wiese

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

Both educational data mining and learning analytics aim to understand learners and optimise learning processes in educational settings such as Moodle, a learning management system (LMS). Analytics in an LMS covers many different aspects, such as finding students at risk of abandoning a course or identifying students with difficulties before the assessments. Thus, there are multiple prediction models that can be explored. Prediction models can also target the course itself: for instance, will this assessment activity engage learners? To ease the evaluation and use of prediction models in Moodle, we abstract out the most relevant elements of prediction models and develop an analytics framework for Moodle. Apart from the software framework, we also present a case-study model that uses assessment-based variables to predict students at risk of dropping out of a massive open online course (MOOC) that was offered eight times from 2013 to 2018 to a total of 46,895 students. A neural network is trained with data from past courses, and the framework generates insights about at-risk students in ongoing courses. Predictions are generated after the first, second, and third quarters of the course. We achieve an average accuracy of 88.81%, an F1 score of 0.9337, and an area under the ROC curve of 73.12%.
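The approach summarised in the abstract, training a neural network on assessment-based features from past course runs to classify students in an ongoing course as at risk or not, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the feature names (`submission_rate`, `mean_grade`), the synthetic data, and the network size are all assumptions for demonstration only.

```python
# Illustrative sketch (assumed, not the paper's actual pipeline): train a
# small neural network on hypothetical assessment-based features to flag
# students at risk of dropping out.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for past-course data: one row per student, with
# hypothetical features such as assessment submission rate and mean grade.
n_students = 2000
submission_rate = rng.uniform(0, 1, n_students)
mean_grade = rng.uniform(0, 1, n_students)
X = np.column_stack([submission_rate, mean_grade])

# Synthetic label: low-engagement students are more likely to drop out
# (1 = at risk, 0 = not at risk); noise keeps the classes overlapping.
y = (submission_rate + mean_grade + rng.normal(0, 0.3, n_students) < 1.0).astype(int)

# Train on "past courses", evaluate on a held-out "ongoing course" split.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

In the paper's setting, the same trained model would be queried again after each quarter of the course as new assessment data accumulates, so predictions sharpen as the course progresses.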

Original language: English
Journal: Journal of Computing in Higher Education
DOIs
Publication status: Published - 24 May 2019

