Full metadata
Title
Identifying relevant interaction metrics for predicting student performance in a generic learning content management system
Description
The growing use of Learning Management Systems (LMS) in classrooms has enabled a great amount of data to be collected about the study behavior of students. Previous research has interpreted the collected LMS usage data to identify the most effective study habits for students. Professors can then use these interpretations to predict which students will perform well and which will perform poorly in the rest of the course, allowing them to better assist students in need. However, these research efforts have largely analyzed metrics that are specific to certain graphical interfaces, ways of answering questions, or particular pages on an LMS. As a result, the analyses are relevant only to classrooms that use the specific LMS being studied.
For this thesis, behavior metrics obtained by the Organic Practice Environment (OPE) LMS at Arizona State University were compared to student performance in Dr. Ian Gould’s Organic Chemistry I course. Each metric gathered was generic enough to be potentially usable by any LMS, making the results relevant to a larger number of classrooms. Using a combination of bivariate correlation analysis, group mean comparisons, linear regression model generation, and outlier analysis, the metrics that correlate best with exam performance were identified. The results indicate that total usage of the LMS, the amount of cramming done before exams, the correctness of submitted responses, and the duration of submitted responses all demonstrate a strong correlation with exam scores.
Date Created
2015
Contributors
- Beerman, Eric (Author)
- VanLehn, Kurt (Thesis advisor)
- Gould, Ian (Committee member)
- Hsiao, Ihan (Committee member)
- Arizona State University (Publisher)
Topical Subject
Resource Type
Extent
vi, 76 pages : color illustrations
Language
eng
Copyright Statement
In Copyright
Primary Member of
Peer-reviewed
No
Open Access
No
Handle
https://hdl.handle.net/2286/R.I.36042
Statement of Responsibility
by Eric Beerman
Description Source
Viewed on January 19, 2016
Level of coding
full
Note
thesis
Partial requirement for: M.S., Arizona State University, 2015
bibliography
Includes bibliographical references (page 70)
Field of study: Computer science
System Created
- 2015-12-01 07:05:28
System Modified
- 2021-08-30 01:26:19