Clicks and Grades: Relationship Between Student Interactions With A Virtual Learning Environment and Outcomes
Introduction
The recent growth in online courses has led to an increase in learning options for both traditional and nontraditional students. In these courses, students use a Virtual Learning Environment (VLE) to simulate the experience of a real-world classroom. Aside from the obvious benefits for students, one benefit that is only now being explored is the ability to assess, in real time, the level of interaction with the course material. Recent work has shown that such information can be used to spot students in danger of failing and to intervene in a timely manner. Such interventions, however, require human resources, which are costly and not scalable. To that end, analytics data from the Open University in the United Kingdom was analyzed to see whether there are clear differences in usage between passing and failing students. The hope is that such information can be used to create an automated tool that helps students improve their overall grades.
The Data
The OULAD (Open University Learning Analytics Dataset), as it's known, consists of 7 csv files totaling more than 450 MB in size. It contains information for seven courses, each running from February to October, with two years' worth of data available. Student demographics and time-coded student interactions with the VLE make up the bulk of the data and were the focus of the analysis.
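As a rough sketch of how the data can be loaded, the snippet below reads the two tables used throughout this analysis. The file paths are assumed to point at the unzipped OULAD csv files, and the file names follow the published dataset layout; this is not necessarily the exact code used for the analysis.

```python
import pandas as pd

# Demographics and final outcome per student (one row per student per course).
student_info = pd.read_csv("studentInfo.csv")

# Time-coded student interactions with the VLE (one row per student, per
# VLE site, per day, with a sum_click column counting interactions).
student_vle = pd.read_csv("studentVle.csv")

print(student_info.shape)
print(student_vle.shape)
```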
Clicks
The histograms above show the distribution of student access (clicks) for the same course held a year apart. It is clear from the plots that most students register between 0 and 2,000 clicks. So are there differences between the click counts of the best students and those who failed?
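A histogram like the ones above can be produced by summing clicks per student for a single course presentation, continuing from the loading snippet. The module and presentation codes below are illustrative placeholders, not necessarily the course shown in the plots.

```python
import matplotlib.pyplot as plt

# Restrict to one course presentation and total the clicks per student.
mask = (student_vle["code_module"] == "BBB") & \
       (student_vle["code_presentation"] == "2013B")
clicks = student_vle[mask].groupby("id_student")["sum_click"].sum()

clicks.plot.hist(bins=50)
plt.xlabel("Total clicks per student")
plt.ylabel("Number of students")
plt.show()
```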
Content Access Time
The number of clicks per student is not the only signal of interest; the mean time students take to access content, relative to one another, also matters. While most students take similar times to access content, a small percentage take either very long or very little time. Do the students who take the least time do the best? Likewise, do the students who take the longest end up failing the course?
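One reasonable way to approximate access time, assuming the standard OULAD columns, is to use the date column in studentVle.csv, which records the day of each interaction relative to the module start. Averaging it per student gives a rough early-versus-late access signal; this is a sketch of that idea, not the original computation.

```python
# Mean interaction day per student, relative to the start of the module.
mean_access = student_vle[mask].groupby("id_student")["date"].mean()
mean_access.name = "mean_access_day"

mean_access.plot.hist(bins=50)
plt.xlabel("Mean access day (relative to module start)")
plt.ylabel("Number of students")
plt.show()
```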
Clicks and Grades
The relationship between the number of clicks and the overall student outcome can be seen in the scatter plot above. Students who failed (green dots) exhibit much less interaction with the VLE than those who passed. Unsurprisingly, the best students were, on average, accessing the VLE more than anyone else.
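A plot in this spirit can be built by joining the per-student click totals with the final result from studentInfo.csv and coloring the points by outcome. The color mapping and plot layout here are illustrative assumptions, not a reconstruction of the original figure.

```python
# Outcomes for the same illustrative course presentation as above.
info_mask = (student_info["code_module"] == "BBB") & \
            (student_info["code_presentation"] == "2013B")
outcomes = student_info.loc[info_mask, ["id_student", "final_result"]]

merged = clicks.reset_index().merge(outcomes, on="id_student")

# Colour each student by outcome (green = Fail, as in the figure description).
colours = {"Fail": "green", "Pass": "blue",
           "Distinction": "orange", "Withdrawn": "red"}
plt.scatter(merged.index, merged["sum_click"],
            c=merged["final_result"].map(colours).tolist(), s=10)
plt.xlabel("Student (arbitrary index)")
plt.ylabel("Total clicks")
plt.show()
```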
Mean Access Times
The mean access time scatter plot shows that students who passed or did well have a much narrower range of access times than students who failed. In fact, students who failed were accessing the content earlier than those who passed, which was a bit surprising; one would assume that the earlier a student accesses content, the better they would be likely to do.
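The spread the scatter plot hints at can also be summarized numerically. Continuing from the earlier snippets, a per-outcome describe() makes the narrower range for passing students easy to check; again, this is a sketch rather than the original analysis code.

```python
# Distribution of mean access day, grouped by final outcome.
access_vs_result = mean_access.reset_index().merge(outcomes, on="id_student")
print(access_vs_result.groupby("final_result")["mean_access_day"].describe())
```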
Age and Gender
When the level of access is compared by age category, the results are unexpected. One would assume that younger students (< 35 years), being "more" tech savvy, would access the course content more than older students. This isn't the case according to the data: students in the 55-and-over category were the ones using the VLE the most. One possibility is that younger students save the content to their local devices, so they have less need to continually interact with the VLE. As expected, there seems to be little difference between the interaction levels of men and women, though statistical testing might show otherwise.
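These breakdowns follow directly from the age_band and gender columns in studentInfo.csv. The snippet below, continuing from the merged table built earlier, shows one way to compute average clicks per demographic group; it is an assumed sketch of the comparison, not the author's original code.

```python
# Attach age band and gender, then average total clicks per group.
demo = merged.merge(
    student_info.loc[info_mask, ["id_student", "age_band", "gender"]],
    on="id_student")

print(demo.groupby("age_band")["sum_click"].mean())
print(demo.groupby("gender")["sum_click"].mean())
```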
Conclusion and Recommendation
Students who interact more with the VLE tend to perform better. In contrast, students who access the materials early tend to perform worse. Interaction level also seems to be age related, but gender doesn't seem to play much of a role. Based on these results, a data-driven widget could be added to the VLE to encourage students to interact with it in meaningful ways based on their current grades and demographics. This approach would be both scalable and individualized.