MVLRI® has launched a series of quantitative research reports exploring the characteristics of students in state virtual school courses, specifically those who took courses for credit recovery (CR). Using Michigan Virtual School® (now known as Michigan Virtual™) data, the second and third reports in the series presented data-driven learning profiles focused on motivation to complete course assignments and academic time invested in courses. Those reports targeted Algebra 2 courses and applied time series clustering methods to two variables: weekly attempted scores and weekly totals of minutes recorded in the learning management system (LMS). This final report of the series extends the work on learning profiles to the other subject areas most frequently taken by CR students: Algebra 1, English Language & Literature 9, and U.S. History & Geography 1. We discuss the clustering results as a source of data-driven benchmarks for optimal course behavior patterns, which instructors and course mentors may use to guide their monitoring of students’ progress.
Jemma Bae Kwon, Michigan Virtual Learning Research Institute
WHAT WE ALREADY KNOW ABOUT THIS TOPIC:
A survey study demonstrated that students tended to hold more negative perceptions of mathematics courses than of English and social studies courses. They perceived their online learning experience in mathematics courses as less successful, less rich, and less likely to be recommended to other students.
It is important to ensure that students have an effective means of asking questions; timely support; teachers’ demonstration, modeling, and additional explanation; ways to compensate for a lack of experience with online mathematics tools and prerequisite skills; and opportunities for collaboration. The authors also highlighted that students’ putting forth sufficient effort was critical.
WHAT THIS REPORT ADDS:
This study tried to fill a gap in the research by focusing on students’ behaviors within courses. Data on attempts to complete course assignments yielded, for both mathematics and non-mathematics subject areas, a largest group with the profile of “consistent and persistent coursework throughout the semester.”
When it comes to academic time, the representative cluster was not characterized by large time investments resulting from consistent coursework or from repeated increases in academic time. Rather, the largest group featured a sharp surge of time investment in a particular segment, for instance, the final week.
Examining these results alongside course completion status (i.e., passing or failing members) suggested that a greater amount of academic time and/or multiple peaks in time investment could be related to the successful learner group in non-mathematics courses, but not necessarily in mathematics courses.
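The reports describe their method only as time series clustering on weekly variables. As an illustration, not the reports’ actual implementation, the following minimal sketch clusters z-normalized weekly trajectories with plain k-means and a deterministic farthest-point initialization; all function names and the synthetic data below are assumptions for illustration.

```python
import numpy as np

def znormalize(series):
    """Z-normalize each weekly series so clusters reflect shape, not magnitude."""
    X = np.asarray(series, dtype=float)
    mean = X.mean(axis=1, keepdims=True)
    std = X.std(axis=1, keepdims=True)
    std[std == 0] = 1.0  # guard flat series against division by zero
    return (X - mean) / std

def cluster_weekly_series(series, k, n_iter=50):
    """Plain k-means over aligned weekly vectors (a stand-in for the
    time series clustering named in the report, not its actual method)."""
    X = znormalize(series)
    # Deterministic farthest-point initialization: start from the first
    # series, then repeatedly add the series farthest from all chosen ones.
    chosen = [0]
    for _ in range(1, k):
        dists = ((X[:, None, :] - X[chosen][None, :, :]) ** 2).sum(axis=2)
        chosen.append(int(dists.min(axis=1).argmax()))
    centroids = X[chosen].copy()
    for _ in range(n_iter):
        # Assign each trajectory to its nearest centroid, then recompute means.
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids
```

On synthetic data, a group of “steady” weekly-minute trajectories and a group of “final surge” trajectories separate cleanly into two clusters, mirroring the kinds of profiles the reports describe.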
IMPLICATIONS FOR PRACTICE AND/OR POLICY:
Data-driven benchmarks carry strong implications for instructional practitioners, e.g., teachers and academic mentors. Because student autonomy is important in terms of pacing, practitioners typically monitor student pacing and progress against either a pacing guide, which suggests benchmarks (e.g., which tasks, and how many, a student should finish within a given timeframe to be successful), or their own personal standards.
However, a pacing guide is more of an ideal expectation, and practitioners’ personal standards are not necessarily consistent and may not reflect “evidence-based practices.” Data-driven benchmarks, by contrast, are grounded in analyses of real data. The following data-driven benchmarks for optimal course behavior patterns may therefore help practitioners monitor students’ progress over time.
For the regular semester, passing members of the persistent engagement group in mathematics courses attempted 13% of course points by week 5, 36% by week 10, 59% by week 15, and 94% by the final week. The non-mathematics results were very similar: 14% of course points by week 5, 30% by week 10, 54% by week 15, and 91% by the final week.
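As a sketch of how a practitioner might operationalize these checkpoints, the snippet below encodes the regular-semester figures and flags a student who trails the most recent checkpoint. The 20-week final checkpoint, the 0.05 tolerance, and all names are illustrative assumptions, not part of the report.

```python
# Regular-semester benchmarks above: cumulative share of course points
# attempted by given weeks, for passing members of the largest cluster.
# The final week is assumed here to be week 20 (the report says only
# "the final week").
BENCHMARKS = {
    "mathematics":     {5: 0.13, 10: 0.36, 15: 0.59, 20: 0.94},
    "non-mathematics": {5: 0.14, 10: 0.30, 15: 0.54, 20: 0.91},
}

def behind_benchmark(subject, week, attempted_share, tolerance=0.05):
    """Return True if the student's cumulative attempted share trails the
    most recent checkpoint by more than `tolerance` (a hypothetical knob)."""
    reached = [w for w in BENCHMARKS[subject] if w <= week]
    if not reached:
        return False  # no checkpoint reached yet
    return attempted_share < BENCHMARKS[subject][max(reached)] - tolerance
```

For example, a mathematics student who has attempted only 20% of course points by week 10 would be flagged, while one at 35% would not.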
With regard to academic time recorded in the LMS, passing members of the “final surge” group logged 386 minutes during the first five weeks (Q1), 271 minutes during Q2, 227 minutes during Q3, and 750 minutes during the last five weeks (Q4). Benchmarks for the non-mathematics courses came from the group profiled as “frequent small peaks,” with 464 minutes during Q1, 668 during Q2, 528 during Q3, and 668 during the final weeks.
Summer semester’s mathematics benchmarks were obtained from the profile of “a slow start followed by persistent engagement,” with 0% of course points attempted by week 2, 11% by week 5, 41% by week 8, and 80% by the final (10th) week, while non-mathematics benchmarks came from the profile of “persistent throughout the semester,” with 4% of course points by week 2, 37% by week 5, 80% by week 8, and 94% by the final week.
Regarding time recorded in the LMS, mathematics benchmarks came from the cluster profiled as “significant time investments from the beginning of the semester,” with 454 minutes during the first two weeks, 1,099 during the next three weeks, 467 from week 6 to week 8, and 214 during the final two weeks. Non-mathematics benchmarks represented the passing members of the “persistent throughout the semester” profile, with 109 minutes during the first two weeks, 433 during the next three weeks, 707 from week 6 to week 8, and 1,608 during the last two weeks.
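One way to compare these pacing shapes directly is to convert each profile’s per-segment minutes into shares of total recorded time. The sketch below (variable and function names assumed for illustration) makes the contrast visible: the mathematics profile front-loads its time, while the non-mathematics profile back-loads it.

```python
# Summer-semester time benchmarks (minutes per segment) as stated above.
# Segments: weeks 1-2, weeks 3-5, weeks 6-8, weeks 9-10.
TIME_BENCHMARKS = {
    "mathematics":     [454, 1099, 467, 214],
    "non-mathematics": [109, 433, 707, 1608],
}

def segment_shares(minutes):
    """Each segment's share of total recorded time, rounded to 2 decimals."""
    total = sum(minutes)
    return [round(m / total, 2) for m in minutes]
```

Roughly half of the mathematics profile’s recorded time falls in weeks 3 through 5, whereas over half of the non-mathematics profile’s time falls in the last two weeks.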