Pacing Guide for Success in Online Mathematics Courses


Jemma Bae Kwon & Kristen DeBruler

Online gradebooks situated within a learning management system (LMS) go far beyond simply providing a space to record student grades; they offer a dynamic tool for monitoring student progress. For gradable course content (either auto- or instructor-graded), reports can be easily generated detailing how many course points a student attempted to earn and how many points that same student actually earned. All of this is time-stamped by the LMS, providing chronological information as well.

In 2018, using students’ month-by-month grade histories from the 2015-16 academic year, Michigan Virtual Learning Research Institute (MVLRI) released a report on students’ learning trajectories in Michigan Virtual online mathematics courses over the course of a single semester (Kwon, 2018). That study used each student’s greatest earned score in a given month as the dependent variable, allowing the researchers to observe the same student at multiple points in time and, in turn, to understand changes in student behavior (i.e., earning course points by completing gradable content) over the course of the semester. This blog post details a continuation of that research using data from the 2016-17 academic year.
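
As a rough illustration of how such a monthly outcome variable might be derived from a time-stamped gradebook export, consider the sketch below. The file and column names (student_id, points_earned, graded_at, course_points_possible) are hypothetical assumptions, not Michigan Virtual’s actual schema.

```python
import pandas as pd

# Hypothetical gradebook export: one row per graded item per student,
# with a timestamp and points earned. Column names are assumptions.
grades = pd.read_csv("gradebook_export.csv", parse_dates=["graded_at"])

# Cumulative percentage of total course points earned, per student.
grades = grades.sort_values("graded_at")
grades["cum_pct"] = (
    grades.groupby("student_id")["points_earned"].cumsum()
    / grades["course_points_possible"] * 100
)

# Greatest cumulative score reached within each month: the study's
# dependent variable, one observation per student per month.
grades["month"] = grades["graded_at"].dt.to_period("M")
monthly_greatest = (
    grades.groupby(["student_id", "month"])["cum_pct"].max().unstack()
)
```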

Growth modeling (GM) was an excellent methodological fit for that study. However, GM relies on an assumption of a homogeneous population, something that was not satisfied in this iteration of the research. For instance, learning behaviors and outcomes would differ between students taking AP Calculus because it was not available at their local school and students taking Algebra I to recover previously failed credit. With that in mind, the researchers selected growth mixture modeling (GMM), a robust method of analysis that does not rely on the assumption of a homogeneous population. This analytic approach first identifies distinct groups of individuals (i.e., clusters or latent classes) and then performs growth modeling within each group (i.e., estimates a trajectory for each cluster).
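
A true GMM estimates class membership and growth parameters jointly, typically in specialized software such as Mplus or R’s lcmm package. The sketch below is only a simplified two-step approximation in Python, continuing from the data-preparation sketch above: cluster the monthly score vectors, then fit a trajectory to each cluster’s means.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# X: one row per enrollment, one column per month (the five monthly
# greatest scores), taken from the sketch above.
X = monthly_greatest.to_numpy()
months = np.arange(1, 6)

# Step 1: identify latent classes (clusters) from the score trajectories.
gmm = GaussianMixture(n_components=4, random_state=0).fit(X)
labels = gmm.predict(X)

# Step 2: fit a cubic growth curve to each class's mean monthly scores.
for k in range(gmm.n_components):
    class_means = X[labels == k].mean(axis=0)
    coefs = np.polyfit(months, class_means, deg=3)
    print(f"class {k}: share = {np.mean(labels == k):.1%}, cubic coefs = {coefs}")
```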

The final data set for the 2018 study included the greatest earned score at five occasions (one per month). Model estimation examined multiple possible scenarios for the number of distinct groups (clusters) and the shape of the trajectory that best fit the data. After reviewing model-fit indices, the four-latent-class cubic model was chosen. This model indicated four learning-trajectory profiles in online mathematics courses: (a) nearly linear growth; (b) a steep increase in student scores as the end of the semester approached; (c) hardly any growth over the semester; and (d) strong early achievement. The first profile (nearly linear growth) constituted the largest group, with approximately three quarters (72.7%) of enrollments in the study sample.
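
Model selection of this kind is typically driven by information criteria such as BIC. Continuing the simplified approximation above, a minimal sketch of index-based comparison might look as follows; note that the actual study compared full GMM specifications varying both the class count and the polynomial shape, not just the number of clusters.

```python
# Compare candidate class counts by BIC; lower is better.
for n_classes in range(1, 6):
    candidate = GaussianMixture(n_components=n_classes, random_state=0).fit(X)
    print(f"{n_classes} classes: BIC = {candidate.bic(X):.0f}")
```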

This profile of nearly linear growth over the course of the semester is indicative of the tendency to complete gradable course content on pace, and not postpone completion of content until the final weeks of the semester. This pattern of linear growth is one promoted by the pacing guides within each course.

Given that most students fell into this linear growth trajectory, the researchers wanted to examine other academic years for similar trajectory profiles. Should this profile remain prevalent, this research may give instructors ways to better identify students who are likely to remain on pace and, conversely, those who are likely to fall behind.

To build on the 2018 study, researchers selected the 2016-17 academic year for analysis. Data from the 2016-17 year had some notable differences from the 2015-16 year. Consumer and Foundation mathematics courses were excluded from the current study, as were AP Statistics courses that were hosted through a third-party provider and not on the Michigan Virtual LMS. Also, in the 2016-17 school year, Michigan Virtual launched a specific credit recovery program, which altered the student composition of non-credit recovery courses.

The results of the GMM for the 2016-17 data identified the two-latent-class linear model as the best-fit model, with a vast majority of students (99.65%) belonging to class 1. Table 1 summarizes the results of both the 2015-16 and 2016-17 analyses. GMM for both academic years identified clusters sharing similar linear trajectory profiles. The majority clusters (class 4 for 2015-16, class 1 for 2016-17) were then split into four sub-groups based on earned score in the final month of the semester in order to draw practical implications.

Table 1. Growth Mixture Modeling Best-Fit Model Results

2015-16: Four-class cubic model                2016-17: Two-class linear model
Class 1 (1.3%)   hardly any growth             Class 1 (99.65%)  nearly linear
Class 2 (12.2%)  early completion              Class 2 (0.35%)
Class 3 (13.8%)  steeper as approaching final
Class 4 (72.7%)  nearly linear
Table 2. Trajectory of Linear Growth Profiles

                           Class 4 in 2015-16 Study       Class 1 in 2016-17 Study
                           (n = 1,328)                    (n = 1,149)
Final Month's Score (S)    %      M1   M2   M3   M4       %      M1   M2   M3   M4
S ≥ 90                     42.8   17   40   61   81       36.2   11   29   46   69
90 > S ≥ 80                28.8   14   32   50   72       26.7    8   23   37   58
80 > S ≥ 70                16.9   11   25   39   60       15.6    6   19   31   49
70 > S ≥ 60                 9.2    8   19   31   51        6.6    5   15   24   39

Note: M1-M4 are the average scores (percentage of course points earned) at the end of months 1 through 4.

Table 2 presents monthly average scores by performance group, based on final month's scores, for those groups whose final month score was greater than or equal to 60%. That is, students who fell into the linear cluster were divided into four groups: the highest group, whose final month's scores were greater than or equal to 90; the second highest, whose final month's scores were between 80 and 90; and so on.
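
A grouping like this is straightforward to reproduce from the monthly scores. The sketch below bins final-month scores into Table 2's four ranges, continuing the earlier sketches; treating cluster label 0 as the majority (linear) class is an assumption for illustration.

```python
import pandas as pd

# Final-month scores for students in the majority (linear) class.
final_scores = pd.Series(X[labels == 0, -1])

groups = pd.cut(
    final_scores,
    bins=[60, 70, 80, 90, float("inf")],
    right=False,  # intervals: [60, 70), [70, 80), [80, 90), [90, inf)
    labels=["70 > S >= 60", "80 > S >= 70", "90 > S >= 80", "S >= 90"],
)
# Share of the class falling into each performance group (scores below
# 60 fall outside the bins, matching Table 2's coverage).
print(groups.value_counts(normalize=True).mul(100).round(1))
```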

According to Table 2, the highest performing group for the 2016-17 academic year (students whose final month's score was greater than or equal to 90% of course points) earned at least 30% of course points in the fourth or fifth month of the course. The high performing groups from the 2015-16 study demonstrated clearer linearity in the trend across months and robust growth from the beginning to the end of the semester. Specifically, students whose earned scores in the final month were at least 80% of course points showed month-over-month increases of approximately 20 percentage points. This may be attributed to the course pacing guides, which follow a similar pattern.

Meanwhile, the lowest performing group in the final month (final month's score between 60% and 70%) was characterized by increases of only approximately 10 percentage points from the first to the second month and from the second to the third. This growth pattern in turn required students to earn a significant portion of course points in the final months of the course, an unrealistic and unattainable expectation. This pattern of low month-by-month increases may be a Maginot line for success in mathematics courses; that is, students may be largely unable to overcome such small growth early in the course. Practically, if instructor progress monitoring shows completion of only 10% of course tasks during the first two months of a course, that student is unlikely to successfully complete the course and may benefit from additional motivational, affective, and cognitive supports.
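
One possible operationalization of that progress-monitoring rule is sketched below; it is a rule-of-thumb flag, not the study's statistical model, and the 10% threshold simply mirrors the observation above.

```python
def needs_intervention(cum_pct_by_month, threshold=10.0):
    """Flag a student whose cumulative earned score is at or below
    `threshold` percent of course points after the first two months,
    the low-growth pattern associated here with non-completion."""
    return cum_pct_by_month[1] <= threshold  # index 1 = end of month 2

# Example: 4% after month 1, 9% after month 2 -> flagged for support.
print(needs_intervention([4, 9, 15, 22, 30]))  # True
```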

This research, both the original 2018 report and the expanded follow-up using 2016-17 data, provides a more robust understanding of the patterns of student behavior that lead to success and of the patterns that suggest students need instructor intervention and support. These data are intended to give instructors more information, to help them identify struggling students sooner, and to get those students back on track before the end of the semester. Certainly, averages and probabilities can never replace instructor experience and intuition, but this research can support instructors and provide valuable data to inform their decision-making. We hope to continue this and similar lines of research in the coming years to provide clear, accurate, and real-time progress monitoring and flagging for online instructors.

References

Kwon, J. B. (2018). Learning trajectories in online mathematics courses. Lansing, MI: Michigan Virtual University. Retrieved from https://mvlri.org/research/publications/learning-trajectories-in-online-mathematics-courses/

Jemma Bae Kwon

Jemma Bae Kwon is an assistant professor in the Department of Teaching Credentials at California State University—Sacramento. She works with special education and elementary school teacher candidates in courses on curriculum and instructional strategies for students with special needs and on mathematics curriculum and instruction for the diverse K-8 classroom. Her research interests span both mathematics learning and teacher education.

Kristen DeBruler

Dr. Kristen DeBruler received her doctorate in Educational Psychology and Educational Technology from Michigan State University. She taught in the Master of Arts in Educational Technology program at Michigan State University for three years. Her work focuses on K-12 online learning policy in Michigan and nationwide, as well as on understanding online learning best practices.


Michigan Virtual Learning Research Institute

The Michigan Virtual Learning Research Institute (MVLRI) is a non-biased organization that exists to expand Michigan’s ability to support new learning models, engage in active research to inform new policies in online and blended learning, and strengthen the state’s infrastructures for sharing best practices. MVLRI works with all online learning environments to develop the best practices for the industry as a whole.
