Are these Changes an Improvement? Using Measures to Inform Homework Practices
Item
Title
Are these Changes an Improvement? Using Measures to Inform Homework Practices
Abstract/Description
Although the Carnegie Math Pathways has dramatically increased the success rates of students who place into remedial mathematics (Huang et al., 2016), members of the Pathways NIC are committed to continuously improving those outcomes. This effort involves investigating why students are not succeeding, identifying high-leverage areas in which to target changes, and supporting the testing of changes in practice. These activities are supported by the common measurement infrastructure of a NIC (Bryk et al., 2015; Yeager et al., 2013).
Students in the Pathways courses, Statway and Quantway, complete their out-of-class assignments on an online platform, which provides an embedded, minimally intrusive measurement opportunity for analyzing patterns in students' out-of-class engagement (see Paper 3 in this symposium). Exploring variation in the platform's performance data has revealed that students who demonstrate strong initial homework performance during the first module of the course are more likely to complete and pass the course. This has led the Pathways NIC to identify successful homework completion in the first four weeks of the term as a high-leverage improvement priority.
Members of the NIC have developed a number of promising homework practices that have been tested and refined in individual classrooms using Plan-Do-Study-Act cycles (Langley et al., 2009). While these tests have been encouraging, we have yet to identify which practices are most effective at improving students' homework habits, and in which combinations. The practical benefits of identifying the best combination of interventions are clear: instructors can target their energy toward practices that make a significant difference and avoid wasting resources on practices that do not.
We present findings from an “improvement sprint” focused on increasing Pathways students’ homework completion at the beginning of the term. The sprint uses a planned experimental design to determine whether the proposed changes are an improvement, and how they work across contexts. A planned experiment differs from other process improvement work in which people test changes over time, iterating based on real-time feedback from granular measures (Moen et al., 2012; Provost et al., 2011; Langley et al., 2009). The benefits of this approach to testing include the ability to discern interaction effects between change ideas and to learn efficiently about an event that occurs infrequently, such as the start of a new term.
In this study, three change ideas (i.e., promising homework practices) serve as factors in the experiment, and twenty instructors within a community college are each assigned a combination of factors to enact over the first four weeks of the term. The response variable, students’ homework completion, is evaluated at the end of the first four weeks and again at the end of the term to identify efficacious practices and combinations of practices. This paper reports the findings as well as lessons learned from enacting this type of embedded experiment in practice.
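As a minimal sketch of the kind of design described above (not the authors' actual procedure), the snippet below assigns instructors to the eight cells of a 2^3 full factorial and estimates a main effect from completion rates. The factor names, instructor count, and completion-rate values are illustrative assumptions only.

```python
# Sketch of a 2^3 factorial assignment, assuming three binary change ideas
# (factor names are hypothetical, not the study's actual practices).
import itertools
import random

FACTORS = ["guided_first_assignment", "weekly_progress_feedback", "peer_accountability"]

# All 2^3 = 8 combinations of enacted (1) / not enacted (0) factors.
cells = list(itertools.product([0, 1], repeat=len(FACTORS)))

# Spread 20 instructors across the 8 cells as evenly as possible.
instructors = [f"instructor_{i:02d}" for i in range(1, 21)]
random.seed(42)
random.shuffle(instructors)
assignment = {
    inst: dict(zip(FACTORS, cells[i % len(cells)]))
    for i, inst in enumerate(instructors)
}

def main_effect(completion_rates, assignment, factor):
    """Difference in mean homework-completion rate with vs. without a factor."""
    on = [completion_rates[i] for i in completion_rates if assignment[i][factor] == 1]
    off = [completion_rates[i] for i in completion_rates if assignment[i][factor] == 0]
    return sum(on) / len(on) - sum(off) / len(off)

# Usage, with hypothetical per-instructor completion rates:
# effect = main_effect({"instructor_01": 0.72, "instructor_02": 0.65, ...},
#                      assignment, "weekly_progress_feedback")
```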
Author/creator
Date
At conference
AERA Annual Meeting
Resource type
Research/Scholarly Media
Resource status/form
Presentation/Poster
Scholarship genre
Empirical
Open access/full-text available
No
Peer reviewed
No
Citation
Meyer, A., Grunow, A., & Krumm, A. E. (2017). Are these Changes an Improvement? Using Measures to Inform Homework Practices. AERA Annual Meeting, San Antonio, TX.