5 Do This, Not That Tips for Better Progress Monitoring
When monitoring the progress of students who are receiving intervention, schools spend a lot of time setting the right goal and planning when to collect data each week (or sometimes every other week). The hardest part, though, is yet to come: the entire purpose of progress monitoring data is to act on it.
Here are five Do This/Not That practices to help teams make better decisions with progress monitoring data.
Not That: React to every dip or spike in the data.
Do This: Look at the overall trend.
A single data point — whether a dip or a spike — rarely tells the full story and can lead teams to overreact or underreact. Focus on the direction and consistency of the line over time before drawing any conclusions.
Not That: Change course too quickly before the intervention has had time to work.
Do This: Give instruction enough time to have an effect.
Research suggests that most evidence-based interventions need at least eight to ten weeks of consistent implementation before their true impact can be measured. Changing course too soon doesn't give students the time they need to close skill gaps.
Not That: Assume that any upward trend in a student's data means the intervention is working.
Do This: Compare the student's growth rate to that of their peers.
A student can be making progress and still be falling further behind their peers. If the gap between the student and grade-level expectations continues to widen, the current intervention may need to be intensified, even though the student is making gains.
Not That: Assume that a student's lack of progress is about the student.
Do This: Consider the many reasons a student may not be responding, most of which are not about the student at all.
Schools are complex places, and the most common culprit is that the intervention is not being implemented as intended. It may also be that the intervention cannot be delivered at a strong enough dosage, or that it isn't targeting the right skill gap. Shifting the lens from the student to the instructional system often reveals practical adjustments to instruction and intervention.
Not That: Review only student progress graphs during progress monitoring meetings.
Do This: Review student and implementation data at the same time.
A student's graph tells only part of the story if you don't also know whether the intervention was delivered as intended. An intervention that is implemented inconsistently cannot be expected to produce consistent results.
By avoiding these common missteps, teams can make better decisions. In the end, it is all about making informed decisions that support students. If you would like more guidance on setting goals, collecting progress monitoring data, and using that information to take action, I invite you to learn more about our MTSS Data Academy. Our team is here to help your staff make the most of their data.