Tracking Progress & How School Data Is Manipulated: A Parent’s Guide

When your child has an IEP, progress monitoring isn’t just a box schools check off—it’s the heart of whether the plan is working. The data schools collect is supposed to tell you if your child is growing, whether instruction needs to be adjusted, and how future goals and services should be shaped. Most importantly, it should give parents an honest picture of their child’s learning. Unfortunately, sometimes the data parents receive is not what it seems. Numbers can be inflated, tests misused, or results misrepresented, leaving families with the impression that progress is happening when, in reality, it is not.

The purpose of progress data is simple: it should show whether the IEP is working. It should guide instruction, help the team adjust strategies when needed, and keep families informed so they can be equal partners in decision-making. When that data is wrong, every decision that flows from it—placement, services, even whether a child’s goals remain appropriate—can be wrong too.

Parents need to be alert for red flags. Sometimes schools give repeated practice runs of the same test until the child improves, then record the final attempt as the true score. In other cases, teachers may preteach the material right before an assessment, or provide subtle prompts during a task that was supposed to be independent. Schools may use easier versions of tests without documenting the change, or they may report scores without grade-level standards for context. These tactics may make a student appear more successful on paper, but they do not demonstrate true mastery.

There are also common ways data is directly manipulated. A failing score might be raised to an “average” by altering how the grade is calculated. Modified assessments may be given without noting that they were different from the original. A teacher might reread a passage before timing a reading fluency test, invalidating the results. Sometimes accommodations are applied in ways that actually erase the purpose of the measurement. In too many cases, subjective observations—such as “he seems to be doing better”—are presented in place of actual data. These practices do not measure growth. They hide the lack of it.

Equally concerning is when schools use inappropriate or irrelevant assessments to claim progress. Report card grades, for example, tell little about whether an IEP goal is being met because they reflect effort, participation, and classroom behavior as much as academic skill. Curriculum-based tests may not measure the specific skills targeted in the IEP. Teacher-created tests or work samples often fail to show whether the child completed the work independently. On the other hand, truly appropriate tools include research-based programs such as DIBELS or Wilson, standardized rubrics, and data sheets that match the IEP goals directly.

Even when the right tools exist, data collection can still be invalid. If a student receives adult help during the task, or if the assessment does not align with the skill being measured, the results are misleading. Likewise, if baseline data or trend lines are missing, there is no way to know whether real progress has occurred. Perhaps most concerning is when low scores are dismissed as a matter of poor effort or lack of focus rather than being taken seriously as an indicator of need.

One widespread problem is the misuse of the FAST test. Parents are often told that a score on this benchmark assessment shows whether a child has met their IEP goals. In reality, FAST was never designed for that purpose. A Level 2 score does not prove mastery of a reading or comprehension goal. If the IEP is targeting fluency, comprehension, or inference, then tools such as DIBELS, Lexile levels, or reading probes should be used instead.

Another hallmark of real data is the presence of a trend line. Every IEP goal should be supported by data collected at regular intervals, displayed in a visual chart or graph, and showing a trajectory of expected growth. If a teacher cannot show you a graph, they likely are not tracking progress in a meaningful way.

Parents should also listen closely when data is explained. Be cautious if a teacher cannot interpret their numbers, relies only on observations or feelings, or compares your child to classmates instead of grade-level standards. Blaming the student for lack of focus or effort is another warning sign. Data is only useful when it is objective, reproducible, and unbiased. In practice, clean data is data that is directly aligned to the IEP goal, collected under the right conditions, taken from standardized tools, and based on independent student performance. Dirty data, by contrast, often comes from group lessons, includes adult prompts, measures unrelated skills, or relies solely on grades.

To safeguard against misleading data, parents should consistently ask questions. It is reasonable to ask exactly what tool was used, whether the task was completed independently, and whether the assessment was standardized or simply curriculum-based. Parents should also ask to see a chart or trend line and, when possible, the actual student work that supports the reported progress. Documenting these answers creates a record that can protect both you and your child.

Legally, schools also have responsibilities. Within thirty days of an IEP’s start, they must begin tracking each goal, provide a copy of all assessment procedures, use objective and measurable tools, and report progress according to the schedule outlined in the IEP, usually every four and a half to nine weeks. If this does not happen, parents should request a meeting and put their concerns in writing.

The most important action parents can take is to stay engaged. Review IEP goal data every grading period, ask to see the raw data rather than just summaries, compare school reports with what you see at home, and keep copies of everything. If you notice red flags, do not hesitate to raise them formally.

At IEP Partner, we often hear from parents who feel lost in a sea of confusing numbers or reassurances that “everything is fine.” Our role is to help you make sense of the data, identify red flags, and take action when progress is overstated or simply fabricated. Every child deserves more than inflated numbers; they deserve real, measurable growth that sets them up for success.

The takeaway is simple: do not take progress reports at face value. Ask questions, dig into the details, and insist on clear, accurate information. Real progress is not about numbers on a page—it is about your child gaining the skills and confidence they need for the future.
