Last week, Leo Casey gave Edwize readers the real story of Pascal Mauclair, whom the NY Post declared was “at the bottom of the heap” when the DOE released the Teacher Data Reports to the press. The DOE gave Ms. Mauclair a “0” on her report, but the results seemed, to put it mildly, arbitrary. As Casey pointed out, Ms. Mauclair was graded on a small number (11) of high-need (ESL) students who were compared to other students learning in very different, departmentalized classrooms. Aside from that, Ms. Mauclair has a reputation as an excellent teacher. As her principal said, “I would put my own child in her class.”
All this alone should be enough to clear Ms. Mauclair’s name. But this week fresh evidence shows that Ms. Mauclair’s report should be declared invalid altogether by the DOE.
In order to understand the problem with Ms. Mauclair’s report, you need to understand the important role of other-subject tests in the value-added formulas. When we think of a math report we think of the progress a teacher’s students make in math from the prior year to the current year, weighted for various factors, or variables, such as gender, poverty, attendance, and so on. But what we may not realize is that the most determinative variable is the student’s score from the other subject. In the case of a math report, for example, the prior year’s ELA score is generally considered a better predictor of the students’ next math score than are poverty, gender, or any of the other variables that the formulas factor in.
In fact, the score of the other test is so important that some of the most common value-added formulas do not even bother with any of the other variables. It’s just the tests and the other-subject tests. And some folks say these formulas are actually more reliable than formulas, like the TDRs, that use lots of variables. In any case, whatever we may think of the quality of the Teacher Data Reports, without the other-subject scores, the results would be less stable than they are.
But a problem arises because some students will have a prior test in one subject but not the other. That is especially the case for English Language Learners. Students new to the country may take a math test the first year they arrive, but their English is so poor that they are waived altogether from the ELA exam.
So what’s a statistician to do when students have prior math scores for a math report, but not the other-subject ELA? Well, rather than simply dropping them from the calculations, the folks who do the calculating basically take a guess. Or, as the experts say, they “impute” a score. They take the available math score, combine it with a bunch of variables and then plug in an ELA score for a test the student never took.
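To make the idea concrete, here is a minimal sketch of imputation in its simplest possible form: fit a line between math and ELA scores for the students who took both tests, then use that line to invent an ELA score for a student who only took math. This is strictly illustrative; the DOE’s actual formula is far more elaborate, and all the scores and function names below are hypothetical.

```python
# Illustrative sketch only -- the DOE's real model uses many more
# variables. All scores here are made up.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

def impute_ela(prior_math, known_math, known_ela):
    """Guess a missing ELA score from a math score, calibrated on
    classmates who actually took both tests."""
    a, b = fit_line(known_math, known_ela)
    return a + b * prior_math

# Hypothetical prior-year scores for seven classmates who took both tests:
math_scores = [650, 660, 670, 680, 690, 700, 710]
ela_scores = [640, 648, 655, 665, 672, 681, 690]

# A student who took math but was exempted from ELA gets a guessed score:
imputed = impute_ela(675, math_scores, ela_scores)
```

Note what the sketch makes plain: the “imputed” ELA score is not a measurement at all. It is a prediction manufactured from other students’ test-taking patterns, which is exactly why it is so shaky for students unlike their classmates.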
Some statisticians say that imputing the missing score is fairer to teachers (though fairer than exactly what, I couldn’t say). But whatever benefit there may be to imputing missing other-subject scores, surely it is a problem in the case of Ms. Mauclair. Let’s go through this step by step:
- In 2009-10 Ms. Mauclair taught 15 students both ELA and math in a 6th-grade class. All of the students were English Language Learners.
- 15 is below the minimum number of 20 students for an ELA report in grade 6, and so Ms. Mauclair was not issued an ELA report. She still had enough students, however, for a math report, where the minimum is 10.
- Next: Four of Ms. Mauclair’s 15 students were so new to the country that they had not taken the math test the previous year. These four students, therefore, were dropped from the calculations for the math report.
- This left Ms. M. with 11 students, just one student more than the city’s minimum of 10 for a math report. The average 6th-grade math teacher was judged on 50 students for 2009-10, and Ms. Mauclair actually had fewer students than 99% of the teachers to whom she was compared. This matters because the fewer students you have, the less reliable the results. For exactly this reason, the DOE does not make Progress Report measures for groups of fewer than 15 — a minimum for Teacher Data Reports that they conveniently ignored.
- But the story does not end there. A review of the student level data for Ms. Mauclair shows that four of the remaining 11 students were so new to the country that with their very limited English skills, they were waived from the ELA test the year before. Remember: ELA is that crucial other test.
This means that only 7 of Ms. Mauclair’s 11 students had enough data to compute a real report.
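The winnowing in the steps above is simple arithmetic, and it is worth making explicit. The student counts come from the post; the rest is just subtraction:

```python
# Counts from the post; this just makes the arithmetic explicit.
class_size = 15      # ELL students Ms. Mauclair taught in both subjects
no_prior_math = 4    # new arrivals with no prior math test; dropped
no_prior_ela = 4     # exempt from the prior ELA test; scores imputed

rated_on = class_size - no_prior_math       # students in the math report
complete_data = rated_on - no_prior_ela     # students with both prior scores
imputed_share = no_prior_ela / rated_on     # fraction judged on guessed ELA

assert rated_on == 11 and complete_data == 7
print(f"{imputed_share:.0%} of the report rests on imputed scores")
```

That last figure, roughly 36%, is the “nearly 40%” discussed below.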
How did the other four wind up back in the calculation? If you’ve been paying attention here (and I know this is kind of in the weeds), then you know that the missing ELA grades were imputed. The DOE created an ELA score for each of the four students who never took the test and put that score into the mix.
Even if we were to agree that imputed scores are a good thing, how can that possibly work for a teacher who had only seven real scores to begin with — fewer than the city’s dead minimum for a report? Nearly 40% of the kids in a very small class were missing crucial information.
What’s more, creating an ELA score based on the math score has to be especially problematic for ELL students. For students proficient in English, there is a relationship between scores in ELA and scores in math. But for first-year ELL kids who cannot speak English? I can think of no reason to assume that the formula machine can know anything about how they would do on an ELA test based largely on how they did in math. And there is no reason to think that the ELA scores of their seven classmates, which were probably also factored in, would be much help. Anyone who has taught immigrants for ten minutes knows that. Not so the DOE.
And the DOE wanted a lot of reports. So the DOE cast as wide a net as possible. First it allowed for minimums of ten, and then it allowed for crucial scores to be imputed when those scores don’t exist.
I am not an expert in value-added. In fact, it took me a week to unravel Ms. Mauclair’s report. But had the DOE been motivated, they could have figured this out in an hour — as soon as they saw what the Post had done to Ms. Mauclair and others in the newspaper. But the reaction of the DOE and mayor’s office has been a resounding silence — no one there has stepped forward to defend Ms. Mauclair or any of the employees smeared in the city’s yellow press.
Me, I think they should step forward.
One place to begin — and it is just a beginning — would be to invalidate the report of Ms. Mauclair.
A note regarding the imputation of missing scores: it is never done for the most important score, the same-subject prior test. In other words, for a math report, one does not create a prior math score when there isn’t one. It is done only for the other subject.