[Editor’s note: This post was co-written by Rhonda Rosenberg and Tina Collins.]
In recent weeks, two new studies have come out on the impact of charter schools on math and reading test scores. One focuses on a small sample of students in a single school (Harlem Success Academy); the other on students at 22 KIPP schools around the country (including the original KIPP Academy; STAR College Preparatory; AMP Academy; and Infinity Charter in NYC). Both were paid for by the schools they studied, and both conclude that the charter schools were more successful at raising test scores than local public schools.
However, both studies have methodological problems that call some of their conclusions into question. Neither study fully accounts for the impact of charter schools’ lower proportions of English Language Learners and special education students, and both fail to distinguish between high-needs special education students and those with less severe learning challenges, or between students who receive free versus reduced-price lunches.
A table from the HSA report (reprinted in part here) shows the demographics of HSA students and of the third graders in the comparison elementary schools. There seem to be more dissimilarities than similarities between these schools. For example, HSA had no English Language Learners (ELLs) and only 15% special education students in its third grade, while in the comparison group 21% of students were ELLs and 21% received special education services.
Table 1: Demographic Dissimilarities between HSA and Comparison Schools
| Demographics | HSA 3rd Graders | Comparison 3rd Graders |
| --- | --- | --- |
| % English Language Learner | 0% | 21% |
| % Special Education | 15% | 21% |
As the UFT and others have shown in recent reports (Separate and Unequal; Special Education in Charters and District Schools), such differences in student demographics are a significant factor in shaping disparities between charter and district schools, especially in terms of how peers’ background and achievement affect individual students’ test scores. It is also troubling that the HSA researchers did not disclose the names of the comparison schools and allowed HSA a voice in choosing them. Perhaps there was selection bias in the choice of comparison schools. Or perhaps HSA is enrolling very different students, with fewer academic challenges, than those taught in neighboring district schools.
HSA also experienced a 24% drop from its original cohort of 97 students. Of the two reports, only the KIPP study even attempts to address the problem of high attrition in charter schools (another issue we’ve covered here), noting that students with lower test scores are more likely to leave both KIPP and district schools. However, as Professor Gary Miron has noted, its authors fail to acknowledge the impact of charters’ policy of not admitting new students to fill the places of those who leave (unlike public schools, which must admit new students into higher grades as spaces open up).
Given the possibility of selection bias in the HSA study, coupled with the fact that it examined a small sample of students from only one grade (and one grade does not make a school), the HSA report appears to have little value. More broadly, neither the HSA nor the KIPP report can answer the question of whether charter schools’ higher test scores are due to the types of students who attend (and leave) them, rather than to any innovative educational practices they may be implementing.