Over at GothamSchools, Kim Gittleson sheds further light on the charter cohort attrition about which I recently posted. In that post I showed how students disappear at alarming rates from the testing cohorts of middle school charters. As students leave, the cohorts post ever rising passing percentages on state exams.
As I said then, I did not know whether the cohorts were shrinking because the students had been left back a grade or because they had left the school altogether. I also didn’t know the achievement levels of the students who were disappearing. Obviously, if failing students leave and passing students stay, then the passing percentage goes up but not necessarily student performance.
Gittleson helps us out. She looks at BEDS state data, which seems to show that some of these vanishing students were actually left back. Without more information, one cannot be sure that these left-back students actually stayed in the school. Whether they did or not, however, the sharply rising passing rates in the cohorts seem likely to be influenced by the removal of students with low scores. [UPDATED] This matters. Trip Gabriel lauded Williamsburg Collegiate in a recent New York Times article but never noticed that as students moved up in grades, large numbers disappeared. Bloomberg called Harlem Village (32% attrition in its oldest test cohort) the poster child for New York City charters. Arne Duncan showcased his love of charter schools by visiting Kings Collegiate, which experienced an astronomical jump in its passing rate, and an astronomical decline in the number of students taking the test. Meanwhile, gains in passing rates drive charter reauthorization and determine AYP under NCLB. Leaving students back may be an excellent idea. But what can we say about rising passing rates if failing students are indeed being removed from the cohort?
**ELA Cohort Attrition, Grade 5 to 8 (2006-2009)**

| School | Cohort Size | Attrition | Change in % Passing |
|---|---|---|---|
| Williamsburg Collegiate | 72 to 44 | -39% | +35 |
| Harlem Village Academy | 57 to 39 | -32% | +43 |
| Leadership Village | 59 to 41 | -31% | +24 |
| KIPP STAR | 82 to 61 | -26% | +27 |
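For readers who want to verify the attrition figures, the percentages follow directly from the cohort sizes reported above: each is simply the percent change from the grade 5 count to the grade 8 count. A quick check (the numbers are the ones from the table; nothing else is assumed):

```python
# Sanity check: attrition percentage = percent change from the
# grade-5 cohort size to the grade-8 cohort size, rounded.
cohorts = {
    "Williamsburg Collegiate": (72, 44),
    "Harlem Village Academy": (57, 39),
    "Leadership Village": (59, 41),
    "KIPP STAR": (82, 61),
}

for school, (start, end) in cohorts.items():
    pct = round((end - start) / start * 100)
    print(f"{school}: {start} -> {end} ({pct}%)")
```

Each computed value matches the attrition column in the table.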
A few cautions. First, anyone looking closely at our two posts will find it very hard to make direct comparisons between Gittleson’s findings and my own. I have not carefully reviewed Gittleson’s data, but it looks as if in many instances we are looking at different cohorts and different time spans. What is more, we are working with very different data sources, reported in very different ways and at different times of the year.
I also think we need more information before we can be sure that students who are reported as left back actually stayed in the school. In many cases the cohorts into which those students would be shifted are themselves shrinking. For example, Williamsburg Collegiate loses 12% of its 6th grade students (11 students) between 2008 and 2009. Were those students retained? Maybe. But the cohort those students would join (the upcoming 6th grade) does not get larger. Rather, it shrinks by 15% (13 students) from the previous year. If that cohort actually absorbed all 11 students from the previous cohort, then did the upcoming class actually lose 24 students in a single year, a whopping third of the class? If students from shrinking cohorts are being retained in cohorts that are themselves shrinking, then students are leaving the younger cohorts at even greater rates than I previously suspected.
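The arithmetic behind that question is worth making explicit: a shrinking cohort that also absorbs retained students from the cohort above it must have lost more of its own students than its net change shows. A minimal sketch using the numbers above:

```python
# If a cohort shrinks by 13 students net, yet absorbed 11 retained
# students from the cohort above, its own actual departures are larger
# than the net shrinkage suggests.
net_shrinkage = 13   # upcoming 6th grade: 13 fewer students than last year
absorbed = 11        # students possibly retained from the older cohort
actual_departures = net_shrinkage + absorbed
print(actual_departures)  # 24 students would have left the upcoming class
```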
Finally, Gittleson posits that perhaps students come into cohorts from waiting lists (in other words, that as students leave, the charters take in new students). If I am reading Gittleson’s example correctly, however, she is looking at Williamsburg’s second group of students, while the older cohort actually loses some students. If Gittleson’s highlighted cohort is showing a “gain” of one student, it seems quite possible that the student came from the older cohort, not from a waiting list.
But that is not a criticism of Gittleson’s work. We are both feeling our way through this, and both seeking answers. It is quite possible that her conclusions are entirely correct. We won’t know until transparency becomes policy, which we hope will happen with the new state law.
Though this post is long, I want to say one other thing. None of what I’ve said here is necessarily a criticism of these charter schools. These may be fine schools. And if they implement a policy of strict retention, and students actually stay, and the end result is a better education, then perhaps we ought to be emulating this in our regular public schools.
But, first, we very much need to know exactly what the numbers in these schools mean, and what they don’t mean. If passing rates are significantly impacted by the removal of struggling students we need to know it, and learn from it, to make all schools — including charters — the very best they can be.
UPDATE/CORRECTION: I just spoke with Kim Gittleson and we went over a few numbers. Note that above I said “Williamsburg Collegiate loses 12% of its 6th grade students (11 students) between 2008 and 2009.” The 12% should be 11%, and the number of students should have been 7, not 11. This means that if the upcoming cohort actually absorbed all 7 students from the previous cohort, then the upcoming class actually lost (through leaving or being left back) 20 students, not 24 as I previously stated. That would be over one quarter of its kids.
Kim sheds further light on these numbers. Her records indicate that one of the 7 sixth-graders was retained. Also, Kim’s BEDS files indicate that 11 students in the upcoming 5th grade class were left back (the total student loss in that testing cohort was 13). These numbers make sense. In the case of the eleven retained, perhaps we now have at least a partial explanation for the jump in passing rates there (75% to 96%), since we know that 11 students who struggled in the 5th grade class were not in the 6th grade cohort. As for the 7 who left the older cohort, we now know that they were not placed into the shrinking cohort below them. What seems most likely is that they left the school.