Last week, the DOE announced the final list of schools it wants to close, and attached to it came the usual press release designed to justify the continued implementation of a failed policy. The release was so clearly misleading that very few people in New York City would believe a single thing it says. But since the folks at Tweed have ambitions bigger than the five boroughs can contain, and because the rest of the country might actually believe them, a few corrections are in order. So here you go:
DOE says: “Un-screened high schools opened since 2002 continue to earn higher grades and have better graduation outcomes than un-screened high schools opened before 2002.”
First of all, the new schools are not unscreened. Under Bloomberg, the DOE instituted what it calls a "limited screen" policy, and that policy does not work well for many at-risk kids. Limited screening gives first preference to students who have actively made themselves "known to the school." After that, the preference goes to any student in the borough rather than to kids from the neighborhood. To be known to the school, students must attend an admissions fair or put themselves forward by some other means. Like lottery admission systems, limited screening tends to bias in favor of families that are engaged in the process. In the case of limited screening, though, that bias is exacerbated by the fact that the screening is embedded in a complex process that students must navigate, choosing 12 schools and ranking them. For a better understanding, read Darwin or, more simply, see two New York Times articles, here and here.
The schools are screened in another way as well. Historically, the DOE's high school admissions catalogue shows that the new schools did not offer the small, special classes required for self-contained special education students, the group that struggles most in school and is most likely to influence school outcomes (see here and here). By excluding the classes from their admission pages, the DOE effectively excluded the students. This year, under pressure, the DOE changed its wording, and we shall wait to see how that goes.
In any case, the new schools are not “unscreened.”
As far as "better outcomes" go, once the new schools reach high concentrations of high-need students, their so-called failure rate is the same as that of older schools, and that includes the low graduation rates. My post from a few weeks ago is about exactly that. New schools also have lower rates of college readiness than older schools.
DOE says: “Graduation rates at new schools are higher than the schools they replaced.”
Here the DOE is comparing each new school to the old school that used to occupy the same building. Graduation rates might be higher, but the populations are not the same. Here, for example, is a chart showing Columbus, which is closing, and its replacement schools. As the population of self-contained students rises, the school's grade on the Progress Report declines.
Graduation rates parallel the Progress Report grades, rising as the percentage of self-contained students decreases. Global Enterprise, the new school with the higher self-contained population and a C, is also closing.
And here is JFK (another closing school) and its replacements:
I was able to chart Columbus and JFK because these schools are still phasing out, so the data is available. In its press release, however, the DOE chose a much earlier example, where public data is not available. Let's look at that.
DOE says: “… the new schools we have opened on the Van Arsdale campus in Brooklyn had a graduation rate of nearly 83% in 2010 – 38 percentage points higher than the 2002 graduation rate at Van Arsdale High School. The new schools on the Van Arsdale campus are achieving these results with a similar population of students that were served by the school they replaced.”
These sound like wonderful schools on the Van Arsdale campus, and what anecdotal evidence I can gather suggests the same. But wonderful or otherwise, the replacement schools are not similar to the school they replaced.
For starters, like all new schools, Van Arsdale's replacements had limited-screened admissions that gave preference to students who had actively engaged in the admission process. And remember those admission catalogues I mentioned, where the policies effectively excluded students who needed self-contained classes? Well, all three of the schools that replaced Van Arsdale have followed that policy and are serving no self-contained special education students. They are also serving the lowest percentage of overage students in the Van Arsdale neighborhood. In fact, all three schools show up in the top half of the DOE's need ("peer") index. In other words, these three schools have lower needs than at least half of the schools in the city (they are at the 75th, 58th, and 57th percentiles). Virtually all NYC high schools serve high-need students, but depending on which of these three schools you mean, between 250 and 300 schools have higher needs.
So how does that stack up against the now-closed Van Arsdale? The DOE selected an example where direct records on the student population are not publicly available, but there are some things we know. When the DOE moved to shut it down in 2002, Van Arsdale was a neighborhood school, serving neighborhood kids in Williamsburg, Brooklyn, well before gentrification. Soy lattes were hard to come by, and living sustainably meant a roof over your head and regular electricity, not a visit to Whole Foods and organic cotton in the children's hats. It strains credulity to believe that Van Arsdale was serving the same population as the new schools. In fact, the closing schools are virtually always notable for having the very highest concentrations of high-need students in their neighborhoods, and often citywide. And that would include large numbers of self-contained students.
Old schools and the new schools that replace them are not the same. For a more thorough discussion of the population differences between old and new schools, see this 2010 report by Jennifer Jennings and Aaron Pallas.
Finally, one more claim by the DOE.
DOE says: “schools earned a higher percentage of A’s and had a higher average percentile rank than non-charters, led by CMO-affiliated schools and charter middle schools.”
This is highly misleading and true only in a technical sense. When the DOE made this claim a few months ago, I posted my response about the middle schools, which you can read here. Most of the charter middle schools that got an overall A received that A even though they did not earn an A in any academic achievement category.
In some ways, the DOE's press releases have become nearly laughable. Surely they must know, as the rest of NYC knows, that what they are saying isn't really true, and that they owe all of us a more balanced representation of how our schools are faring on Bloomberg's watch. But of course, it is not really laughable. The DOE is a public institution, not a corporation hawking a product during prime time. It has a responsibility to the public, and it would be a very good thing for the schools if the DOE reported factually and came clean with the far more nuanced story it surely knows.