Teacher Data Reports — 4000 Unreliable, 100% Wrong

The Daily News reported yesterday that fully 1/3 of the Teacher Data Reports — 4000 reports — are unreliable.

And just to add a little context:

  1. They all have multiple years of data, which are supposed to be more reliable.
  2. Hundreds and hundreds have margins of error of less than 10 percentage points — five on either side — giving the public, parents, and teachers assurances that these reports are quite correct.
  3. In fact, the DOE was so confident in its findings that 46 of these reports had no margins of error at all!

That’s 4000 reports. And since teachers are ranked against each other, that means all of the reports are unreliable.

Or, let’s just get right to it: “unreliable” is a euphemism for wrong.

The DOE knew the reports were wrong and assured the Daily News that it “shared this lesson with our schools and with the state,” but we know that’s a lie, unless by “schools” you mean the bricks and mortar, rather than the thousands of teachers who were victims, the principals who worked with them, or even the parents who would eventually see the ratings in the local news. So far as we can tell from the article, in fact, the DOE’s grand transparency didn’t extend beyond cluing in the superintendents about a handful of teachers who were up for tenure last spring.

And I’m guessing (just a guess) they didn’t do that in an email blast. I’m guessing (just a guess) it was by phone.

In fact, the DOE has been engaged in one big cover-up all along on these reports. Just look at Edwize. In May of 2011, I posted that teachers who worked with high-performing math students were 40 times more likely to rank in the bottom 5% of all teachers than to rank in the top. Since the teachers should fall along something of a curve, the lopsided results meant the reports were biased.
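The statistical point in that post can be sketched in a few lines. If the scoring system were unbiased, a teacher whose score lands in one of the two 5% tails should be roughly equally likely to land in either tail; a 40:1 skew toward the bottom would then be astronomically unlikely to occur by chance. The counts below are hypothetical, chosen only to match the reported 40:1 ratio, not the actual numbers from the reports.

```python
# Back-of-the-envelope check: how likely is a 40:1 bottom-to-top split
# if each tail-landing were a fair coin flip (p = 0.5)?
# The counts are hypothetical, picked only to illustrate the 40:1 ratio.
from math import comb

bottom, top = 80, 2            # hypothetical tail counts, ratio 40:1
n = bottom + top

# P(at least `bottom` of n tail-landings fall in the bottom tail | p = 0.5)
p_value = sum(comb(n, k) for k in range(bottom, n + 1)) / 2 ** n
print(f"{p_value:.2e}")       # on the order of 1e-21 — effectively impossible by chance
```

A result that small is exactly why a lopsided split is evidence of bias rather than noise: under the no-bias assumption, you simply would not see it.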

Less than a week later, DOE suddenly added several charts to its online FAQ. The charts were designed to show that there was “no systematic relationship between” teacher scores and student achievement. It was a claim that was misleading at best, especially when one considers that even as they were making it, they were probably making phone calls to those superintendents. The new information also referred to “sensitivity” in some reports. There was no way, however, to connect that “sensitivity” to real reports, and no way to gauge the extent of the disaster.

In any case, the claims were basically bogus and I pointed that out in a second post.

Then, at some point between May and the release of the reports in February, DOE pulled all that documentation out of the FAQ.

Meanwhile, days after the release of the reports, Bloomberg — who either had DOE people lying to him or else surely knew that the reports were junk — told the public, “Parents have a right to know every bit of information that we can possibly collect about the teacher that’s in front of their kids.”

Lied to or lying: either way, no way to run the schools.


1 Comment:

  • nuff said
    · Mar 20, 2012 at 1:14 pm

    Now that the cat is out of the bag, someone needs to pull the court transcripts to see if the DOE perjured itself claiming the data was valid with an acceptable MOE. I can’t imagine a judge okaying this knowing the MOE was as high as 87%.