The evaluations of 44 of Washington's 4,000 public school teachers were compromised by errors, one of them made by an outside contractor, an email from a D.C. official to the city's teachers union reveals.
The union is slamming the school district for the mistakes and raising broader questions about the evaluation system itself.
D.C. Public Schools’ chief of human capital, Jason Kamras, wrote to Elizabeth Davis, president of the Washington Teachers’ Union, on Friday. In the email, which the school system provided to The Huffington Post, he told her of two errors in the evaluation of teachers for the 2012-13 school year.
First, Kamras wrote, the policy on appropriate weighting of administrator and master educator observations of teachers under the evaluation formula, known as IMPACT, was “not clearly communicated.” IMPACT scores can affect teachers’ bonuses and job security. So the District has recalculated all observation scores that might have been affected by that miscommunication, according to Kamras.
Second, he wrote, the outside contractor, Mathematica Policy Research, “found a small technical error” that affected some teachers’ Individual Value-Added scores. Those scores have been recalculated as well. Kamras assured Davis that teachers who would have had a lower IVA score as a result “will be held harmless.”
Teachers unions, both local and national, are calling attention to the errors, saying the mistakes highlight inherent problems with using such methods to sort and evaluate teachers.
“These errors make clear that this evaluation system is flawed,” Davis said in a statement late Monday. “Teachers, parents and students deserve full transparency and accountability.” Davis also wrote to D.C. Public Schools Chancellor Kaya Henderson seeking more details on the extent of the errors.
Randi Weingarten, president of the American Federation of Teachers, chimed in with her own condemnation of the system. “There’s something very troubling when the district continues to reduce everything about students, educators and schools to a nameless, faceless algorithm and test score,” Weingarten wrote. “You can’t simply take a bunch of data, apply an algorithm, and use whatever pops out of a black box to judge teachers, students and our schools. And now, we have the disclosure that even the number was miscalculated, affecting dozens, if not hundreds, of educators. Our children deserve better.”
Michelle Rhee, then D.C. schools chancellor, instituted IMPACT in 2009, making it one of the first teacher evaluation systems in the country to give significant weight to students' test scores. The system uses "value-added measurement," a complex algorithm that aims to strip out the statistical effects of factors such as students' socioeconomic status in order to isolate how much teachers truly affect their students' test scores. During its first year, IMPACT counted value-added measurement for a full 50 percent of a teacher's evaluation; that share has since been reduced to 35 percent. IMPACT also factors in observations of teachers by administrators and master educators.
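The weighting described above can be sketched as a simple composite score, purely as an illustration. The function name, the score scale, and the assumption that all remaining weight goes to classroom observations are hypothetical simplifications, not DCPS's published IMPACT formula:

```python
def composite_score(value_added, observation_avg, va_weight=0.35):
    """Illustrative composite: the value-added measure counts for
    va_weight of the total (35 percent under current policy, 50
    percent in IMPACT's first year), with the remainder here
    assigned to classroom observations. This is a simplification
    for illustration, not the actual IMPACT rubric."""
    return va_weight * value_added + (1 - va_weight) * observation_avg

# A hypothetical teacher with a 3.0 value-added score and a 3.5
# observation average, on an assumed 4-point scale:
score = composite_score(3.0, 3.5)  # 0.35 * 3.0 + 0.65 * 3.5 = 3.325
```

Under the original 50/50 split, the same hypothetical teacher's composite would instead be `composite_score(3.0, 3.5, va_weight=0.50)`, showing how much the choice of weight alone can move a final score.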
When Rhee first announced that D.C. would use IMPACT, the decision was controversial: Many statisticians question the reliability of value-added metrics, and using standardized tests to separate good teachers from bad has long drawn vocal opposition from skeptics, including teachers unions. But as Rhee's signature reform, IMPACT has taken on broader symbolic weight for the so-called education reform movement, which has sought to replicate the program across the country.
A recent study of IMPACT found that it did help D.C. Public Schools retain successful teachers while shedding low performers. Advocates for D.C. Public Schools have also cited the city's higher national test scores as evidence of the program's success. But sociologist Matthew Di Carlo of the Albert Shanker Institute, a think tank affiliated with the American Federation of Teachers, warned that the study's conclusions shouldn't be read as an "overall assessment of IMPACT," because they pertained only to certain groups of D.C. teachers.
See Kamras’ full letter below:
I hope this message finds you well. I am writing to communicate two important updates regarding last year’s final IMPACT scores for teachers.
First, during the 2012-2013 school year, IMPACT policy was to calculate final TLF scores using a weighted average wherein administrator observations counted for 60% and master educator observations counted for 40%. Though this was the policy, it has been brought to my attention by several employees and by our legal counsel that the policy was not clearly communicated. Given our commitment to transparency, all final TLF scores that were lower because of the weighted average policy have been recalculated using a straight average. Later today, the IMPACT team will issue 2012-2013 revised reports for all teachers with higher final TLF scores as a result of the recalculation. For the 2013-2014 school year, all final TLF scores will be calculated using a straight average.
Second, our external partner Mathematica Policy Research recently found a small technical error that affected some teachers’ 2012-2013 Individual Value-Added (IVA) scores. Mathematica corrected the error and recalculated the value-added results. Later today, the IMPACT team will issue 2012-2013 revised reports for all teachers with higher IVA scores as a result of the recalculation. Teachers who would have had a lower IVA score as a result of the recalculation will be held harmless and will not be informed of the IVA recalculation.
If you or any of your teachers have any questions, please contact the IMPACT team …
As always, we thank you for your collaboration and partnership!
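The two averaging policies described in Kamras' letter can be sketched as follows. The 60/40 weights come from the letter itself; the function names, the score values, and the sample observation counts are hypothetical, and the held-harmless logic is an inference from the letter's statement that only scores lowered by the weighted-average policy were recalculated:

```python
def weighted_tlf(admin_scores, me_scores, admin_weight=0.60):
    """The 2012-2013 policy described in the letter: administrator
    observations count for 60%, master educator observations for 40%."""
    admin_avg = sum(admin_scores) / len(admin_scores)
    me_avg = sum(me_scores) / len(me_scores)
    return admin_weight * admin_avg + (1 - admin_weight) * me_avg

def straight_tlf(admin_scores, me_scores):
    """The revised policy: a straight average over all observations,
    regardless of who conducted them."""
    all_scores = list(admin_scores) + list(me_scores)
    return sum(all_scores) / len(all_scores)

# A hypothetical teacher with two administrator observations and one
# master educator observation (scores on an assumed 4-point scale):
admin, me = [3.0, 3.2], [3.8]
old = weighted_tlf(admin, me)   # 0.6 * 3.1 + 0.4 * 3.8 = 3.38
new = straight_tlf(admin, me)   # (3.0 + 3.2 + 3.8) / 3 ≈ 3.33
# Per the letter, scores were recalculated only where the straight
# average comes out higher; otherwise the teacher is held harmless:
final = max(old, new)
```

Note that the two methods diverge whenever a teacher has unequal numbers of administrator and master educator observations, or unequal averages between the two groups, which is presumably why the unclearly communicated weighting mattered.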