Sunlight is said to be the best of disinfectants. But Justice Brandeis' famous dictum in favor of disclosure may have a limit: the public release of teacher performance data.
A new study by Peter Bergman of Teachers College, Columbia University, and Matthew J. Hill of RAND makes a cogent case for caution, detailing what happened to teachers and students after the Los Angeles Times published teacher value-added measure (VAM) data in 2010.
The researchers cleverly exploited the Times' decision to publish data only for teachers who had taught at least 60 students, using that cutoff to judge the impact of having one's performance data published for the entire world to see.
The idea is straightforward: create two groups of teachers, those slightly above the 60-student threshold (whose VAM scores were published) and those slightly below it (whose scores were not), and compare what happened to them after the Times' 2010 publication. Teachers in the two groups should not otherwise differ (and the authors ran checks to confirm this is so), so any divergence in outcomes can be attributed to publication itself.
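To make the logic of that comparison concrete, here is a minimal sketch in Python of a threshold-based comparison of this kind. The data, column names (`n_students`, `post_score`), and the bandwidth around the cutoff are all hypothetical illustrations, not values from the study.

```python
import pandas as pd

# Hypothetical teacher-level data: number of students taught (the basis for
# the Times' publication rule) and some post-publication outcome measure.
teachers = pd.DataFrame({
    "n_students": [52, 55, 58, 59, 61, 63, 66, 70],
    "post_score": [0.12, 0.08, 0.15, 0.10, 0.05, 0.02, 0.04, 0.01],
})

THRESHOLD = 60   # scores published only for teachers with at least 60 students
BANDWIDTH = 10   # compare teachers within 10 students of the cutoff (assumed)

# Keep only teachers near the threshold, where the two groups should be
# comparable except for whether their scores were published.
near = teachers[(teachers.n_students - THRESHOLD).abs() <= BANDWIDTH]

published = near[near.n_students >= THRESHOLD]
unpublished = near[near.n_students < THRESHOLD]

# The quantity of interest: the difference in average outcomes across the cutoff.
effect = published.post_score.mean() - unpublished.post_score.mean()
print(f"Estimated effect of publication near the cutoff: {effect:.3f}")
```

The study itself applies more careful statistical machinery than this simple difference in means, but the underlying comparison is the same: teachers just above versus just below the publication threshold.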
The result? High-VAM teachers whose results were published were more likely to be assigned students who do better on state tests, likely because parents pressured schools for such assignments or principals proactively tried to head off complaints from those parents.
Further, on average, the test scores produced by published teachers were no better or worse than those produced by unpublished teachers. But this average obscures a significant difference: after publication, the highest-VAM teachers did worse while the lowest-VAM teachers improved.
Recently, the Washington Teachers' Union, which represents District of Columbia teachers, urged the DC City Council not to restrict public access to teacher evaluations. The union should review the cautionary lessons of this study.