Meet John Smith, who is apparently among the least effective elementary school teachers in all of L.A., according to this weekend's story in the Los Angeles Times about variations in students' test scores based on which teacher they had.
The Times used an approach called "value-added analysis," which ranked 6,000 teachers based on how their students performed on standardized tests. The analysis found huge disparities among teachers, often between those who work in close proximity to one another. And over the next few months, the Times plans to publish a searchable database analyzing the effectiveness of individual teachers.
The findings are worth taking a look at. Among them:
Contrary to popular belief, the best teachers were not concentrated in schools in the most affluent neighborhoods, nor were the weakest instructors bunched in poor areas. Rather, these teachers were scattered throughout the district. The quality of instruction typically varied far more within a school than between schools.
Although many parents fixate on picking the right school for their child, it matters far more which teacher the child gets. Teachers had three times as much influence on students' academic development as the schools they attended. Yet parents have no access to objective information about individual instructors, and they often have little say in which teacher their child gets.
Since the story was published, the Los Angeles Times reports, hundreds of comments have flooded in, including from hundreds of teachers wanting to know their scores.
In response, the 40,000-member L.A. teachers union organized a boycott, urging its members and other labor groups to cancel their newspaper subscriptions. "You're leading people in a dangerous direction, making it seem like you can judge the quality of a teacher by … a test," said A.J. Duffy, president of United Teachers Los Angeles.
Do you think this test-based approach should be used to evaluate teachers? And perhaps more importantly, should such rankings be made public?