'LA Times' Teacher Ratings Database Stirs Debate


Any day now, 6,000 elementary teachers in Los Angeles will see their names published online, along with data showing how much their students improved on standardized tests.

The Los Angeles Times has promised to release the information to help parents measure teacher effectiveness. The database has sparked a national debate on how to evaluate teachers.

The Times stories use some fancy number-crunching to compare the effectiveness of teachers in the nation's second-largest school system. The analysis has set off an explosive controversy in the city -- some argue it's about time parents had an objective measure to compare their kids' teachers, while others say it is just a way to humiliate educators.

A.J. Duffy, president of the United Teachers of Los Angeles (UTLA), says he is "outraged that The Times would put this out and put people in harm's way."

So far, the newspaper has published analyses only of individual teachers and schools. The union has tried to get out in front of the story by publishing critiques of the method used to compare the scores. It's known as the "value-added method" because it measures how much students' test scores rise or fall while in a particular teacher's class, compared with how much they would have been expected to change. Duffy says that "at this point, value-added is so flawed that I could not agree that it could be a useful tool."
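For readers curious what a value-added comparison looks like in practice, here is a deliberately minimal sketch. The teachers, scores, and numbers are invented, and the Times' actual model controls for many more variables than this; the sketch only shows the core idea of comparing score gains to an expected gain:

```python
# Minimal, illustrative value-added sketch. The records below are
# invented; a real model would adjust for many student-level factors.
from statistics import mean

# Hypothetical records: (teacher, score last year, score this year)
records = [
    ("Ms. A", 500, 540),
    ("Ms. A", 480, 510),
    ("Mr. B", 500, 505),
    ("Mr. B", 520, 530),
]

# Each student's raw year-over-year gain.
gains = [(teacher, after - before) for teacher, before, after in records]

# A crude "expected" gain: the average gain across all students.
expected = mean(g for _, g in gains)

# A teacher's value-added score: how much their students' average
# gain exceeds (or falls short of) that expectation.
teachers = {t for t, _ in gains}
value_added = {
    t: mean(g for tt, g in gains if tt == t) - expected
    for t in teachers
}
print(value_added)
```

In this toy data, Ms. A's students gain more than average and Mr. B's gain less, so one teacher's score is positive and the other's negative by the same amount. The researchers quoted below are pointing out that, in real data, such differences can reflect factors other than the teacher.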

The UTLA has received some support from researchers who agree that the value-added approach is open to error, and that those mistakes could lead parents and administrators to the wrong conclusions.

A number of papers over the past year have cautioned that there are just too many variables in student performance to rely heavily on student test scores.

Tim Sass of Florida State University says there are dozens of factors outside a teacher's control that can limit improvement in test scores. For example, Sass says, what if "the principal always gives the weaker students to the new teachers, let's say, and always favors the more senior teachers at their school?"

The L.A. Times analysis carefully tries to correct for that, and for many other variables. And, in fact, value-added analysis is a well-accepted approach that has been used for decades in education research, though not for the purpose of publicly comparing individual teachers.

William Sanders, one of the fathers of the value-added system, now works for a research company called SAS. He says that value-added analysis can accurately single out both star performers and ineffective teachers. But, Sanders cautions, "can you distinguish within the middle? No you can't, not even with the most distinguished, value-added process that you can bring to the problem."

And Sanders worries that parents may come to the wrong conclusions about those middle-performers.

The Los Angeles Unified School District admits it has been sitting on this data and hasn't used it to help teachers improve. Now, Deputy Superintendent John Deasy says, the real tragedy would be if all this information were ignored. "You would never evaluate a teacher only using this metric. On the other hand, you wouldn't evaluate a teacher completely [without] considering how students are doing over time in achievement."

Deasy says the controversy is now spurring the district to develop a new teacher evaluation system that uses value-added data.

Union President A.J. Duffy says: slow down. Referring to the district's statement that it is moving forward to incorporate value-added data, Duffy says, "They can't. It has to be negotiated."

The two sides have agreed to sit down and talk.

Copyright 2010 National Public Radio. To see more, visit http://www.npr.org/.
