Larry Mantle

Value Added Assessments and LAUSD

The Los Angeles Times has started a multi-part series on L. A. Unified, focusing on the divergent outcomes individual teachers get when working with the same pool of students.  The numbers are fascinating and in sharp contrast to what many educational observers claim.

One of the issues raised is how years of education and specialized training, coupled with classroom experience, affect student performance on standardized tests.  The Times analysis showed little correlation between those factors, which are touted as directly affecting the quality of teaching, and how well the kids did.

Monday morning on AirTalk, we spent an hour talking with Jason Felch, co-author of the Times series, and UTLA President A. J. Duffy, as well as listeners.  Duffy, and several listeners, argued that using data this way would only push instruction further toward teaching to the test, at the expense of holistic teaching.  That's an understandable concern.

Duffy also argued there were serious shortcomings in how the numbers were analyzed, even looking exclusively at test scores.  However, I didn't follow his argument for how the Value Added Assessment was deficient in that way.  I'm looking forward to reading more about it and better understanding such critiques.

If I were a teacher, I think I would share Duffy's concerns about placing even more emphasis on teaching to tests, but I'd also want to know whether my students, on average, were gaining or losing ground relative to their past and subsequent performance.  Looking at only one student wouldn't tell you much, but averaged across a class over time, a teacher could learn a great deal.

I found it interesting that a number of teachers commented on our AirTalk page that the ups and downs of individual students, and the differences in their social and economic circumstances, make it impossible to use test results to measure how well teachers are helping students perform on those tests.  However, it appears that Value Added is designed to account for exactly that student-to-student variation, by focusing on the arc of each student's academic career and what happens to that arc during a specific teacher's year with the student.
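To make the idea concrete, here is a deliberately oversimplified sketch in Python — my own illustration, not the model the Times actually used.  In this toy version, each student's expected score is simply their prior-year score, and a teacher's "value added" is the class-average gain over that expectation; real models fit the prediction from data and adjust for many more factors.

```python
def expected_from_prior(prior_score):
    """Hypothetical prediction rule: assume a student roughly repeats
    their prior-year result.  (Real value-added models estimate this
    prediction statistically rather than assuming it.)"""
    return prior_score

def value_added(students):
    """students: list of (prior_score, actual_score) pairs for one
    teacher's class.  Returns the average of actual minus expected,
    i.e. how much the class moved relative to its own past arc."""
    gains = [actual - expected_from_prior(prior)
             for prior, actual in students]
    return sum(gains) / len(gains)

# Three students: one well above expectation, one slightly below, one above.
classroom = [(60, 68), (75, 74), (50, 59)]
print(round(value_added(classroom), 1))  # prints 5.3
```

The point of the averaging step is the one made above: any single student's swing could be noise or circumstance, but a persistent class-wide gain (or loss) relative to each student's own trajectory is harder to explain away.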

What do you think?  Is this a valid way to measure how well students are mastering tested subject matter?  We can leave the issue of whether we teach too much toward tests for another day.

I hope you're enjoying my weekly video roundup of our top stories.  My goal is to give you a sense of how we put the program together each day, and what listeners are saying about the hottest issues we cover.