
Friday, September 3, 2010

Robert Pondiscio on curriculum vs value-added

When I think of the curriculum and teaching methods I was required to use in my classroom, the idea that my effectiveness might be dependent upon them makes me want to lie down with a damp washcloth on my forehead. Manipulatives and discovery instead of basic arithmetic? Endlessly revising "small moments" and teaching the writing process to 10-year-olds instead of basic grammar? No time for even basic science or social studies because of district demands for ever-larger math and literacy blocks? If it fails, it's on me? Seriously?
Erin Johnson left a comment:
Robert, why do you think that the LA Teachers Union (or the national unions) has not highlighted the issue of curricula?

I have recently been in contact with an LA teacher who was rated "more effective" in math by the LA Times. She states that her good rating was probably due to the fact that she "subversively" uses Saxon math instead of her district-adopted program. Do ed reformers expect that teachers will subvert the curriculum adoption process?

And here is Robert again:
I'm not sure curriculum reform is on anyone's radar screen in a big way, including the unions. I used to regularly subvert…er…adapt my math curriculum to ensure automaticity on basic operations. 5th graders counting on their fingers or multiplying with arrays is an offense to my sensibilities. I had less flexibility on ELA, since there was lots of joint planning and execution involved. I'd go as far as saying my school's ELA program ("It's not a curriculum, Mr. Pondiscio, it's a philosophy," I can still hear the staff developer reminding me) is what turned me into a curriculum advocate.
Curriculum effects and value-added

4 comments:

dan dempsey said...

Let us start applying these value-added methods to central administration decisions. The first candidate should be Seattle's May 2009 "Discovering Math" adoption.

Check it out HERE.

concerned said...

Curriculum materials are addressed in the recent Economic Policy Institute briefing paper, "Problems with the Use of Student Test Scores to Evaluate Teachers":

"These factors also include school conditions—such as the quality of curriculum materials, specialist or tutoring supports, class size, and other factors that affect learning."

Anonymous said...

I don't understand.


The regression done by RAND included a variable for curricula, as it did for several other factors, including the nonrandom assignment of students to teachers. The regression showed that these effects were accounted for and were minimal; what remained after controlling for them was still statistically significant.

So this isn't a problem if what you're measuring isn't the test scores themselves but the movement of students relative to their own prior scores, as opposed to measuring students against each other.

Now, why would this be so? Why would curricula have so little effect? Because the curriculum in LAUSD is district-wide. There is no variation in curriculum across teachers, unless teachers are deviating on the sly. Everyday Math is for everyone. (See the sketch after this comment.)

So how is this a variable that creates a problem with the use of test scores to evaluate teachers?

I don't understand. In fact, in every single critique I've read of value-added measurement, it seems no one has read what RAND did.
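A minimal sketch (in Python, with invented numbers; the variable names are illustrative, not RAND's or the Times') of why a truly district-wide curriculum cannot show up as a separate effect in such a regression: a predictor that takes the same value for every student is perfectly collinear with the intercept, so no coefficient for it is identifiable.

import numpy as np

rng = np.random.default_rng(0)
n = 1000                                # students (made-up data)

prior = rng.normal(50, 10, n)           # prior-year test score
teacher = rng.integers(0, 20, n)        # which of 20 teachers each student has
curriculum = np.ones(n)                 # district-wide: identical for everyone

# VAM-style design matrix: intercept, prior score, teacher dummies, "curriculum"
teacher_dummies = np.eye(20)[teacher][:, 1:]    # drop one teacher as the baseline
X = np.column_stack([np.ones(n), prior, teacher_dummies, curriculum])

# The constant curriculum column duplicates the intercept, so X loses rank
# and no separate curriculum coefficient can be estimated:
print(np.linalg.matrix_rank(X), "of", X.shape[1], "columns are independent")

Any effect of the shared curriculum is simply absorbed into the intercept: with one curriculum for everyone, there is nothing for the model to attribute to it.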

Erin Johnson said...

Allison, in looking at the technical summary for the LA Times VAM, I see no variable for curricular differences, so I am not sure why anyone would assert that curriculum was controlled for. http://www.latimes.com/media/acrobat/2010-08/55538493.pdf

There have been several studies demonstrating that curricula do have a significant effect on student learning. So given that different teachers may be using different materials in the classroom (and who knows whether they do, since to my knowledge this has never been documented), how would the LA Times VAM ever differentiate between "teacher effects" and "curricular effects"? These two effects are most likely confounded in the teacher VAM study.
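This confounding can be illustrated with a small simulation (Python; all numbers are invented, and "Saxon" is just a label here, not data from the study): if each teacher uses exactly one set of materials and the model never observes which, the estimated "teacher effect" is really the teacher effect plus the curriculum effect.

import numpy as np

rng = np.random.default_rng(1)
n_teachers, n_students = 100, 60        # ~60 observations per teacher, as in the Times data

true_teacher = rng.normal(0, 1, n_teachers)     # assumed true teacher effects
uses_saxon = rng.random(n_teachers) < 0.3       # hypothetical: 30% quietly use other materials
curriculum_effect = 2.0 * uses_saxon            # assumed boost from those materials

# Each teacher's VAM estimate: mean adjusted gain of their students
estimates = np.array([
    (true_teacher[t] + curriculum_effect[t] + rng.normal(0, 3, n_students)).mean()
    for t in range(n_teachers)
])

# The estimate tracks teacher + curriculum combined, not the teacher alone:
print(np.corrcoef(estimates, true_teacher + curriculum_effect)[0, 1])  # high
print(np.corrcoef(estimates, true_teacher)[0, 1])                      # noticeably lower

Nothing in the data distinguishes a strong teacher using the district program from an average teacher quietly using better materials.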

Also, the vast majority of critiques of VAMs concern using them on a year-to-year basis: single-year estimates are statistically unstable, and it is not valid to rate teachers on one year of data. Note that the LA Times used seven years' worth of data (with at least 60 observation points per teacher).
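The stability point can also be sketched (Python, invented parameters): with one year of roughly 30 students per teacher, estimates bounce around from year to year, while an average over seven years tracks the underlying effect much more closely.

import numpy as np

rng = np.random.default_rng(2)
n_teachers, per_year, years = 200, 30, 7
true_effect = rng.normal(0, 0.5, n_teachers)    # assumed spread of true teacher effects

def one_year():
    """One year's estimate per teacher: mean gain over that year's class (noise sd assumed)."""
    return true_effect + rng.normal(0, 3.0, (n_teachers, per_year)).mean(axis=1)

y1, y2 = one_year(), one_year()
print("same teacher, adjacent years r =", round(np.corrcoef(y1, y2)[0, 1], 2))    # unstable

avg7 = np.mean([one_year() for _ in range(years)], axis=0)
print("7-year average vs true effect r =", round(np.corrcoef(avg7, true_effect)[0, 1], 2))

Under these assumptions the year-to-year correlation of single-year estimates is modest, while the seven-year average correlates strongly with the true effect, which is consistent with the Times' choice to require many years of data per teacher.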