Published Online: April 16, 2010
Teacher Training No Boon for Student Math Scores
By Debra Viadero
First-year findings from a federal study of 77 middle schools suggest that even intensive, state-of-the-art efforts to boost teachers’ skills on the job may not lead to significant gains in student achievement right away.
The "Middle School Mathematics Professional Development Impact Study," which was released April 6, is the second major experimental study by the U.S. Department of Education’s Institute of Education Sciences to find that a high-quality professional-development program failed to translate into any dramatic improvements in student learning. A two-year study of efforts to improve teachers’ instructional skills in early reading reached a similar conclusion in 2008.
“What accounts for this somewhat consistent pattern of results? We don’t really know,” said Michael S. Garet, a vice president at the American Institutes for Research. His Washington-based organization conducted both studies with the MDRC research group of New York City. “I think what we’re learning,” Mr. Garet added, “is that it’s challenging to make a big enough difference in teacher knowledge and instructional practice to have an impact on student learning.”
The results are already providing some intellectual ammunition for finding better ways to select and retain effective teachers—and shedding those who are ineffective—as the best way to improve instructional quality in schools.
The new study shows that “you can’t change teacher effectiveness very well with the tools that we have, and that you can’t change ineffective teachers into effective ones,” said Eric A. Hanushek, a senior fellow at the Hoover Institution, based at Stanford University. He is also the president of the IES advisory board, which heard a presentation on the new study’s findings last week.
But other scholars said it is too soon to issue a verdict on the effectiveness of professional development.
“We know teacher change takes time,” said Hilda Borko, an education professor who is also at Stanford. “The general belief is that it takes a while for teachers to take ownership of change and really incorporate change into their instruction.”
.....
Findings in Detail
By the end of the school year, the researchers found, teachers who participated knew slightly more about rational numbers, overall, than their control-group counterparts, but the effect was not statistically significant. Those slight improvements were most notable on a test of pedagogical content knowledge, where teachers stood a 55 percent chance of getting a question right, compared with a 50 percent chance for their counterparts in the control group. (In comparison, the trainers’ chance of answering a question correctly on that test was nearly 93 percent.)
On the plus side, the training did lead to changes in teaching practice. Compared with the control group, the teachers in the experimental group were more likely to try to draw out students’ thinking by asking students whether they agreed with a classmate’s response, or inviting them to share their mathematical strategies.
The changes in practice were not dramatic enough, however, to translate into student-learning gains on the computer-adaptive tests that students took in the spring. The students, most of whom came from schools where more than half of students qualified for federally subsidized school meals, continued to score, on average, at the 19th percentile on the tests, which were developed for the experiment by the Northwest Evaluation Association of Lake Oswego, Ore.
Mr. Garet said the researchers are now analyzing the results from a second year of similarly intensive teacher training in the same schools, which they hope to publish this year.
If those results show learning gains from the training, they might suggest the need for even more sustained professional development.
“You might need to have pretty intensive professional development all the time, every year, and then slowly get schools into a culture such that the expectation is that you always keep working on your knowledge,” said Sybilla Beckmann, a math professor at the University of Georgia in Athens and an adviser to the study.
The Case With Reading
The results from the 2008 reading study were based on two years of data, but just one year of professional development. That study also differed from the latest one in that it provided in-class coaching, randomly, to only about half the group in an effort to see whether that approach yielded any added benefits for teachers or their students. But that did not turn out to be the case.
The reading study also found that, while the skills of the 2nd graders in the study did not improve, there were some measurable gains in teachers’ knowledge.
For the middle school math study, Mr. Garet said, the researchers tested various possible explanations for why the training had failed to affect student achievement. They ruled out the hypothesis, for instance, that the tests were too hard or too easy for the teachers or their students.
“As you move away from the study itself,” Mr. Garet added, “one hypothesis worth testing would be to see what would happen if the professional development aligned with what teachers were evaluated on.”
Ms. Borko, who is leading her own study of teacher professional development, said it also would have been instructive if the federal studies had measured the quality of the instruction that teachers were getting.
7 comments:
Considering that one of the programs that was used in the math study was CMP, how likely is it that any amount of PD would ever make up for poor curricula?
Did you see this footnote about CMP?
As noted earlier, the five day-long seminars were reordered in each district so that each seminar was scheduled when the topics covered by that seminar were being taught according to the district's curriculum pacing guide. For the three Pearson Achievement Solutions districts that used CMP, it was difficult to align the content of seminars 1 and 2 to primary topics in the curriculum. Although most of the units in the seventh-grade CMP curriculum include fraction review problems, none of the units or lessons made fractions a primary focus. The content of seminars 3–5 was more closely aligned with the primary topics in other units, two of which focused in-depth on ratio, proportion, and percent.
Sorry, the reference for that is page C-6 of the doc at http://ies.ed.gov/ncee/pubs/20104009/pdf/20104009.pdf.
But maybe it's just hopeless.
Note that in taking this approach, we implicitly assume that the relationship between the magnitude of the treatment impact and the baseline teacher knowledge test total score is linear.
This is amusing, because Wu often mentions that school math teachers overlook making this assumption explicit when they give rate problems and ratio/proportion problems to their students.
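The linearity assumption in the quoted passage amounts to modeling the treatment impact as an interaction term that varies linearly with baseline teacher knowledge. A minimal sketch of that idea in Python with simulated data (this is not the study's actual model; all variable names and numbers here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
baseline = rng.normal(50, 10, n)   # simulated baseline teacher knowledge score
treat = rng.integers(0, 2, n)      # 1 = received PD, 0 = control

# Simulated outcome in which the treatment impact grows linearly
# with the baseline score: impact = 2 + 0.1 * baseline.
y = 5 + 0.8 * baseline + treat * (2 + 0.1 * baseline) + rng.normal(0, 3, n)

# OLS with a treatment-by-baseline interaction; the coefficient on
# treat * baseline is the (assumed linear) change in impact per score point.
X = np.column_stack([np.ones(n), baseline, treat, treat * baseline])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coefs)  # estimates should land near the simulated values [5, 0.8, 2, 0.1]
```

If the true impact-by-baseline relationship were curved rather than linear, this specification would misstate the impact for teachers far from the average baseline score, which is presumably the worry the commenter is gesturing at.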
Do we have any decent ways to measure the connections between the inputs and the outputs? Or is the chain just so full of confounding variables that we never get anywhere?
--During the first year of implementation, there was a statistically significant and positive impact of the PD program on the frequency with which teachers engaged in activities that elicited student thinking (effect size = 0.48). Treatment teachers on average engaged in 1.03 more activities per hour that elicited student thinking. On average, teachers in the treatment group engaged in such activities 3.45 times per hour, compared with 2.42 times per hour for teachers in the control group. (See Figure ES-2.)
So, it's statistically significant--but we went from 2 to 3 in an hour.
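A quick back-of-the-envelope check of the quoted figures, assuming the reported effect size is the group mean difference divided by a standard deviation (the study's exact standardization isn't spelled out here):

```python
# Figures as quoted in the study excerpt above.
treatment_rate = 3.45  # activities per hour, treatment group
control_rate = 2.42    # activities per hour, control group
effect_size = 0.48     # reported standardized effect

diff = treatment_rate - control_rate  # the reported 1.03 activities per hour

# If effect_size = diff / SD, the implied standard deviation is:
implied_sd = diff / effect_size  # roughly 2.15 activities per hour

print(f"difference: {diff:.2f} activities/hour")
print(f"implied SD: {implied_sd:.2f} activities/hour")
```

So a 0.48 effect size here corresponds to about one extra student-thinking activity per hour against a spread of roughly two, which is why a "statistically significant" result can still look underwhelming in raw terms.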
Does that make you want to weep?
I haven't read the study.
Just based on anecdotal evidence from my years of watching teachers with poor math skills attempt to teach a content-light curriculum (EM), is it any shock that professional development can't compensate for the deficiencies?
If you want teachers who know something about math, maybe you should hire people who have actually mastered the subject?
Meanwhile, at just about the same time, a study has come out showing that "professional learning communities" do work.
These, of course, are far less expensive than 'intensive' or 'embedded' professional development.
(Will get a post up ----- )