Published Online: April 16, 2010
Teacher Training No Boon for Student Math Scores
By Debra Viadero
First-year findings from a federal study of 77 middle schools suggest that even intensive, state-of-the-art efforts to boost teachers’ skills on the job may not lead to significant gains in student achievement right away.
The "Middle School Mathematics Professional Development Impact Study," which was released April 6, is the second major experimental study by the U.S. Department of Education’s Institute of Education Sciences to find that a high-quality professional-development program failed to translate into any dramatic improvements in student learning. A two-year study of efforts to improve teachers’ instructional skills in early reading reached a similar conclusion in 2008.
“What accounts for this somewhat consistent pattern of results? We don’t really know,” said Michael S. Garet, a vice president at the American Institutes for Research. His Washington-based organization conducted both studies with the MDRC research group of New York City. “I think what we’re learning,” Mr. Garet added, “is that it’s challenging to make a big enough difference in teacher knowledge and instructional practice to have an impact on student learning.”
The results are already providing intellectual ammunition for those who argue that the best way to improve instructional quality in schools is to select and retain effective teachers, and to shed those who are ineffective, rather than to retrain the teachers already in place.
The new study shows that “you can’t change teacher effectiveness very well with the tools that we have, and that you can’t change ineffective teachers into effective ones,” said Eric A. Hanushek, a senior fellow at the Hoover Institution, based at Stanford University. He is also the president of the IES advisory board, which heard a presentation on the new study’s findings last week.
But other scholars said it is too soon to issue a verdict on the effectiveness of professional development.
“We know teacher change takes time,” said Hilda Borko, an education professor who is also at Stanford. “The general belief is that it takes a while for teachers to take ownership of change and really incorporate change into their instruction.”
Findings in Detail
By the end of the school year, the researchers found, teachers who participated knew slightly more about rational numbers, overall, than their control-group counterparts, but the effect was not statistically significant. Those slight improvements were most notable on a test of pedagogical content knowledge, where teachers stood a 55 percent chance of getting a question right, compared with a 50 percent chance for their counterparts in the control group. (In comparison, the trainers’ chance of answering a question correctly on that test was nearly 93 percent.)
On the plus side, the training did lead to changes in teaching practice. Compared with the control group, the teachers in the experimental group were more likely to try to draw out students’ thinking by asking students whether they agreed with a classmate’s response, or inviting them to share their mathematical strategies.
The changes in practice were not dramatic enough, however, to translate into student-learning gains on the computer-adaptive tests that students took in the spring. The students, most of whom came from schools where more than half of students qualified for federally subsidized school meals, continued to score, on average, at the 19th percentile on the tests, which were developed for the experiment by the Northwest Evaluation Association of Lake Oswego, Ore.
Mr. Garet said the researchers are now analyzing results from a second year of similarly intensive teacher training in the same schools; they hope to publish those findings this year.
If those results show learning gains from the training, they might suggest the need for even more sustained professional development.
“You might need to have pretty intensive professional development all the time, every year, and then slowly get schools into a culture such that the expectation is that you always keep working on your knowledge,” said Sybilla Beckmann, a math professor at the University of Georgia in Athens and an adviser to the study.
The Case With Reading
The results from the 2008 reading study were based on two years of data, but just one year of professional development. That study also differed from the latest one in that it randomly assigned in-class coaching to only about half of the participating teachers, in an effort to see whether that approach yielded any added benefits for teachers or their students. It did not.
The reading study also found that, while the reading skills of the 2nd graders in the study did not improve, there were some measurable gains in teachers’ knowledge.
For the middle school math study, Mr. Garet said, the researchers tested various possible explanations for why the training had failed to affect student achievement. They ruled out the hypothesis, for instance, that the tests were too hard or too easy for the teachers or their students.
“As you move away from the study itself,” Mr. Garet added, “one hypothesis worth testing would be to see what would happen if the professional development aligned with what teachers were evaluated on.”
Ms. Borko, who is leading her own study of teacher professional development, said it also would have been instructive if the federal studies had measured the quality of the instruction that teachers were getting.