Under the traditional curriculum, why didn't mastery learning become the norm?
With the exception of the haphazard presentation of material in some of today's constructivist texts, traditional texts typically present a lesson, provide some practice, and then move on to the next topic in the sequence. Many students learned the material using this approach, but there was no effort to get the student to master the material and firmly place it into long-term memory, where it is somewhat protected against the ravages of forgetting. The inevitable result is that the student partially or fully forgets much of the material once the class moves on, unless the skills taught are used in subsequent lessons (as in elementary math). It was rare that we ever got a cumulative exam at the end of the year, which was probably intentional because most of the students had forgotten the material taught in the first half of the year. This practice was mitigated to an extent by the fact that much of the material was retaught year after year--a precursor to today's spiral curriculum.
Nonetheless, this seems to me a horribly inefficient way of teaching. Yet it seems to have developed as the dominant form of (pre-constructivist) instruction by the latter half of the 20th century.
The question is why did it develop this way? Why not mastery learning?
Before you answer, take a look at this blurb from Engelmann's new book (pp. 30-31):
Mastery is essential for lower performers. Unless the practice children receive occurs over several lessons, lower performers will not retain information the way children from affluent backgrounds do. Prevailing misconceptions were (and are) that children benefit from instruction that exposes them to ideas without assuring that children actually learn what is being taught. If you present something new to advantaged children and they respond correctly on about 80 percent of the tasks or questions you present, their performance will almost always be above 80 percent at the beginning of the next session. In contrast, if you bring lower performers to an 80 percent level of mastery, they will almost always perform lower than 80 percent at the beginning of the next session.
The reason for this difference is that higher performers are able to remember what you told them and showed them. The material is less familiar to the lower performers, which means they can’t retain the details with the fidelity needed to successfully rehearse it. After at-risk children have had a lot of practice with the learning game, they become far more facile at remembering the details of what you showed them. When they reach this stage, they no longer need to be brought to such a rigid criterion of mastery. At first, however, their learning will be greatly retarded if they are not taught to a high level of mastery.
This trend was obvious in the teaching of formal operations. At first, the low- and high-performing groups were close in learning rate. Later, there were huge differences. Group 2 was able to learn at a much higher rate, largely because it was not necessary to bring them to a high level of mastery. On several occasions, I purposely taught the children in Group 2 to a low level of mastery (around 60 percent). I closed the work on the topic with one model of doing it the right way, and I assured the children that this was very difficult material. At the beginning of the next lesson, almost all of them had perfect mastery.
So, I think the answer to my question as to why mastery learning didn't become the norm is simply that it wasn't needed. Why go through all the effort of mastery learning when the higher performers really didn't need it to learn? If teachers are gauging their performance by the feedback they receive from the successful students (for whom only 60% mastery is needed), it's easy to see how they could reach the false conclusion that that's all the teaching any student needs. And human nature being what it is, why teach more when less will do?
Nonetheless, I think we now know enough about how the brain works to know that retention of learned material is greatly enhanced when the learner engages in distributed practice after the initial massed practice. All students would benefit from distributed practice. So why haven't traditional educators changed their ways to offer more of it?
I understand there is a philosophical objection to distributed practice (i.e., "drill and kill") at the elementary school level. But what about at the secondary and post-secondary level, where traditional education is still the norm? At this level, distributed practice just means giving a couple of additional independent work problems that keep previously taught material alive for the student until it is better retained in long-term memory. So why are classes at these levels still taught as if the need for distributed practice didn't exist?
Moreover, if the goal is to eradicate the worst practices of constructivist teaching, wouldn't it be beneficial to improve traditional teaching methods by incorporating techniques that improve student performance? One of the reasons constructivism has gained the foothold it has is the underperformance of the traditional curriculum, especially among lower performers.