kitchen table math, the sequel: Articles you may have missed...

Saturday, September 11, 2010

Articles you may have missed...

I did, while I was out of town.

From the New York Times: Forget What You Know About Good Study Habits
 But individual learning is another matter, and psychologists have discovered that some of the most hallowed advice on study habits is flat wrong. For instance, many study skills courses insist that students find a specific place, a study room or a quiet corner of the library, to take their work. The research finds just the opposite.
And from the Washington Post, an opinion piece: School reform's meager results
Standard theories don't explain this meager progress. Too few teachers? Not really. From 1970 to 2008, the student population increased 8 percent and the number of teachers rose 61 percent. The student-teacher ratio has fallen sharply, from 27-to-1 in 1955 to 15-to-1 in 2007. Are teachers paid too little? Perhaps, but that's not obvious. In 2008, the average teacher earned $53,230; two full-time teachers married to each other and making average pay would belong in the richest 20 percent of households (2008 qualifying income: $100,240).
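The figures in the excerpt are easy to sanity-check with a bit of kitchen-table arithmetic. Here is a quick sketch using only the numbers quoted above (the 2008 average salary, the top-quintile cutoff, and the 1970–2008 growth rates):

```python
# Sanity check of the quoted Washington Post figures.
# All inputs come from the excerpt above; nothing else is assumed.

avg_teacher_salary_2008 = 53_230    # average teacher pay, 2008
top_quintile_cutoff_2008 = 100_240  # qualifying income, richest 20% of households

# Two full-time teachers, both earning average pay:
household_income = 2 * avg_teacher_salary_2008
print(household_income)                             # 106460
print(household_income > top_quintile_cutoff_2008)  # True

# Ratio claim: students up 8%, teachers up 61% from 1970 to 2008,
# so the student-teacher ratio shrinks by a factor of 1.08 / 1.61.
ratio_factor = 1.08 / 1.61
print(round(ratio_factor, 2))  # 0.67
```

A two-teacher household does clear the cutoff by about $6,200, and a roughly one-third drop in the ratio is consistent in direction with the quoted fall from 27-to-1 to 15-to-1 (the quoted ratio covers a longer period, 1955 to 2007, so the factors need not match exactly).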


ChemProf said...

The WaPo link needs fixing: it starts "tp" rather than "http".

CassyT said...

Thanks, ChemProf - it's fixed now.

Bostonian said...

Drill, Baby, Drill by VIRGINIA HEFFERNAN, New York Times, September 16, 2010:

The word “drill” has come to define bad teaching. The piercing violence that “drilling” evokes just seems not to belong in sensitive pedagogy. Good teachers don’t fire off quiz questions and catechize kids about facts. They don’t plop students at computers to drill themselves on spelling or arithmetic. Drilling seems unimaginative and antisocial. It might even be harmful.

“In educational circles, sometimes the phrase ‘drill and kill’ is used, meaning that by drilling the student, you will kill his or her motivation to learn,” explains Daniel Willingham, a University of Virginia professor of psychology who has written extensively on learning and memory. “Drilling often conjures up images of late-19th-century schoolhouses, with students singsonging state capitals in unison without much comprehension of what they have ‘learned.’ ”

Oh, those schoolhouses — with the hickory sticks and the dunce caps. “Harrisburg! Salt Lake City! Montpelier! Tralalalala!” That does sound kind of fun — I mean, authoritarian. And drilling hardly has a better reputation outside academia. On message boards, students complain bitterly about Kumon, the extracurricular Japanese system of worksheet drills that many also admit has made them superb at math. Only unsportsmanlike parents hellbent on raising valedictorians, it seems, require their kids to do such rote work. At the same time, parents dismiss cutesy, flashy apps and Web sites that drill students using elaborate animation (like PopMath for arithmetic, iFlipr for custom flashcards, Cram for custom practice tests) as superficial edutainment, on par with children’s TV.

But while drilling might not look pretty — students doing drills don’t tend terrariums or don wigs to re-enact the Constitutional Convention — might it nonetheless be a useful way for some students to learn some things? By e-mail, E. D. Hirsch Jr., the distinguished literary critic and education reformer, told me that far from rejecting drilling, he considers “distributed practice,” the official term for drilling, essential. A distributed practice system, Hirsch explained, “is helpful in making the procedures second nature, which allows you to focus on the structural elements of the problem.”


SteveH said...

" ...“distributed practice,” the official term for drilling, ..."

When kids get a problem set in math, it's not drill and it's not distributed practice. Drill implies speed, and Hirsch comments that:

"A distributed practice system, Hirsch explained, 'is helpful in making the procedures second nature, which allows you to focus on the structural elements of the problem.'"

A problem set is all about fully understanding a new topic. Distributed practice is important, but that's a separate issue. You can focus on "the structural elements of the problem" before achieving some high level of proficiency; Hirsch's point really applies only to previously learned knowledge and skills that have to be brought to bear on the current problem set.

I think this is an important distinction. Many educators look at math textbook problem sets as drilling, which implies that they either don't understand what's going on or that they just don't like hard work. Perhaps they like an indirect path where conceptual knowledge leads to mastery of skills and problem solving. It never happens. They confuse distributed practice with learning the material in the first place. That's the idea of Everyday Math: they don't expect you to learn it the first time. Distributed practice (via the spiral) is really just distributed learning.

I interpret "Drill and Kill" as claiming that there is another path to the same result. In reality, it just hides the fact that it's a different destination. Lower expectations.

concerned said...

Check out Report to the President on STEM education:

Barry Garelick said...

Yes, please check it out. Here are some excerpts:

Assessments...need to foster high-quality teaching rather than discourage it. Such assessments should measure higher levels of thinking and reasoning as well as students’ content knowledge and skills....[W]hen teachers aim to increase student scores on these assessments, they should foster all the types of learning that the standards emphasize – not merely the factual recall aspects of learning that are by far the easiest and least expensive to test. A good assessment encourages quality teaching and learning. This is no small feat given that excellence in STEM education means cultivating in students not simply the ability to answer predictable questions, but the capacity to pose probing questions and to figure out methods of answering those questions.

Most current assessments fail to meet these goals. Observers and educators note that they tend to over-represent low-level skills and factual recall, which may lead teachers to drill students on specific skills and facts they need for the test. Current tests often do a mediocre job of measuring the understanding and application of core concepts and principles, and they typically neglect the higher-order reasoning, problem-solving skills, and mathematical and scientific creativity that students need for college and for their careers. (p. 49)

[National Math Curriculum Standards That Emphasize Critical Thinking]

Many of the state-level standards emphasized low-level skills and large bodies of factual content rather than the high-level abilities and central concepts emphasized in the national standards. (p. 42)

The assumption in this report is that critical thinking skills can be taught. Procedural skills and memorization are given lip service and looked upon as low-level stuff. Let's repeat a phrase from above: "expose students to elegant concepts and patterns in mathematics so they can understand its beauty while also teaching them skills they need to apply those concepts."

"Expose," as in math appreciation. "Elegant concepts and patterns in mathematics": after all, math is all about patterns, according to the edu-world. "While also teaching them skills they need to apply those concepts": the "just in time" approach to teaching. We all know how well that works. See also this article for a refresher on the emphasis on "understanding" in the Common Core standards.

Assessments that measure higher-order thinking skills. Such assessments "should foster all the types of learning that the standards emphasize – not merely the factual recall aspects of learning that are by far the easiest and least expensive to test." So why is it that US students don't do as well on TIMSS as students in other countries if tests like TIMSS measure such low-level skills? Oh, right: the students there are separated out in the early grades and are destined for college, whereas in the US we educate everyone. Nice argument, except that when you compare the top 10% in each country, our students still score below those in the Asian countries. Interestingly, even on the PISA exam, which is rather fuzzy and consists of open-ended questions, students in countries that allegedly emphasize memorization and procedures seem to do better than US students. Well, I suppose there are answers to those questions that we'll hear about.

Among the people involved in this latest report are Deb Ball, Andy Isaacs and Glenda Lappan.

I notice no reference to the National Math Advisory Panel's report. Oh right; different administration--doesn't count.