kitchen table math, the sequel: NYT Writes "Fuzzy Maths" Obituary

Friday, April 25, 2008

NYT Writes "Fuzzy Maths" Obituary

Study Suggests Math Teachers Scrap Balls and Slices - New York Times:

"In the experiment, the college students learned a simple but unfamiliar mathematical system, essentially a set of rules. Some learned the system through purely abstract symbols, and others learned it through concrete examples like combining liquids in measuring cups and tennis balls in a container.
Then the students were tested on a different situation — what they were told was a children’s game — that used the same math. “We told students you can use the knowledge you just acquired to figure out these rules of the game,” Dr. Kaminski said.
The students who learned the math abstractly did well with figuring out the rules of the game. Those who had learned through examples using measuring cups or tennis balls performed little better than might be expected if they were simply guessing. Students who were presented the abstract symbols after the concrete examples did better than those who learned only through cups or balls, but not as well as those who learned only the abstract symbols.
The problem with the real-world examples, Dr. Kaminski said, was that they obscured the underlying math, and students were not able to transfer their knowledge to new problems."


What more can you say? Confusing concrete word problems bad, underlying abstract math good.

via Flypaper

Note: post edited. The use of the word abstract throws me. And I know I haven't been actively posting; I have been working full time, raising five kids, and taking a full-time course load. Alaska is cool, though; we got 18 inches of snow today.

14 comments:

David said...

I read the article in Science, and I am not convinced by the study. They tried to teach the concept of the integers modulo three, using concrete examples such as measuring cups, tennis balls, and slices of pizza, and they found this approach ineffective. But does this really prove that an abstract approach is better, or does it only show that the concrete models that they chose were confusing?

TurbineGuy said...

"or does it only show that the concrete models that they chose were confusing?"

I'm going with concrete examples being confusing, because seriously, who cares how far apart two trains traveling in opposite directions end up?

Anonymous said...

But the subjects were not asked to solve word problems. They were taught an abstract mathematical system, namely addition modulo 3. The students who were taught the rules performed better than the students who learned through examples.

Not surprising. But the examples they used, such as combining liquids in measuring cups, seem contrived to me, and not clearly connected to the concept being taught. (Please excuse my alliteration.)
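
For anyone who wants to see what the subjects actually had to learn, here is a minimal sketch of addition modulo 3 written out as a bare rule set (my own illustration in Python, not something from the study; per the article, the "abstract" group saw the same rules presented with arbitrary symbols rather than digits):

# Addition modulo 3: combine two elements of {0, 1, 2} and wrap
# around whenever the ordinary sum reaches 3.
def add_mod3(a: int, b: int) -> int:
    return (a + b) % 3

# Print the complete rule set a subject would have to internalize.
for a in range(3):
    for b in range(3):
        print(f"{a} + {b} = {add_mod3(a, b)}")
# e.g. 2 + 2 = 1, because 4 wraps around past 3 to 1.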

Catherine Johnson said...

Rory!

Hi!

10 inches of snow-----yowza!

(Haven't read the study yet - I've got a copy...)

Anonymous said...

Math Wars as Pedagogy

I teach math in a ‘fuzzy’ district (CMP and Investigations for K-8) and I struggle with this question every day. Lately, I’ve sort of elevated my reflection above the war over specific programs. Here’s where I am (today), and I’d like to throw it out there for discussion…

First, separate teaching math into two domains. Domain 1 is content. Domain 2 is the delivery system or pedagogy. I think of it like UPS. Content is what’s in the truck and pedagogy is the delivery system.

What’s in my truck? It’s the content that I’m asked to teach by my state standards. I teach in Massachusetts, which has widely admired standards (I’m not an admirer) that are spiraling in nature. I detest the spiral. It breaks up what is arguably humankind’s most coherent body of knowledge into arbitrary grade-level pustules that cast a fog over a Mona Lisa.

Am I clear on how I feel about that? It has kids adding and counting to different numbers of significant digits. It has addition cut off at two digits in one year and picked up in a subsequent year to add one more digit. Really lame, if you’re trying to show the continuum.

This domain has little to do with the math wars but for one very significant exception.
Our standards are intimately entwined with our books. You might even say the relationship is too cozy. I know of places in the standards where the terms used are right out of the Investigations literature. The Terc/Cambridge/Boston thing might have some influence on this, just sayin’.

Anyway, what happens with this is that it’s really easy to mix up the fuzziness of an essentially inquiry-based pedagogy with the fuzziness of content (in a spiral). There are lots of ways to introduce fuzz, and it helps me, at least, to be sure which type I’m dealing with.

When it comes to the pedagogy domain, I envision a continuum with the Socratic Method on one extreme and DI (Direct Instruction) on the other. I use the term extreme advisedly, not pejoratively. The Socratic Method entails pure questioning, eliciting answers from students, while DI is pure answers delivered to the questioners. They are polar opposites.

The pedagogy of my fuzzy books is somewhere between the Socratic and DI. It sort of frames a high-level question for students and then dumps it on them to ‘explore’ and come up with stuff on their own.

What usually occurs in the exploration is a kind of Socratic coaching on the fly as you race from group to group trying to avert meltdowns. Or sometimes, your exploration devolves into a DI session.

When I bring it all home I realize that there is no single delivery (domain 2) method that always works. I’ve had days where a purely Socratic approach went really well and others where a more DI approach went well. I also can’t remember too many days when working in the ‘middle’ didn’t ultimately find me moving to one end or the other after much wasted time.

Since my district is ‘failing’, teachers are straitjacketed into the fuzzy delivery of fuzzy content, and going off the reservation is perilous.

I would really like to know the extent to which spirals have invaded the fuzzy texts. Are all the inquiry-based/constructivist approaches also spiraling? Are they inseparable? Does this make any sense at all?

Anonymous said...

I'm very reluctant to conclude from a study on *college* students that abstract is good and concrete is bad for 1st graders.

I do think that there is way too much emphasis on the concrete in fuzzy K-12 math courses; I just don't think that this study shows it.

Maybe we can conclude that college students deal well with abstractions...

-Mark Roulo

Anonymous said...

Our standards are intimately entwined with our books. You might even say the relationship is too cozy.

Same here in IL. How can any school break away when the standards line right up with the constructivist texts? How and when did that happen?

All of the constructivist texts that I've heard about involve a spiral. I think it has to do with not pressuring anyone to master anything at that exact moment. They can just get it next year when they spiral back around.

It also seems to help alleviate accountability where student achievement is concerned. You didn't fail that student, he's just on a different spiral.

Of course, when does the spiral end, or catch up, or whatever it's supposed to do? Middle schoolers who never learned basic arithmetic now have to learn things that require mastery of that arithmetic.

SusanS

VickyS said...

Susan: You ask how it happened that the standards line right up with the constructivist texts. I recall a statement on the official Everyday Math website years ago (now gone) saying outright that as state standards became more aligned with the EM curriculum, the scores of students using EM would go up. The implication was that the state standards needed to catch up with the curriculum, and I bet this was in no small part helped along by fuzzy math publishers.

I try not to pay much attention to state test scores because they can be so deceiving. If your state tests and your curricula are both fuzzy, the students score better than if there is a mismatch (good test, fuzzy curriculum; or fuzzy test, good curriculum). Unless you know for sure that your state standards are not fuzzy, AND your state test is not fuzzy (which it can still be, even if your state standards are not fuzzy), then how the kids are scoring doesn't tell you much.

VickyS said...

Paul: I would not characterize the Socratic method and DI as being polar opposites, because both are teacher-led instructional strategies. The Socratic method, with the teacher up at the front of the class, holding the whole class's attention, and moving them incrementally from one point to the next with carefully structured questions, can be extremely effective.

I think the endpoints of the delivery spectrum are better defined as, on the one hand, student-led, group-based discovery learning, and, on the other, whole-class, teacher-led instruction.

I do like your distinction between the two domains: content and delivery system. I have had arguments with curriculum people who insist that textbooks like EM and TERC do not constrain a teacher's "delivery" or "instructional methods" but I beg to differ: they are all about delivery, and the fuzzy content and delivery methods are highly intertwined.

Anonymous said...

VickyS: Good point. It's just how I think about it. Your mileage may vary.

One of the things that shaped my analysis is that in my experience with student-led instruction, it usually isn't student-led. Maybe 10% of my kids have the requisite skills to thrive in this environment, i.e., they have enough dots in their ZPD (zone of proximal development) to connect them.

For most kids, student-led learning rapidly (within 5 minutes or so) becomes a decentralized Socratic tap dance where I'm running from 'student-led' group to 'student-led' group asking the probing questions to get them off the dime. This might get another 15% moving in the right direction.

Then at the 10-minute mark the remaining kids start to get DI (without the rigor). Combine this with my fuzzy curriculum (which, BTW, has way too many topics) that allows no time for practice, and you get 6th graders who can't add, subtract, multiply, or divide.

VickyS said...

For most kids, student-led learning rapidly (within 5 minutes or so) becomes a decentralized Socratic tap dance where I'm running from 'student-led' group to 'student-led' group asking the probing questions to get them off the dime.

Righto. And you're a good teacher to step in and do this tap dance! My sons' 5th and 6th grade math teacher just sat at the front of the room, reading email during these student-led sessions.

6th graders who can't add, subtract, multiply, or divide are way too common.

I'm currently coaching the top 5th graders in a nearby school in a math competition, and more than half of them have not mastered the basic operations facts. They get stuck, really stuck, on evaluating expressions such as 6*9 without a calculator. Sad to say, the ones who can do it were homeschooled in the earlier grades.

Anonymous said...

I feel it's important to speak to kids at their height while they are working, so I do my tap dance on my knees and the toes of my shoes. I go through a pair of shoes every year.

If you want to know if your math teacher is working, check out the shoes. If the tops are worn out, good news. If they aren't, probably lots of email and surfing goin' on.

Instructivist said...

"Since my district is ‘failing’, teachers are straight jacketed into the fuzzy delivery of fuzzy content and going off the reservation is perilous."

Isn't it perverse? Here in Chicago, failing schools (most of them) have constructivist math and science forced on them. It's something called CMSI, mandated from above. http://www.cmsi.cps.k12.il.us/

Anonymous said...

One of the disturbing things about the constructivist curriculum is its penchant for refusing to take sides. It will develop multiple ways to accomplish something but never drives home the most efficient of the various alternatives.

I think developing multiple solutions is great, but some are good as modeling tools, some are good for special cases, and some are always efficient. These methods need to be presented with some context, or else kids develop misconceptions by picking whichever one they are comfortable with.

I spotted a student a few weeks ago who was repeatedly working on sums that came to 13. He used his fingers on each pass. I asked him if he knew the answer without using his fingers. He answered, "Sure, 13."

Then I asked why he was using his fingers if he knew the answer. "It's easier." I took that to mean more comfortable (than trusting his new knowledge). It occurs to me that nobody (the student is in grade 5) ever set an expectation with this kid that he needed to commit to and trust his memory.

I haven't seen this guy using his fingers since that day.