kitchen table math, the sequel: Common Core re-defines fluency

## Tuesday, August 21, 2012

### Common Core re-defines fluency

More from the Education Week article on Common Core assessments:
> He pointed to one illustrative example in PARCC’s materials that tries to gauge students’ fluency in division and multiplication. It offers five equations, such as 54÷9=24÷6, and asks 3rd graders to specify whether each is true or false.
>
> “I like that it does multiple assessments in one item,” he said. “It asks kids to work each of those problems easily and be comfortable with it, which is what fluency is.”

Published in Print: August 22, 2012, as “Consortia Provide Preview of Common Assessments,” by Catherine Gewertz
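As an aside, the sample item is false: the two quotients differ. A quick check, purely for illustration (this is not part of PARCC's materials):

```python
# The sample item asks whether 54÷9 = 24÷6 is true or false.
# Both divisions are exact, so integer division gives the quotients directly.
left = 54 // 9    # 6
right = 24 // 6   # 4
print(left, right, left == right)  # 6 4 False
```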
Offhand, I don't see how a set of five equations like these can test fluency. Fluency isn't simply a matter of accuracy, ease, and comfort. Fluency includes speed, and to assess fluency you need fluency aims, or standardized rates of performance.

How fast should a third-grade student be able to answer these five questions? That's what you would need to know to use these equations to assess fluency, and one set of five simple equations probably isn't enough to measure speed.
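The point about standardized rates can be made concrete. In the precision teaching literature, fluency is typically measured as correct responses per minute against a target aim. A minimal sketch of that idea (the aim value here is a made-up placeholder, not an actual aim from the literature):

```python
def correct_per_minute(num_correct, elapsed_seconds):
    """Rate of correct responses per minute -- the kind of
    standardized performance measure a fluency aim refers to."""
    return num_correct * 60.0 / elapsed_seconds

# Hypothetical numbers: a student answers 40 basic facts
# correctly in 60 seconds.
rate = correct_per_minute(40, 60)
print(rate)  # 40.0

# A fluency aim is a target rate. 40/min is a placeholder
# for illustration only.
AIM = 40
print(rate >= AIM)  # True
```

Note that without timing, a five-item true/false set can only tell you about accuracy, not rate.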

Beyond that, I'm skeptical this is a fluency test at all. Seems to me it's more an application-of-knowledge test than a test of fluency per se.

Here's the list of fluency aims Rick Kubina culled from the precision teaching literature.

Maybe I should send a copy to PARCC.

SteveH said...

The problem with combining skills to assess critical thinking is that if the student gets an item wrong, you may not be able to work backwards to the exact cause. This happens with our current state tests: schools are left with vague statistics on things like problem solving.

I was in a meeting long ago where teachers and parents were looking at the results of state testing that indicated that problem solving skills had gone down. Their answer was to tell teachers to spend more time on problem solving.

But how complex is the meaning of the given true/false equality? Those who have seen this type of problem before know what to do; there is no critical thinking involved. Those who have not seen a question posed this way could be flustered by what it means, especially on a timed test. The results can't separate out what's going on, and they surely aren't giving you any feedback on real critical thinking versus mastery of basic skills.

As for fluency, one should test simple times-table skills independently. The results will give you a pretty clear indication of what to fix. Testing critical thinking is more difficult; it may just reveal whether a student has seen something before. Two- or three-step problems would be a better approach, but some educators feel that even that is too rote. They try to find non-ordinary ways of presenting problems that presume to test critical thinking, but may just test whether students have seen the problem before. That reminds me of SAT math.

Critical thinking is built upon mastery of basic skills, but many don't like tests that check basic (whether one, two, or three step) problems. Teachers in classrooms can better evaluate whether critical thinking and general problem solving are taking place. State testing is not the place to try to figure that out. It should just be a check to see if there are any basic problems at a school.

CCSS is not a vehicle for seeing whether students are ready for STEM degree programs in college. Its backers are having trouble just convincing colleges that passing something like PARCC means students don't need remedial work.

SteveH said...

"Here's the list of fluency aims Rick Kubina culled from the precision teaching literature."