kitchen table math, the sequel: 8/23/09 - 8/30/09

Thursday, August 27, 2009

a math class of 41 pupils

Ms. Sherry Tai, my young Singapore primary school math teacher (I think she was in her mid-20s), formed a personal bond with each one of us 41 students. 40+ students is a pretty typical class size at the primary and secondary school level. She was also our science teacher, our form teacher and our PE teacher, though plenty of other teachers taught our class as well.

I remember her well because I was a "problem student" (partially because of my harebrained personality, partially because I had come from an American middle school...) and she frequently confronted me about my performance.

What was her math class like? Ah, I remember sometimes she would be marking workbooks (and we'd be doing some other work, like practice problem sets), and she'd call people up individually about their work.

Being the problem student, I was frequently called up, of course. Being called up was annoying and sometimes intimidating because she'd be like, "why did you do this problem wrong?!" and you'd be like, "huh? I don't see what I did wrong?!" but then you'd be hushed with her explanation and sent back to your seat to redo the problem.

I'd feel sort of smug when the prefects and class monitor and monitress (and other model students) would get called up, and I wouldn't. Ahh, incentive for doing good work! As I got the hang of Singapore Math, I found myself being called up less and less. The dreaded 5-point word problems became less like monsters and more like delightful challenges I tackled with confidence. One thing that didn't seem to go away, however, was my tendency to forget to bring some little thing to school (like, say, a worksheet, an arts and crafts item required for that day, sometimes stationery, like correction pens...). A problem I had suffered in American elementary school. A problem I also suffered later in college.

And oh yeah, correction pens. Let me tell you about those. In Singapore, stuff is often done with three sorts of pens. You need to bring a blue/black pen, a red pen, and a green pen to school. This colour requirement caused me endless grief initially (after moving from America) because I had a knack for losing pens, or for just not bringing all of them. But why this colour scheme?

Well, let's say you're doing a math workbook. You do the problem in blue/black ink. Your teacher (or your classmate) marks it in red ink. Sometimes, you do the marking for others, so you also need a red pen. If you get a problem wrong and receive it back, you're supposed to do the correction in green pen. (This scheme also readily applies to science and language work...) You do this for workbooks, worksheets, mock exams, real exams (after you get them back). You even do it for group work (dun dun dun). The colour scheme helps you keep track of (and organise) what you did wrong and what you did right.

When I moved from America, of course, I was not used to this system, so often I'd return marked workbooks without doing all of the required corrections, or I'd do the green pen corrections wrong, which usually resulted in a sharp "John Soong! Up here please!" every math class for the first few weeks. Yes, the corrections are supposed to include your working. (Didn't know that at first.) Yes, doing corrections in green ink for a 5-point problem you got 4 points off of was tedious.

Now, I actually have little idea what "rapid formative assessment" is supposed to mean rigorously. We had plenty of "assessment books" though, and as the PSLE approached, she made us buy additional assessment books on top of what the syllabus required, just so she could have the enjoyment of marking more of our work. Oh, and she would schedule remedial classes after school. For all 41 of us. ("Unless you had a 100 on your CA2 [no one did], you have to attend.") And by now, I'd faithfully do the problems, the working, the corrections (which became fewer) ... but forget things like the $2.50 I was supposed to bring to pay for the extra book, a parent's signature required for the remedial class, or the Edusave form. I think in one term (about two and a half months) I'd generally accumulate ten infractions.

None of the classwork was really graded. I can't remember the exact breakdown, but I think the two CA exams accounted for something like 30% of your grade and the two SA exams for 70%. And the infractions ... well, other than being embarrassing, they weren't really life-changing. (What happened is that the group with the fewest collective infractions at the end of the term won some sort of reward.) So I guess all those workbook problems were "formative assessments", sort of? Sometimes even one of the major exams (the CA1) had no impact on your grade, or a very low weighting (like 5%).

Does Singapore education have a cultural component? Prolly. But it's actually rather simple:

a) The work isn't graded. But teachers will nag you about it if you don't do it correctly. Incessantly. Even if you're only one out of 41 students.
b) It doesn't matter that the CA1 doesn't impact your final grade that much. It's just something you don't want to do badly on. Not only will the teachers nag you about bad results, so will your parents. And it goes on your report book. In fact your end-of-year grades don't affect your GPA, because GPA doesn't exist in primary and secondary school.

If you have a "bad home environment", your parents might not nag you about your performance, but there's still plenty of reprimand to face at school. In fact, my single mother almost never made me do my homework. My teachers did.

Monday, August 24, 2009

Math appreciation

One of the junior high schools in our district is an "international baccalaureate" school.

From the school's website:
International Baccalaureate Middle Years Program (IB-MYP)

The IB program has been in existence for several decades and has now reached over 2,000 schools and 124 countries. It is an internationally recognized and highly reputable program designed to fully engage students with an aim of creating a better and more peaceful world. It is catered specifically to the early puberty and mid-adolescent student. The MYP helps students develop the attitudes, life skills and knowledge necessary to participate fully in a growing and changing world.

It is vested in the ethics and values of young people, and its unique characteristics allow students to make connections between subjects, link what they learn to the real world, and reflect on their learning.


An explanation of the MYP (Middle Years Program) Mathematics program follows. Emphasis mine.

MYP mathematics expects all students to appreciate the beauty and usefulness of mathematics as a remarkable cultural and intellectual legacy of humankind, and as a valuable instrument for social and economic change in society.

MYP Aims

The aims of any MYP subject and of the personal project state in a general way what the teacher may expect to teach or do and what the student may expect to experience or learn. In addition they suggest ways in which the teacher and the student may be changed by the learning experience.

The aims of teaching and learning mathematics are to encourage and enable students to:

• recognize that mathematics permeates the world around us
• appreciate the usefulness, power and beauty of mathematics
• enjoy mathematics and develop patience and persistence when solving problems
• understand and be able to use the language, symbols and notation of mathematics
• develop mathematical curiosity and use inductive and deductive reasoning when solving problems
• become confident in using mathematics to analyze and solve problems both in school and in real-life situations
• develop the knowledge, skills and attitudes necessary to pursue further studies in mathematics
• develop abstract, logical and critical thinking and the ability to reflect critically upon their work and the work of others
• develop a critical appreciation of the use of information and communication technology in mathematics
• appreciate the international dimension of mathematics and its multicultural and historical perspectives.

Sunday, August 23, 2009

Always worse than you think, Congressional version

It's always worse than you think. CPSIA, that is. The Consumer Product Safety Improvement Act, passed overwhelmingly by Congress, is doing damage to vast swaths of the children's fashion and toy industries as it mandates, among other things, outrageous new testing and tracking by manufacturers, retailers, and resellers to prove products are in line with overly stringent lower limits on lead levels, phthalates, and other chemicals.

The law is so poorly written that it's affecting non-children's products in all sorts of industries. Of course, it's mostly small businesses that can't afford to come into compliance. The educational products industry is filled with such small vendors. Here's a quote from one in an AP story:
But some small businesses, like American Educational Products in Fort Collins, Colo. — it sells classroom teaching aids like flash cards, animal models, globes and relief maps — say the testing and labeling costs are crippling to their operations even though their products are safe. They want the law amended to exempt products that present little or no risk to young children.

"The challenge as a small business is that I cannot do it all (the testing) immediately," said Michael Warring, president of AMEP. "I would have to spend a full year of revenue to test every product I sell."

Warring recently laid off four of his 70 employees. In his 15 years with AMEP, he has not had one safety recall or complaint about lead.

Even so, Warring says he is required to test samplings of all products he makes and sells for young children, which he said costs about $2,000 per product. The tracking labels will add another cost, he says, since they must be a permanent marking on each product.
Another industry hit hard is the scientific equipment industry. Here are some stories about how it is affecting science classrooms.

From the Amend The CPSIA blog,
Heathrow designs and manufactures items for use by trained laboratory technicians. ... Heathrow directly employs 13 individuals in Vernon Hills, Illinois and 1 in Great Britain. Heathrow recently received a request from one of its U.S. customers to certify that its products meet the standards set forth in the CPSIA.

Why, you may ask, would a company that designs, and manufactures, products for use by trained laboratory technicians, in professional labs, be asked to certify that its products meet standards set forth in a law that deals with safety standards for children’s products? The answer is that...this particular customer of Heathrow sells the Heathrow product range into the middle school science classroom marketplace...Therefore, they think they need to have on file certification from their suppliers that these products meet the CPSIA standards...Our products are not designed for use by children... if products are not designed for use by children, they are not subject to the CPSIA. However, many companies are spooked by the fact that this law has mandatory $100,000 per occurrence fines and felony criminal sanctions. They do not want to go to jail for selling products that violate the CPSIA, nor can they afford to risk $100,000 per occurrence fines.

So, they will either get their certifications or drop the products. This means that our products will no longer be available for use by middle school science teachers (who apparently found a use for them in teaching biology, chemistry and other sciences).
This isn't the only manufacturer whose microscopes won't pass CPSIA testing. The solder on microscope light bulbs fails the lead test too. So, no microscopes at all.

But that's okay, you probably wouldn't have had anything to look at anyway.
From here:
"First, Michael Warring of American Educational Products reports that a school opted to stop using AmEP's rocks to teach Earth Science and will instead rely on a POSTER... The continued ragging of consumer groups about "toxic toys" sullies the reputation of all good companies and their good products. In this case, rocks take on the "toxic" tag because they contain uncontrollable amounts of base elements found in nature.


It gets worse. Nearly all science kits could fail because of the lead in the insulation on their wires, as happened in the case of the Potato Clock.

From the above blog again:
"recently a manufacturer of the Potato Clock decided to test its version for compliance with the newfangled CPSIA. In their eager beaver-ness, they shot themselves in the foot, discovering (horrors) that the insulation on the product's potato wires contain trace amounts of lead over the arbitrary limits of CPSIA...

First, the company decided that since it now knew of the test failure, it had an immediate reporting obligation under CPSIA Section 15(b). In addition, they concluded they had an obligation to immediately stop sale, since continuing to sell would be another "knowing" violation - yes, kids, that's a felony with possible penalties of jail time and asset forfeiture (goodbye house and car!)...

The CPSC, apparently, upon receiving this (unwanted) 15(b) report concurred - yep, the wire insulation exceeds the standard, and yep, you have to stop sale. No recall was required by the CPSC BUT the company appears to have decided almost immediately that an informal recall was mandated. Why might they have decided such a thing? Well, perhaps they had a generalized fear of liability from dealers who might be sued for selling this "dangerous" device if it ever came to light that the product had impermissible lead in the wire insulation....

But the WORST part of this story, the most chilling, is the part about the wire insulation. The Potato Clock was recalled for having too much lead in the wire insulation. Why did it have lead in it at all? Wire insulation contains lead because it is recycled vinyl, probably recovered principally from scrap of other wire...

The real problem comes from the fact that the Potato Clock utilizes "ordinary" wire. Everyone and everything utilizes "ordinary" wire. No specially-coated wire is used in children's products and even if it were available, it would be too expensive for this kind of application. Potato Clocks should use "ordinary" wire. If ordinary wire will always fail the CPSIA standards because of its insulation, then everything using wire in schools can't be sold for use by children under 13 years of age. This means, among other things, no electricity education before the 7th grade in this country (and only for the 13 year olds in the room - the 12 year olds will have to leave the room until their birthday)."

On the bright side, at least it will end discovery learning.

Bleg: Curriculum Ratings or reviews for elementary math or reading?

I've probably even asked this question before, and I know that some of this was on oldKTM, but, here goes again.

We know about TERC Investigations and Everyday Mathematics, but what about the lesser-known curricula, especially for elementary and middle school (the stuff before algebra 1)? Have there been good write-ups of the various products from McGraw Hill or Houghton Mifflin? Anyone with personal experience with any other curricula?

What about reading products? Are all products claiming to be balanced literacy the same? How about McGraw Hill's Treasures? Other than SRA, who does reasonable phonics-based instruction?

Education Evolution: From the pop quiz to formative assessment

So, let me make sure I have this right:

Schools now ask students to take tests that don't count, on material they haven't seen, in order to determine how to judge later how much those same students have learned, whether or not the teacher really teaches it. They call this summative assessment. And they give tests that don't count, on material the students may or may not have seen, in order to determine whether any of them have enough of the skills to skip those subjects and do enrichment instead, and to determine which of them don't have the skills to handle the new material anyway. They call this formative assessment. They might also give tests that don't count, on material students saw in the past, to break them into groups according to what they know. They call that placement.

These are all seen as separate concepts.

But they don't give students homework that they grade. And they don't give pop quizzes that they grade.

No common sense-y.

The schools have separated the incentive to learn the material from their assessment.

That's really astoundingly stupid.

It's as if the schools forgot that gee, the reason we gave homework and pop quizzes was to *encourage students to actually practice and therefore learn* the material. And lo, it had the benefit that if all of the kids did poorly, it meant the teacher hadn't taught it properly, and needed to do something different.

Course correction by the teacher can occur immediately if you poll the class immediately. And course correction by the student can occur immediately if you poll the student immediately.

These are truths so obvious I bet most parents have no idea that this doesn't happen.

But no, we've moved beyond silly homework and pop quizzes. Kids need to learn to be self-directed. Teachers have material to get through! Damn the torpedoes! Full speed ahead!

The other part that sticks out is how these assessments are all separate ideas to solve separate goals, when giving the kids homework and quizzes and remediation as needed solves the same problems that summative and formative assessments are needed for.

Again: a linear progression of material over the year, where each unit builds on the last, along with homework, pop quizzes, and tests to determine immediately whether yesterday's material was understood by the class before moving on.

Common Sense-y.

Ohio City, Idaho



Buy it here. (Yahoo Answers explains.)

pick one

Class size reduction or rapid formative assessment?: A comparison of cost-effectiveness

Stuart S. Yeh, University of Minnesota, Educational Policy and Administration, 86 Pleasant Street S.E., Minneapolis, MN 55455, United States. Received 18 October 2007; revised 26 June 2008; accepted 25 September 2008. Available online 2 October 2008.

Educational Research Review Volume 4, Issue 1, 2009, Pages 7-15

Abstract 

The cost-effectiveness of class size reduction (CSR) was compared with the cost-effectiveness of rapid formative assessment, a promising alternative for raising student achievement. Drawing upon existing meta-analyses of the effects of student–teacher ratio, evaluations of CSR in Tennessee, California, and Wisconsin, and RAND cost estimates, CSR was found to be 124 times less cost effective [emphasis added] than the implementation of systems that rapidly assess student progress in math and reading two to five times per week. Analysis of the results from California and Wisconsin suggest that the relative effectiveness of rapid formative assessment may be substantially underestimated. Further research regarding class size reduction is unlikely to be fruitful, and attention should be turned to rapid formative assessment and other more promising alternatives.
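To make a "124 times" figure concrete: a cost-effectiveness comparison of this kind typically divides each intervention's achievement effect (in standard deviations) by its cost per student, then takes the ratio of the two results. Below is a minimal sketch of that arithmetic; the numbers are illustrative placeholders I made up, not Yeh's actual estimates, which come from the meta-analyses and RAND cost data cited in the abstract.

# Illustrative sketch of an effectiveness-per-dollar comparison.
# All numbers are made-up placeholders, NOT values from Yeh (2009).

def effectiveness_cost_ratio(effect_size_sd, cost_per_student):
    """Achievement gain (in standard deviations) per dollar spent per student."""
    return effect_size_sd / cost_per_student

# Hypothetical class size reduction: small gain at a high per-student cost.
csr = effectiveness_cost_ratio(effect_size_sd=0.20, cost_per_student=1500.0)

# Hypothetical rapid formative assessment system: modest gain at a low cost.
rfa = effectiveness_cost_ratio(effect_size_sd=0.30, cost_per_student=15.0)

print(f"Rapid formative assessment is {rfa / csr:.0f}x more cost-effective (illustrative).")

The only point of the sketch is the structure of the comparison (effect per dollar, then a ratio of ratios); the paper's "124 times" comes from its own effect and cost estimates, not from anything like these placeholder figures.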

"the persistence of bad industry practices"

in the September Atlantic:

ALMOST TWO YEARS ago, my father was killed by a hospital-borne infection in the intensive-care unit of a well-regarded nonprofit hospital in New York City. Dad had just turned 83, and he had a variety of the ailments common to men of his age. But he was still working on the day he walked into the hospital with pneumonia. Within 36 hours, he had developed sepsis. Over the next five weeks in the ICU, a wave of secondary infections, also acquired in the hospital, overwhelmed his defenses. My dad became a statistic—merely one of the roughly 100,000 Americans whose deaths are caused or influenced by infections picked up in hospitals. One hundred thousand deaths: more than double the number of people killed in car crashes, five times the number killed in homicides, 20 times the total number of our armed forces killed in Iraq and Afghanistan. Another victim in a building American tragedy.

About a week after my father’s death, The New Yorker ran an article by Atul Gawande profiling the efforts of Dr. Peter Pronovost to reduce the incidence of fatal hospital-borne infections. Pronovost’s solution? A simple checklist of ICU protocols governing physician hand-washing and other basic sterilization procedures. Hospitals implementing Pronovost’s checklist had enjoyed almost instantaneous success, reducing hospital-infection rates by two-thirds within the first three months of its adoption. But many physicians rejected the checklist as an unnecessary and belittling bureaucratic intrusion, and many hospital executives were reluctant to push it on them. The story chronicled Pronovost’s travels around the country as he struggled to persuade hospitals to embrace his reform.

It was a heroic story, but to me, it was also deeply unsettling. How was it possible that Pronovost needed to beg hospitals to adopt an essentially cost-free idea that saved so many lives? Here’s an industry that loudly protests the high cost of liability insurance and the injustice of our tort system and yet needs extensive lobbying to embrace a simple technique to save up to 100,000 people.

by David Goldhill
Atlantic Monthly September 2009

The subject of formative assessment came up in the Comments thread for Allison's post about middle school math pre-tests.*

Sitting here in Evanston Hospital, keeping watch over my mom and reading the National Reading Panel Reports of the Subgroups in the lulls between crises,** I had a blinding flash of recognition when I read the passage above.

My blinding flash was not about hospital sterilization procedures and physician hand-washing, however. No. My blinding flash of recognition was all about formative assessment (aka collect & correct).***

Ed and I have now been lobbying our school district to use formative assessment (pdf file) ("assessment for learning") for four years. Four years!

Moreover, the district has been paying formative assessment consultants to teach teachers how to use formative assessment for at least two years.

But do we have formative assessment?

No, we do not.

Not only do we not have formative assessment, we do have administrators sending FOILable emails to parents in which they openly refuse to provide formative assessment when directly asked to do so.

Formative assessment at its simplest means: find out what the kids learned, then reteach the stuff they didn't learn.

Can't get it; district won't do it. Or doesn't do it. Either one. I don't know any district that does.

And that is "deeply unsettling," as Goldhill says. How is it possible that parents have to beg schools to provide an essentially cost-free idea that stands a chance of getting kids to college ready to do college work?


update 1:49 Chicago:




* [news flash: at the moment, I can't log on to the old ktm site, which had a number of good posts about formative assessment. Usually when this happens the site eventually becomes available again, but we'll see. I really need to get it backed up, finally. Here's the URL for the search page: http://www.kitchentablemath.net/twiki/bin/view/Kitchen/CommentsSearch]
** Do you know the meaning of the words "troponin leakage"? I do.
*** It's possible I have been writing an education blog for too long.

in Evanston

I'm in Evanston, still. My mom fell the night before we left for vacation, and has been in the hospital ever since. So have I. 

¿Dónde está Nevada?



Debating How Much Weed Killer Is Safe in Your Water Glass
Published: August 22, 2009


Also Wisconsin.

Where is Wisconsin these days?