kitchen table math, the sequel: 10/30/11 - 11/6/11

Saturday, November 5, 2011

Glen on grammar teaching in the schools

Responding to the suggestion that native speakers should have no trouble with the grammar section on the SAT, Glen writes:
[T]here are native speakers and there are native speakers. My second grade son lost a point on an English assignment today for writing, "Anna gave my sister and me the book." The teacher crossed out "me" and replaced it with "I," explaining that the proper expression was, "my sister and I."
My son complained to me that, in his opinion, "gave me" was correct, not "gave I," so "gave my sister and me" should have been right. I congratulated him for his correct analysis, but told him not to mention it to his teacher, because nothing good would come of it.
She's a middle class, educated, native English speaker yet she, like most of us, could have benefited from some explicit grammar training. Her students would have benefited, too.
And, later, responding to the possibility that "prescriptivist" grammar instruction is responsible for sentences like the teacher's Anna sentence above:
Explicit, prescriptive grammar training doesn't cause bad grammar. Bad grammar by overcorrection is caused by untrained attempts to sound trained. While there may, of course, be limited examples of "a little learning is a dangerous thing" in grammar, the cure is mo' learnin', not no learnin'.
I'm with Glen. I don't see how we can blame prescriptive grammar classes for constructions like "President Obama and myself are concerned..." (which I'm pretty sure I saw Arne Duncan use in reference to public education). Public schools stopped teaching grammar "in isolation" decades ago, not too long after the 1963 Braddock report:
In view of the widespread agreement of research studies based upon many types of students and teachers, the conclusion can be stated in strong and unqualified terms: the teaching of formal grammar has a negligible or, because it usually displaces some instruction and practice in actual composition, even a harmful effect on the improvement of writing (p. 37).
Braddock, R., Lloyd-Jones, R., & Schoer, L. (1963). Research in written composition. Urbana, Il: National Council of Teachers of English.
The Braddock report was followed in 1986 by the Hillocks report:
None of the studies reviewed for the present report provides any support for teaching grammar as a means of improving composition skills. If schools insist upon teaching the identification of parts of speech, the parsing or diagraming of sentences, or other concepts of traditional grammar (as many still do), they cannot defend it as a means of improving the quality of writing (Hillocks, 1986).
Although Hillocks and Smith claim that by as late as 1986 "many" schools were still teaching the parsing or diagraming of sentences, that is certainly not my experience, nor is it the experience of anyone else I know. Parts of speech, yes; parts of sentences, no. And as far as I'm concerned, the grammar of writing is the grammar of the sentence, full stop. If all you're teaching is noun, pronoun, adjective, and adverb, you're not teaching grammar, prescriptively or otherwise.

The Braddock and Hillocks passages are quoted liberally by education school professors and college composition instructors alike. There is a near-universal belief that teaching grammar "in isolation" is useless or bad or both, and today's K-12 teachers would have themselves been taught by teachers who held this view.

Related: The other night I asked a professor of English at one of the Ivies whether he knows grammar. I was curious.

He doesn't. He said that at some point -- the 1980s, possibly -- English departments had required graduate students to take courses in linguistics. Then that came to an end, and today English professors know literature but they don't know grammar or linguistics.

not dead yet: the case of whom

Ed, C., and I all three just got this question wrong:
Is Charles the student who / whom you know wants to go to the University of Chicago?
(Well, C. and I got it wrong. Ed is resisting the verdict.)

On the other hand, my friend Robyne, an attorney, got it right at once. (Got to, got to, got to take a legal writing course. pdf file)

Related: how to find the verb(s) in a sentence the way a linguist might find the verb(s) in a sentence:*
1. Change the tense of the sentence from present to past, or from past to present.
2. The word or words that change form (spelling) are the verb or verbs.
Traditional grammar books call the verb-like critters that don't have to change form when you move from present to past or past to present verbals, not verbs.
PRESENT TENSE: Hiding in the bushes, the cat watches its prey.
PRESENT TENSE: Hidden in the bushes, the cat watches its prey.
PAST TENSE: Hiding in the bushes, the cat watched its prey.
PAST TENSE: Hidden in the bushes, the cat watched its prey.
Watches and watched are the verbs; hiding and hidden are the verbals.

fyi: Linguists don't seem to use the term verbals, but I don't know what term they do use, if any.
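Just to make the tense-change test concrete, here's a toy sketch of it in Python -- my own illustration, not a tool linguists actually use: feed it the present-tense and past-tense versions of a sentence, and it reports the words that change form.

# A toy sketch of the tense-change test above. It lines up the present-tense
# and past-tense versions of a sentence word by word and reports the words
# that change form (the verbs). It only works when the two versions differ
# in nothing but the verb forms, as in the cat examples above.

def find_verbs(present: str, past: str) -> list[tuple[str, str]]:
    """Return (present_form, past_form) pairs for the words that change."""
    present_words = present.lower().rstrip(".").split()
    past_words = past.lower().rstrip(".").split()
    return [(now, then) for now, then in zip(present_words, past_words) if now != then]

print(find_verbs("Hiding in the bushes, the cat watches its prey.",
                 "Hiding in the bushes, the cat watched its prey."))
# [('watches', 'watched')] -- watches/watched is the verb; hiding never
# changes form, so by this test it's a verbal, not a verb.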

and see:

THE COMING DEATH OF WHOM: PHOTO EVIDENCE


Meanwhile Mark Liberman says whom has been dead for a century.

From which I conclude that the word dead means something different to a linguist than it does to me. To me, a dead word is a word like ..... thee. Or thou. Or canst. Ere, oft, 'tis -- now those are dead words.

But whom is still with us. It is neither alive nor well, but it is definitely still kicking. Whom is a word no one has a clue how to use; we're all just closing our eyes and taking a flying leap when we pick it out of the Great Word Cloud inside our heads. Maybe linguists need a special category for words people no longer know how to use but are still using anyway.

(Maybe linguists already have such a category? Maybe the term for such words is dead?)

How about zombie words?

If you tell me whom has been a zombie word for a century, that I believe.

*I don't know how linguists find verbs in sentences. What I do know, what I have discovered over the past year, is that traditional grammar books are frequently wrong, or at least semi-wrong: traditional grammar books are wrong enough to be confusing and of little help when you're trying to tell an 18-year-old how to find a verb in a sentence.

Andrew Gelman wants to know what to cut from h.s. math

Deadwood in the math curriculum

He doesn't seem to know about any of the changes to K-8 math in the past 20 years, and the blogger he quotes, Mark Palko, explicitly rejects the math wars as a "clever narrative" that does not explain high school math curricula.

It's tough fighting a math war nobody knows about.

In any event, some of you should probably weigh in on which topics to include in a sound high school math curriculum.

Thursday, November 3, 2011

Heat, but no light?

Due to TABOR, our elections were held this past Tuesday instead of next week. The three candidates I supported in the school board election went down. (BTW, each of the winners received a significant amount of their campaign contributions from teachers' unions!) Three weeks ago, my neighbor held a community forum with some of the candidates. A little wine, cheese, and school-topics chat with the neighbors.

I asked the candidates questions about the Common Core, school funding, and Educator Effectiveness plans in Colorado. I was in the minority. Nobody seems to know what goes on in a school these days. The main question many wanted answered had to do with the district start date. Not all schools in my district are air-conditioned, and the kids get really hot in August. Really hot. Boy, those little sweetpeas get hot. For about two weeks, it can be over 80 degrees in some classrooms. So yeah, their kids come home sweaty.

Of course, start dates are driven by many things, one of which is the mandated test dates in March and April. What school district chooses to start after Labor Day when the rest of the state starts in mid-August? Another start-date driver is the desire to end the first semester before Christmas break, which allows us to get out of school by Memorial Day, which in turn lets high school students get summer jobs, take community college courses over the summer, etc.

Back to the whine and cheese forum...
Since it's hot, the classrooms run fans. The fans make noise and it's hard to hear the teacher, so one candidate described having seen a teacher using a microphone (you know, like Britney Spears) in a classroom with speakers in the ceiling. A school I've worked at in the past used such a system to accommodate hearing-impaired students. "Oooh," said the parent who had complained about the noisy fans, "I don't want to take away from money that might be spent on technology and smartboards in the classroom for that."

So I guess, to many parents' minds, a smartboard trumps the ability to hear a teacher.

When I start to worry about education, I am grateful for our charter school. It's not perfect and we do get complaints, but none of them are about the heat.



Tuesday, November 1, 2011

math professor's experience coaching the SAT

The Effectiveness of SAT Coaching on Math SAT Scores
(pdf file) by Jack Kaplan
Chance Magazine | Volume 18, Number 2, 2005

Comment: Jack Kaplan's "A New Study of SAT Coaching" (pdf file)
By Derek C. Briggs

Here's the Fairtest write-up of Kaplan's coaching results:
Kaplan's course includes 20 hours of classroom instruction and seven to 12 hours of individual tutoring. Of the 34 previously uncoached students he worked with over six summers, 21 increased their Math scores by more than 80 points from a baseline SAT administration with 11 going up by 100 points or more. Average score increases were higher for students with lower initial scores. In a control group Kaplan studied, average scores increased by only 15 points during the same period. Thus, he concludes that "the estimated coaching effect was 63 points" on the Math component of the SAT.
Kaplan was coaching the SAT math test being given prior to the 2006 changes.

the writing test and the math test

Chemprof (and others, I'm sure) pointed out in Comments that math/science professors value the SAT math test for the same reason I value the SAT writing test: both exams test standard mistakes that college students make.

Btw -- this is something I haven't gotten around to putting inside a post -- when I mention "the main errors student writers make," I'm referring to the Connor and Lunsford list of errors compiled in 1988, which is pretty close to the SAT list.

The Connor-Lunsford list is close to the SAT list except for the fact that Connor and Lunsford did not see ginormous numbers of parallel structure problems in the student papers they read, apparently. I find that hard to fathom. I personally do see ginormous numbers of non-parallel structures in the student writing that comes my way.

Faulty comparison, tested on the SAT, does not make the Connor-Lunsford list, either. (I'm not surprised by that.)

In any event, while musing about chemprof's observation (which I agree with, btw), an essential difference between the two tests, one that I hadn't focused on, suddenly leapt out at me: where SAT Math tests content and procedures students have been seeing in school for years,* SAT Writing tests content students have never seen or even heard tell of unless their Spanish teacher happened to explain what a gerund is in Spanish class.

(I use that example because I asked C. this week whether he knew what a gerund was, and he said he did because he'd learned it in Spanish. I myself had no idea what a gerund was until this semester. Public schools don't teach formal grammar today and haven't taught formal grammar in decades.)

So....when you think about it....isn't the Writing Test a bit of an odd concept?

Students have never been taught grammar, and now they're being tested on grammar?

And why would I be in favor of testing students on content the schools don't teach?

Now I'm thinking: well, maybe I'm not!

Mulling over chemprof's comment, I realize that what I value about the writing test is almost exclusively the test prep kids do for the writing test. The fact of the writing test, the fact that the writing test exists and students have to take it, gives parents an excuse to insist their kids learn some formal grammar before they graduate high school.

And that's pretty much it; that's what I value about the test.

So, since high scores on writing come entirely from test prep (at least in my experience), what does a high score on the writing test actually mean? Does a high score on the writing section tell us anything about the student's writing?

I don't know the answer to that, and I don't have a good guess.

Basically, I think it's a good thing for a student to recognize a comma splice in an SAT sentence regardless of whether he recognizes a comma splice in his own writing, and effective SAT prep can make that happen. This is a statement of value: I value knowledge of comma splices, and I want my kid to possess it.

* Most of the content anyway. That's a subject for another post: these days the SAT features counting problems, and students taking traditional algebra classes don't seem to have counting "units" in their courses (although chapters on counting units are included in traditional texts). Ditto for the algebra 2 material on the SAT if a student has not taken algebra 2.

Monday, October 31, 2011

SAT math is tricky at all score levels

re: the a in f(x) = a - x² (question below), which appears in the College Board's Online SAT Test 5 and is designated "medium difficulty"

I should have made clear in the post I wrote about the post Debbie wrote* that this question is not "tricky" for me. This question is easy for me, and the fact that this question is easy for me tells you nothing about whether I have mastery of quadratic equations and their graphical representations. At this point, I do not.

This question is easy for me because I have some basic understanding of shifts, because I have memorized the rules about shifts listed in all of the SAT test prep books (and most notably in Phillip Keller's book), and because I see the intended trick of the question the minute I look at it.
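For anyone who wants the shift reasoning spelled out, here's a quick sketch of it -- mine, not the College Board's, and since the actual item isn't reproduced here it only illustrates the general rule:

% My own sketch of the shift reasoning, not the College Board's solution;
% the actual item isn't reproduced here, so this only shows the general rule.
\begin{align*}
  f(x) &= a - x^{2} = -x^{2} + a
    &&\text{the graph of } y = -x^{2}\text{, shifted up by } a \text{ units}\\
  f(0) &= a
    &&\text{so } a \text{ is the } y\text{-intercept and the maximum value of the graph}
\end{align*}
% The trap: in ax^2 + bx + c the letter a names the coefficient of x^2,
% but here the coefficient of x^2 is -1; this a only slides the parabola up or down.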

The fact that this question is easy for me probably does tell you I am pushing 700 on SAT math, which I am. I scored 680 on the real test, putting me at the 90th percentile for all test-takers; on sample sections my range is well into the 700s. I don't know the exact percentage of test-takers in the 90th percentile who get this question right and find it easy, but it's going to be very high. The top 10% of test takers is the group for whom this question is not tricky.

This question is tricky for test-takers scoring in the 500s, and the College Board knows it. They tell us they know it; they're not keeping it a secret. When the College Board assigns a "Medium" level of difficulty to an item, they are telling us that a predictable percentage of kids scoring in the 500s will reliably get the question wrong, and a predictable percentage will reliably get it right. That is the meaning of the words "medium level difficulty."

Kids scoring in the 500s are the ones you can depend upon to see the a in f(x) = a - x² and think, in a certain percentage of cases, a-the-coefficient-of-x²-in-ax²+bx+c=0. Those are the kids getting tripped up by this question.

Test-takers scoring in the mid-500s do not have mastery of quadratic equations and their graphical representations, and neither they nor anyone else is claiming they do. So when a designated percentage of kids scoring in the 500s get this question wrong, we learn nothing about them we did not already know. By the same token, when a designated percentage of kids scoring in the 500s get this question right, we also learn nothing about them. As a general rule, 500-scoring kids do not have mastery of quadratic equations and their graphical representations. I think that is a safe assumption to make.

Ditto for me. When I easily get this question right -- and, again, I stress the word easily -- no one knows anything about my level of mastery of quadratic equations and their graphical representations. I do not have mastery at present, yet for me this question is so easy it's a gimme. In Steve H's words,  I have (near)-mastery of the test, not the math.

So why is this question -- or, rather, questions just like it**-- on the test?

Questions like this are on the test because they have appeared in the experimental sections of previously-administered SAT tests and have been found to reliably produce a certain level of error. The College Board needs test items that produce reliable levels of test-taker error in order to keep raw scores stable from test to test. They can't have everyone getting all the questions right; they can't have everyone getting all the questions wrong; and, in the big, mooshy middle, they need items that reliably produce a certain percentage of right and wrong answers.

Hence the a-question.
* this is the mouse that lived in the house...
** I don't know whether items in Online Course have been tested in Experimental sections. As I understand it, all items in the tests actually administered to real test-takers have indeed been tested in experimental sections taken by other test-takers.

Sunday, October 30, 2011

the essay

C. was talking to his friend J. about their writing scores on the SAT.

After he hung up, he said, "I can't believe J. wrote the stupidest essay I've ever heard of and got a 9."

J.'s essay was about his family having to steal food in order to eat. Never happened.

"How come he wrote that?" I said.

"He spent too much time on the introduction and couldn't think of anything else to say."

Last year my neighbor's son got an 8 on an essay about why the essay question was stupid.

milestone

C. is applying to college today.

Assuming our electricity doesn't go out, that is. A small electrical fire from a fallen wire has been burning in the road next to our side yard since last night.

He wrote his what-I-did-with-my-summer-vacation answer on doing SAT math with his mom every day.

Susan S on her son's experience with competition math

re: the question of whether middle school competition math prepares students for ACT math:
I only have an anecdotal story concerning math contests and the ACT. When my son was taking the ACT in middle school at the beginning of the year, I hired a genius kid to tutor him on some of the things he had never experienced, like algebra 2 and some trig stuff. The tutor thought he would break 30, but he actually was in the middle 20s. Still not bad for a 13 yr. old.

As the year went on he made the Mathcounts team and had to practice the problems every week for a few months. He usually only finished half of them since we couldn't really help him. I signed him up for the actual Midwest Talent Search at the end of the year, but I didn't prep him this time or hire a tutor. His math score jumped to a 29, which placed him in the 99th percentile of the MTS kids and earned him a high-scorer medal from Northwestern.

I have no idea if that means anything, but I thought it was interesting. He was only in accelerated algebra 1 at the time. Competition math just may help out in some way. It was the only thing I could think of at the time to explain the jump in scores.
If I had it to do over again, I would have had C. doing middle school competition math problems from the get-go -- and not just because SAT math is so closely related to middle school competition math.

C. and I both learned a tremendous amount prepping for SAT math. We learned a tremendous amount about algebra 1, geometry, and arithmetic, and we learned a tremendous amount about what we didn't know we didn't know.

puzzle math - from the comments

in an earlier Comments thread about SAT Math, gasstationwithoutpumps writes:
My son did no prep for the SAT when he took it in 6th grade and got a 720 on the math part. I'm quite sure that he'll do better on it when he takes it as a high schooler next year without doing any SAT prep. The math questions just aren't all that hard.
to which SteveH responds:  
He had all of the material on the SAT by 6th grade? You are using this as normal?
gswp's reply:
No, he had not had all the SAT math. He'd had a very feeble algebra series in school (The Key to Algebra) and had self-taught geometry from a good book (he'd gotten through about half of Art of Problem Solving Geometry). The SAT supposedly tests much more than he had had at that point, but the questions are more like simple puzzles than testing content knowledge.
Gswp is exactly right! Setting aside the issue of the SAT using core features of cognitive function to produce error, which it does, SAT math is middle school puzzle math.