kitchen table math, the sequel

Thursday, November 10, 2011

“Fair Test” Procedures for SAT Day


After my terrible SAT experience last Saturday, I decided to look into whether or not any official rules had been broken.
Turns out there is an official SAT rule guide, The SAT Standard Testing Room Manual, which I think is worth reading before you take an SAT (especially Section A, which is only 11 pages long).
From the first paragraph:
"The SAT Program has established policies and procedures to ensure that all students can test under a uniform set of conditions .... All students are to be protected from disturbance. By strictly following our policies and procedures, you give students the best guarantee of fair testing."
At the time, I felt intimidated to say something to the proctor because I wasn't sure if official "rules" were broken, or whether they were "courtesies" he was forgoing.
And if I had trouble speaking up, I'd imagine it would be even more difficult for a teenager to do so -- especially if he or she isn't even sure about the official rules.
I did speak to the proctor at the first break and told him that lopping off five minutes of our time midway through a Reading Section really threw me. He responded by saying, "it was the lesser of two evils," which did not leave me inclined to speak up again when the noise from other kids who had finished the test in the same gym grew so loud that it echoed through our last four sections.
Turns out this proctor was wrong. It was not "the lesser of two evils" to cut off five minutes of our time, mid-section. In fact there is an official rule in the manual for this exact situation: "Overtiming: Make no adjustment."
That was just the beginning of the broken rules last Saturday.....
1) The "Visible Clock" Rule:
I have experienced this "visible clock" issue a few times over the course of the 6 SATs I've taken this year (5 different locations). But, "lack of visibility" last Saturday was the least of my problems.
Start with the fact that the proctor inexplicably wrote the time down in the middle of the Essay Section (after telling us before we started that he had no chalk to do so) -- but he didn't write it in our time zone, because, as he later explained when I asked, the (non-visible) clock turned out not to be in our time zone.
Fine, except that it confused me to see "a time" (but not our time) suddenly appear on the blackboard without explanation.
Also, there were no "regular" time warnings, as mentioned above in the manual -- I'd say they were more sporadic in nature (i.e. "2 minutes," or nothing at all....)

2) Desk Size (Avoid having a "deskette" experience):
To be fair, my deskette last Saturday probably did meet this "official standard" -- but honestly, as a test taker, that's too small for an optimal SAT experience. 12" by 15" holds ONE 8.5 x 11 test booklet -- except that there are TWO booklets that need holding when you take the SAT (plus your calculator for math sections, and pencils).
Lack of proper desk space adds a juggling variable to the SAT experience that is distracting, time-consuming, stressful, and noisy. Try to find an SAT location with full desks.

3) Adult Test Takers:
I've experienced "assigned seating" once out of 6 SATs, and the fact of the matter is that I was assigned the front and center seat.  Not sure if that was a coincidence.

4) Timing and Breaks:
I believe this rule was carefully followed at every other SAT that I took this year, which is how I ended up lulled into complacency last Saturday. I had grown to expect this rule to be followed, and when it wasn't (starting in Section 3), I was thrown for such a loop I had trouble recovering. Or maybe I was thrown off when the time mysteriously appeared on the board in a different time zone. I don't know. Either way, this "Timing Policy" wasn't followed and it affected me.

5) Reporting Irregularities:
I have no idea whether or not our proctor reported the "timing irregularities" that day.

6) Student Complaints: 
I'm not "a student," but I did have many of these same complaints.
I could go on with these screenshots of broken rules from last weekend, but instead I'll reiterate that any SAT test taker should read pages 1-11 of The SAT Standard Testing Room Manual before test day.

Cross Posted on Perfect Score Project


Wednesday, November 9, 2011

Does anyone want a STEM career anymore?

As the self-described 99% show the country what a wasteland a liberal arts education is, the current administration says STEM careers will transform (or is it save?) America. But there have been a number of articles in the last few days about why American students today aren't choosing STEM careers.
In today's WSJ, Generation Jobless: Students Pick Easier Majors Despite Less Pay, the article opens with this:
Biyan Zhou wanted to major in engineering. Her mother and her academic adviser also wanted her to major in it, given the apparent career opportunities for engineers in a tough job market. But during her sophomore year at Carnegie Mellon University, Ms. Zhou switched her major from electrical and computer engineering to a double major in psychology and policy management. Workers who majored in psychology have median earnings that are $38,000 below those of computer engineering majors, according to an analysis of U.S. Census data by Georgetown University.

"My ability level was just not there," says Ms. Zhou of her decision. She now plans to look for jobs in public relations or human resources.


The NYT had this article, "Why Science Majors Change Their Minds (It’s Just So Darn Hard)" (pointed out by Glen a few days ago).
It states:
Studies have found that roughly 40 percent of students planning engineering and science majors end up switching to other subjects or failing to get any degree. That increases to as much as 60 percent when pre-medical students, who typically have the strongest SAT scores and high school science preparation, are included, according to new data from the University of California at Los Angeles. That is twice the combined attrition rate of all other majors.


Why the attrition? Some are the usual suspects: from the WSJ piece again:

For 22-year-old Ms. Zhou, from Miami, the last straw was a project for one of her second-year courses that kept her and her partner in the lab well past midnight for several days. Their task was to program a soda machine. Though she and her partner managed to make it dispense the right items, they couldn't get it to give the correct change.


Such unpreparedness in part explains this (from the NYT piece): "Ben Ost, a doctoral student at Cornell, found in a similar study that STEM students are both “pulled away” by high grades in their courses in other fields and “pushed out” by lower grades in their majors." But so does the burnout factor from the death march through calculus, as illustrated by Matthew Moniz, who "bailed out of engineering at Notre Dame in the fall of his sophomore year.... He had scored an 800 in math on the SAT and in the 700s in both reading and writing. He also had taken Calculus BC and five other Advanced Placement courses at a prep school in Washington, D.C., and had long planned to major in engineering.... But as Mr. Moniz sat in his mechanics class in 2009, he realized he had already had enough. “I was trying to memorize equations, and engineering’s all about the application, which they really didn’t teach too well,” he says. “It was just like, ‘Do these practice problems, then you’re on your own.’”"

They quote Mitchell J. Chang, an education professor at U.C.L.A. who says it isn't just weak K-12 prep that causes this washout.
"You’d like to think that since these institutions are getting the best students, the students who go there would have the best chances to succeed,” he says. “But if you take two students who have the same high school grade-point average and SAT scores, and you put one in a highly selective school like Berkeley and the other in a school with lower average scores like Cal State, that Berkeley student is at least 13 percent less likely than the one at Cal State to finish a STEM degree.”


His argument seems to be that the kids at Cal are better prepared than the kids at CSU, so more of them should succeed if the issue were really K-12 prep. I don't think that gets to the heart of the prep matter, though. The kids at Cal, Notre Dame, and the like are Used to Succeeding, and they aren't succeeding. This is a huge blow to them, at the same time that the intro courses are often seen as the drudgework to get through before the electives, a point made in the NYT article. It feels better to get As in psych than B-s in EE.

Continuing,
Some new students do not have a good feel for how deeply technical engineering is. Other bright students may have breezed through high school without developing disciplined habits. By contrast, students in China and India focus relentlessly on math and science from an early age.

“We’re in a worldwide competition, and we’ve got to retain as many of our students as we can,” Dean Kilpatrick says. “But we’re not doing kids a favor if we’re not teaching them good life and study skills.”


So many fall off. And what about the ones who make it?

You work harder for lower grades than your peers, and the payoff is either a) a career path where your employer is constantly lobbying the govt to drive down your pay by increasing immigration, or b) a career path where you front load all of your risk onto a low probability lottery ticket to the world of academia just as the higher ed bubble is bursting.

Not depressing enough? Read Arnold Kling's "What If Middle-Class Jobs Disappear?" Kling suggests high unemployment now is structural, coming from a new phase of an economic transition away from plentiful high paying white collar jobs, just as prior restructuring moved away from plentiful high paying blue collar jobs.
Using the latest Census Bureau data, Matthew Slaughter found that from 2000 to 2010 the real earnings of college graduates (with no advanced degree) fell by more in percentage terms than the earnings of high school graduates. In fact, over this period the only education category to show an increase in earnings was those with advanced degrees.
The outlook for mid-skill jobs would not appear to be bright. Communication technology and computer intelligence continue to improve, putting more occupations at risk.

For example, many people earn a living as drivers, including trucks and taxicabs. However, the age of driver-less vehicles appears to be moving closer.

Another example is in the field of education. In the fall of 2011, an experiment with an online course in artificial intelligence conducted by two Stanford professors drew tens of thousands of registrants. This increases the student-teacher ratio by a factor of close to a thousand. Imagine the number of teaching jobs that might be eliminated if this could be done for calculus, economics, chemistry, and so on.


So much for that lottery ticket to academia...but what about the private sector? Kling says:
...the main work consists of destroying someone else's job. Garett Jones has pointed out that the typical worker today does not produce widgets but instead builds organizational capital. The problem is that building organizational capital in one company serves to depreciate the organizational capital somewhere else. Blockbuster video adversely affected the capital of movie theaters, Netflix adversely affected the capital of Blockbuster, and the combination of faster Internet speeds and tablet devices may depreciate the organizational capital of Netflix.

The second challenge is the nature of the emerging skills mismatch. People who are self-directed and cognitively capable can keep adding to their advantages. People who lack those traits cannot simply be exhorted into obtaining them. The new jobs that emerge may not produce a middle class. Instead, if the trend documented by Autor for the period 1999-2007 were to continue, most of the new jobs would be low-end service jobs, for which competition will tend to keep wages low.


He goes on to posit some possible futures. He's not optimistic.

update: Kling link fixed. Thanks, ChemProf!

Tuesday, November 8, 2011

the other Chinese student

http://www.chinahush.com/2011/10/25/i-fought-for-18-years-to-have-a-cup-of-coffee-with-you/ (translation)

Here’s a question I pose for my white collar friends [in Shanghai]: what if I never graduated from middle school, and had become a migrant worker? Would you sit down for a cup of coffee with me at Starbucks? The answer, unequivocally, is that you wouldn’t.

As you might know, college students from China's big cities (Shanghai, Beijing, Guangzhou, Wuhan...) apply to American colleges in a flood of high scores and intimidating talent, even socially. The quintessential undergraduate Chinese student I might meet at the University of Virginia might, say, play Chopin and Debussy, have read Jane Austen's Northanger Abbey, eat twenty-dollar sushi dinners daily, and in general spend money outrageously, yet he or she will be well-read in the ancient revered Four Classics as well as Harry Potter.

There are those who dress fobbishly, but there are many who dress with both a kind-of-unique yet canned style borrowed from the latest trends in Shanghai or Tokyo or something. Personally, the kids from Wuhan come closest to my hipster sense of aesthetics, but the kids from Shanghai are far too materialistic, accentuate their class differences, and generally snub me. (And in my experience, Chinese parents from Shanghai are the prim-and-proper quantitative finance type who would only pursue an art for its social prestige and not for its own sake. But I am only stereotyping to represent the breadth and diversity of applicants.)

Yet despite all this progress, 900 million people in the rural provinces have been left in the dust. The opposite of affirmative action exists in China: not only are the rural primary and secondary schools of horrible quality, constantly understaffed and undersupplied, and rural students less likely to afford the piano lessons and the tutoring and art lessons that a kid in Shanghai might enjoy throughout his childhood -- when applying to college, a kid who somehow comes out on par with those in the cities will still face discrimination on the sole fact that he comes from the provinces. Imagine that if China used the SAT, a promising kid from the provinces would need a score of 2250 to be placed over an average city resident with a score of 1850, despite the fact that the provinces' own averages are much lower.

In such a case, I wouldn't mind if our financial aid system granted some of these exceptional rural students a chance to enter the University, perhaps replacing just a handful (out of dozens) of the somewhat-conceited Shanghai students that the University might admit each year (yes just for Shanghai only, out of the thousands of students from China that apply to our school). Students from Shanghai, if rejected from our University, can go to some other college to make their future; the bright rural students that can't get admitted into college in their own country are stuck with nowhere else to go. Such a move would be Jeffersonian, after all.

Monday, November 7, 2011

the reluctant Machiavellian teacher

http://www.disciplinehelp.com/teacher/

117 behaviours, with some pretty interesting analyses of each (registration, which is free, is required to read all of them) and how to remedy them. It seems pretty useful for newly-recruited TFA teachers and the like, shoved into a classroom and having to deal with a few problem students who ruin it for everyone else eager to learn. The site breaks down each behaviour by "causes" or "needs" and suggests effective remedies, and mistakes to avoid. In short, it suggests ways for a teacher to quickly gain control of a problematic situation, but in a subtle manner, without being unprofessional.

Teachers (or the new, talented ones at least) didn't join the profession to be Machiavellian, but perhaps for new teachers in some low-income districts it is a necessary evil in order to do the sort of thing they joined the corps for -- infusing their students with passion and all of that. And many of the tips are rather insightful -- there are apparently promising ways to make even a troublemaker a really productive student.

(-- and of course, I'm still a hopeful undergrad TFA applicant)

Sunday, November 6, 2011

wrong turn

from the Instructor's Notes to The Everyday Writer by Andrea Lunsford, Alyssa O'Brien, and Lisa Dresdner:
In his essay “Structure and Form in Non-Narrative Prose,” Richard Larson explains what he sees as the three categories of paragraph theory: paragraphs (1) as expanded sentences, governed by comparable syntactical forces; (2) as self-contained units of writing with their own unique principles; and (3) as parts of the overall discourse, informed by the strategies a writer chooses for the overall piece. 
Reading this passage, my reaction is: Interesting!

And: Help is on its way.

Any one of these theories of the paragraph sounds as if it might be very useful to me in teaching college freshmen how to write a 5-paragraph English paper.

Unfortunately, Larson is not findable on Google, and there's no more to be learned from Lunsford, who buries Larson and his ilk in her next paragraph:
Today, partially as a result of the poststructuralist and feminist critique, scholars are challenging conventional paragraph norms. In Marxism and the Philosophy of Language, V. N. Volosinov notes that “to say that a paragraph is supposed to consist of a complete thought amounts to absolutely nothing.” Beginning with this provocative insight, Kay Halasek’s book A Pedagogy of Possibility [the Amazon results, not the book] shows the ways in which composition textbooks have traditionally taught the paragraph in strictly traditional ways, as unified, coherent, and tightly linear. But Halasek works to redefine the paragraph as dialogic, as a negotiation among writer, audience, subject, and other textual elements that surround it. Most important, Halasek insists, is for instructors of writing to understand that the process of producing “unified,” “cohesive” paragraphs calls for ignoring, erasing, or otherwise smoothing out a diversity of discourses and voices. Thus teaching students to be aware of this process not only illuminates a great deal about how “good” paragraphs get constructed but also introduces them to a philosophy of language that is not based on current traditional positivism or objectivism.
Even if I agreed with the sentiments expressed in this paragraph, which I don't, the words "to say that a paragraph is supposed to consist of a complete thought amounts to absolutely nothing" tells me absolutely nothing about what to do in class next Tuesday.

Then there's this:
Today theorists are questioning the ideologies surrounding practices of quotation. An early critique of such practices appears in the work of Bakhtin and Volosinov, who question the ways in which quotation perpetuates a view of language as the property of a radically unique individual rather than as a set of socially constructed systems.
and this:
Volosinov, Valentin N. “Exposition of the Problem of Reported Speech.” Marxism and the Philosophy of Language. Cambridge: Harvard UP, 1973. Here Volosinov mounts a powerful critique of quotation practices, revealing the ways they are inevitably embedded in ideology.
The Everyday Writer costs $57.99 on Amazon.

headlines you don't want to see

Bridge deemed safe despite low rating
Rivertowns Enterprise Volume 36, Number 32 October 28, 2011

Ed was contemplating this front-page headline over breakfast the other morning.

He says what it means is the bridge is not safe, but they're hoping it doesn't collapse during the daytime.

Saturday, November 5, 2011

Glen on grammar teaching in the schools

Responding to the suggestion that native speakers should have no trouble with the grammar section on the SAT, Glen writes:
[T]here are native speakers and there are native speakers. My second grade son lost a point on an English assignment today for writing, "Anna gave my sister and me the book." The teacher crossed out "me" and replaced it with "I," explaining that the proper expression was, "my sister and I."
My son complained to me that, in his opinion, "gave me" was correct, not "gave I," so "gave my sister and me" should have been right. I congratulated him for his correct analysis, but told him not to mention it to his teacher, because nothing good would come of it.
She's a middle class, educated, native English speaker yet she, like most of us, could have benefited from some explicit grammar training. Her students would have benefited, too.
And, later, responding to the possibility that "prescriptivist" grammar instruction is responsible for sentences like the teacher's Anna sentence above:
Explicit, prescriptive grammar training doesn't cause bad grammar. Bad grammar by overcorrection is caused by untrained attempts to sound trained. While there may, of course, be limited examples of "a little learning is a dangerous thing" in grammar, the cure is mo' learnin', not no learnin'.
I'm with Glen. I don't see how we can blame prescriptive grammar classes for constructions like "President Obama and myself are concerned..." (which I'm pretty sure I saw Arne Duncan use in reference to public education). Public schools stopped teaching grammar "in isolation" decades ago, not too long after the 1963 Braddock report:
In view of the widespread agreement of research studies based upon many types of students and teachers, the conclusion can be stated in strong and unqualified terms: the teaching of formal grammar has a negligible or, because it usually displaces some instruction and practice in actual composition, even a harmful effect on the improvement of writing (p. 37).
Braddock, R., Lloyd-Jones, R., & Schoer, L. (1963). Research in written composition. Urbana, Il: National Council of Teachers of English.
The Braddock report was followed in 1986 by the Hillocks report:
None of the studies reviewed for the present report provides any support for teaching grammar as a means of improving composition skills. If schools insist upon teaching the identification of parts of speech, the parsing or diagraming of sentences, or other concepts of traditional grammar (as many still do), they cannot defend it as a means of improving the quality of writing" (Hillocks, 1986).
Although Hillocks and Smith claim that as late as 1986 "many" schools were still teaching the parsing or diagraming of sentences, that is certainly not my experience, nor is it the experience of anyone else I know. Parts of speech, yes; parts of sentences, no. And as far as I'm concerned, the grammar of writing is the grammar of the sentence, full stop. If all you're teaching is noun, pronoun, adjective, and adverb, you're not teaching grammar, prescriptively or otherwise.

The Braddock and Hillocks passages are quoted liberally by education school professors and college composition instructors alike. There is a near-universal belief that teaching grammar "in isolation" is useless or bad or both, and today's K-12 teachers would have themselves been taught by teachers who held this view.

Related: The other night I asked a professor of English at one of the Ivies whether he knows grammar. I was curious.

He doesn't. He said that at some point -- the 1980s, possibly -- English departments had required graduate students to take courses in linguistics. Then that came to an end, and today English professors know literature but they don't know grammar or linguistics.

not dead yet: the case of whom

Ed, C., and I all three just got this question wrong:
Is Charles the student who / whom you know wants to go to the University of Chicago?
(Well, C. and I got it wrong. Ed is resisting the verdict.)

On the other hand, my friend Robyne, an attorney, got it right at once. (Got to, got to, got to take a legal writing course. pdf file)

Related: how to find the verb(s) in a sentence the way a linguist might find the verb(s) in a sentence:*
1. Change the tense of the sentence from present to past, or from past to present.
2. The word or words that change form (spelling) are the verb or verbs.
Traditional grammar books call the verb-like critters that don't have to change form when you move from present to past or past to present verbals, not verbs.
PRESENT TENSE: Hiding in the bushes, the cat watches its prey.
PRESENT TENSE: Hidden in the bushes, the cat watches its prey.
PAST TENSE: Hiding in the bushes, the cat watched its prey.
PAST TENSE: Hidden in the bushes, the cat watched its prey.
Watch and watches are the verbs; hiding and hidden are the verbals.
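For what it's worth, the tense-change test above is mechanical enough to sketch in a few lines of code. This is a toy illustration, not a real parser: it assumes the present- and past-tense versions of the sentence line up word for word, and `find_verbs` is my name for it, not a linguist's.

```python
def find_verbs(present_sentence, past_sentence):
    """Compare the present- and past-tense versions of a sentence
    word by word; the words that change form are the verbs.
    Verb-like words that stay the same (hiding, hidden) are the
    verbals of traditional grammar books."""
    present = present_sentence.lower().rstrip(".").split()
    past = past_sentence.lower().rstrip(".").split()
    # Pair up the two versions token for token and keep the pairs
    # whose spelling changed between tenses.
    return [(p, q) for p, q in zip(present, past) if p != q]

print(find_verbs("Hiding in the bushes, the cat watches its prey.",
                 "Hiding in the bushes, the cat watched its prey."))
# -> [('watches', 'watched')]
```

Run it on the "Hidden" sentences instead and you get the same answer -- *hidden* never changes form, so it never shows up as a verb.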

fyi: Linguists don't seem to use the term verbals, but I don't know what term they do use, if any.

and see:

THE COMING DEATH OF WHOM: PHOTO EVIDENCE


Meanwhile Mark Liberman says whom has been dead for a century.

From which I conclude that the word dead means something different to a linguist than it does to me. To me, a dead word is a word like ..... thee. Or thou. Or canst. Ere, oft, 'tis -- now those are dead words.

But whom is still with us. It is neither alive nor well, but it is definitely still kicking. Whom is a word no one has a clue how to use; we're all just closing our eyes and taking a flying leap when we pick it out of the Great Word Cloud inside our heads. Maybe linguists need a special category for words people no longer know how to use but are still using anyway.

(Maybe linguists already have such a category? Maybe the term for such words is dead?)

How about zombie words?

If you tell me whom has been a zombie word for a century, that I believe.

*I don't know how linguists find verbs in sentences. What I do know, what I have discovered over the past year, is that traditional grammar books are frequently wrong, or at least semi-wrong: traditional grammar books are wrong enough to be confusing and of little help when you're trying to tell an 18-year old how to find a verb in a sentence.

Andrew Gelman wants to know what to cut from h.s. math

Deadwood in the math curriculum

He doesn't seem to know about any of the changes to K-8 math in the past 20 years, and the blogger he quotes, Mark Palko, explicitly rejects the math wars as a "clever narrative" that does not explain high school math curricula.

It's tough fighting a math war nobody knows about.

In any event, some of you should probably weigh in on which topics to include in a sound high school math curriculum.

Thursday, November 3, 2011

Heat, but no light?

Due to TABOR, our elections were held this past Tuesday, instead of next week. The three candidates I supported for the School Board Election went down. (BTW-each of the winners received a significant amount of their campaign contributions from teachers' unions!) Three weeks ago, my neighbor had a community forum with some of the candidates. A little wine, cheese, and school topics chat with the neighbors.

I asked the candidates questions about the Common Core, school funding, and Educator Effectiveness plans in Colorado. I was in the minority; nobody seems to know what goes on in a school these days. The main question many wanted answered had to do with the district start date. Not all schools in my district are air-conditioned, and the kids get really hot in August. Really hot. Boy, those little sweetpeas get hot. For about two weeks, it can be over 80 degrees in some classrooms. So yeah, their kids come home sweaty.

Of course start dates are driven by many things, one of which is the mandated test dates in March and April. What school district chooses to start after Labor Day when the rest of the state starts in mid-August? Another start-date driver is the desire to end first semester before the Christmas break, which allows us to get out of school by Memorial Day, which in turn allows high school students to get summer jobs, take community college courses over the summer, etc.

Back to the whine and cheese forum...
Since it's hot, the classrooms run fans. The fans make noise and it's hard to hear the teacher, so one candidate described having seen a teacher using a microphone (you know, like Britney Spears) in a classroom with speakers in the ceiling. A school I've worked at in the past used such a system to accommodate hearing-impaired students. "Oooh," said the parent who had complained about the noisy fans, "I don't want to take away from money that might be spent on technology and smartboards in the classroom for that."

So I guess, to many parents' minds, a smartboard trumps the ability to hear a teacher.

When I start to worry about education, I am grateful for our charter school. It's not perfect and we do get complaints, but none of them are about the heat.



Tuesday, November 1, 2011

math professor's experience coaching the SAT

The Effectiveness of SAT Coaching on Math SAT Scores
(pdf file) by Jack Kaplan
Chance Magazine | Volume 18, Number 2, 2005

Comment: Jack Kaplan's "A New Study of SAT Coaching" (pdf file)
By Derek C. Briggs

Here's the Fairtest write-up of Kaplan's coaching results:
Kaplan's course includes 20 hours of classroom instruction and seven to 12 hours of individual tutoring. Of the 34 previously uncoached students he worked with over six summers, 21 increased their Math scores by more than 80 points from a baseline SAT administration with 11 going up by 100 points or more. Average score increases were higher for students with lower initial scores. In a control group Kaplan studied, average scores increased by only 15 points during the same period. Thus, he concludes that "the estimated coaching effect was 63 points" on the Math component of the SAT.
Kaplan was coaching the SAT math test being given prior to the 2006 changes.

the writing test and the math test

Chemprof (and others, I'm sure) pointed out in Comments that math/science professors value the SAT math test for the same reason I value the SAT writing test: both exams test standard mistakes that college students make.

Btw -- this is something I haven't gotten around to putting inside a post -- when I mention "the main errors student writers make," I'm referring to the Connor and Lunsford list of errors compiled in 1988, which is pretty close to the SAT list.

The Connor-Lunsford list is close to the SAT list except for the fact that Connor and Lunsford did not see ginormous numbers of parallel structure problems in the student papers they read, apparently. I find that hard to fathom. I personally do see ginormous numbers of non-parallel structures in the student writing that comes my way.

Faulty comparison, tested on the SAT, does not make the Connor-Lunsford list, either. (I'm not surprised by that.)

In any event, while musing about chemprof's observation (which I agree with, btw), an essential difference between the two tests, one that I hadn't focused on, suddenly leapt out at me: where SAT Math tests content and procedures students have been seeing in school for years,* SAT Writing tests content students have never seen or even heard tell of unless their Spanish teacher happened to explain what a gerund is in Spanish class.

(I use that example because I asked C. this week whether he knew what a gerund was, and he said he did because he'd learned it in Spanish. I myself had no idea what a gerund was until this semester. Public schools don't teach formal grammar today and haven't taught formal grammar in decades.)

So....when you think about it....isn't the Writing Test a bit of an odd concept?

Students have never been taught grammar, and now they're being tested on grammar?

And why would I be in favor of testing students on content the schools don't teach?

Now I'm thinking: well, maybe I'm not!

Mulling over chemprof's comment, I realize that what I value about the writing test is almost exclusively the test prep kids do for the writing test. The fact that the writing test exists and students have to take it gives parents an excuse to insist their kids learn some formal grammar before they graduate high school.

And that's pretty much it; that's what I value about the test.

So, since high scores on writing come entirely from test prep (at least in my experience), what does a high score on the writing test actually mean? Does a high score on the writing section tell us anything about the student's writing?

I don't know the answer to that, and I don't have a good guess.

Basically, I think it's a good thing for a student to recognize a comma splice in an SAT sentence regardless of whether he recognizes a comma splice in his own writing, and effective SAT prep can make that happen. This is a statement of value: I value knowledge of comma splices, and I want my kid to possess it.

* Most of the content, anyway. That's a subject for another post: the SAT now features counting problems, and students taking traditional algebra classes don't seem to have counting "units" in their courses (although chapters on counting units are included in traditional texts). Ditto for the algebra 2 material on the SAT if a student has not taken algebra 2.

Monday, October 31, 2011

SAT math is tricky at all score levels

re: the a in f(x) = a - x² (question below), which appears in the College Board's Online SAT Test 5 and is designated "medium difficulty"

I should have made clear in the post I wrote about the post Debbie wrote* that this question is not "tricky" for me. This question is easy for me, and the fact that this question is easy for me tells you nothing about whether I have mastery of quadratic equations and their graphical representations. At this point, I do not.

This question is easy for me because I have some basic understanding of shifts, because I have memorized the rules about shifts listed in all of the SAT test prep books (and most notably in Phillip Keller's book), and because I see the intended trick of the question the minute I look at it.

The fact that this question is easy for me probably does tell you I am pushing 700 on SAT math, which I am. I scored 680 on the real test, putting me at the 90th percentile for all test-takers; on sample sections my range is well into the 700s. I don't know the exact percentage of test-takers in the 90th percentile who get this question right and find it easy, but it's going to be very high. The top 10% of test takers is the group for whom this question is not tricky.

This question is tricky for test-takers scoring in the 500s, and the College Board knows it. They tell us they know it; they're not keeping it a secret. When the College Board assigns a "Medium" level of difficulty to an item, they are telling us that x number of kids scoring in the 500s will reliably get the question wrong, and x number of kids scoring in the 500s will reliably get the question right. That is the meaning of the words "medium level difficulty."

Kids scoring in the 500s are the ones you can depend upon to see a in f(x) = a - x² and think, in a certain percentage of cases, a-the-coefficient-of-x²-in-ax² + bx + c = 0. Those are the kids getting tripped up by this question.
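A minimal sketch of the trick, in Python, with a hypothetical value for a (the real question supplies a graph, not a number): rewritten in standard form, f(x) = a - x² is -x² + 0x + a, so the question's a is the constant term, not the coefficient of x².

```python
# Rewriting f(x) = a - x**2 in standard form gives -x**2 + 0x + a:
# the question's "a" is the constant term (the y-intercept and the
# maximum of the parabola), NOT the coefficient of x**2, which is -1.
def f(x, a):
    return a - x**2

a = 2  # hypothetical value; the actual SAT question supplies a graph

print(f(0, a))            # 2: the y-intercept is a itself
print(f(1, a) - f(0, a))  # -1: the x**2 coefficient is -1, not a
```

Which is exactly the confusion the item is built to exploit: the kid who reflexively matches a to ax² + bx + c picks the coefficient answer and gets it wrong.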

Test-takers scoring in the mid-500s do not have mastery of quadratic equations and their graphical representations, and neither they nor anyone else is claiming they do. So when a designated percentage of kids scoring in the 500s get this question wrong, we learn nothing about them we did not already know. By the same token, when a designated percentage of kids scoring in the 500s get this question right, we also learn nothing about them. As a general rule, 500-scoring kids do not have mastery of quadratic equations and their graphical representations. I think that is a safe assumption to make.

Ditto for me. When I easily get this question right -- and, again, I stress the word easily -- no one knows anything about my level of mastery of quadratic equations and their graphical representations. I do not have mastery at present, yet for me this question is so easy it's a gimme. In Steve H's words,  I have (near)-mastery of the test, not the math.

So why is this question -- or, rather, questions just like it**-- on the test?

Questions like this are on the test because they have appeared in the experimental sections of previously-administered SAT tests and have been found to reliably produce a certain level of error. The College Board needs test items that produce reliable levels of test-taker error in order to keep raw scores stable from test to test. They can't have everyone getting all the questions right; they can't have everyone getting all the questions wrong; and, in the big, mooshy middle, they need items that reliably produce a certain percentage of right and wrong answers.

Hence the a-question.
* this is the mouse that lived in the house...
** I don't know whether items in Online Course have been tested in Experimental sections. As I understand it, all items in the tests actually administered to real test-takers have indeed been tested in experimental sections taken by other test-takers.

Sunday, October 30, 2011

the essay

C. was talking to his friend J. about their writing scores on the SAT.

After he hung up, he said, "I can't believe J. wrote the stupidest essay I've ever heard of and got a 9."

J.'s essay was about his family having to steal food in order to eat. Never happened.

"How come he wrote that?" I said.

"He spent too much time on the introduction and couldn't think of anything else to say."

Last year my neighbor's son got an 8 on an essay about why the essay question was stupid.

milestone

C. is applying to college today.

Assuming our electricity doesn't go out, that is. A small electrical fire from a fallen wire has been burning in the road next to our side yard since last night.

He wrote his what-I-did-with-my-summer-vacation answer on doing SAT math with his mom every day.

Susan S on her son's experience with competition math

re: the question of whether middle school competition math prepares students for ACT math:
I only have an anecdotal story concerning math contests and the ACT. When my son was taking the ACT in middle school at the beginning of the year, I hired a genius kid to tutor him on some of the things he had never experienced, like algebra 2 and some trig stuff. The tutor thought he would break 30, but he actually was in the middle 20s. Still not bad for a 13 yr. old.

As the year went on he made the Mathcounts team and had to practice the problems every week for a few months. He usually only finished half of them since we couldn't really help him. I signed him up for the actual Midwest Talent Search at the end of the year, but I didn't prep him this time or hire a tutor. His math score jumped to a 29 which placed him in the 99th percentile of the MTS, or Midwest Talent Search, kids and earned him a high scorer medal from Northwestern.

I have no idea if that means anything, but I thought it was interesting. He was only in accelerated algebra 1 at the time. Competition math just may help out in some way. It was the only thing I could think of at the time to explain the jump in scores.
If I had it to do over again, I would have had C. doing middle school competition math problems from the get-go -- and not just because SAT math is so closely related to middle school competition math.

C. and I both learned a tremendous amount prepping for SAT math. We learned a tremendous amount about algebra 1, geometry, and arithmetic, and we learned a tremendous amount about what we didn't know we didn't know.

puzzle math - from the comments

in an earlier Comments thread about SAT Math, gasstationwithoutpumps writes:
My son did no prep for the SAT when he took it in 6th grade and got a 720 on the math part. I'm quite sure that he'll do better on it when he takes it as high schooler next year without doing any SAT prep. The math questions just aren't all that hard.
to which SteveH responds:  
He had all of the material on the SAT by 6th grade? You are using this as normal?
gswp's reply:
No, he had not had all the SAT math. He'd had a very feeble algebra series in school (The Key to Algebra) and had self-taught geometry from a good book (he'd gotten through about half of Art of Problem Solving Geometry). The SAT supposedly tests much more than he had had at that point, but the questions are more like simple puzzles than testing content knowledge.
Gswp is exactly right! Setting aside the issue of the SAT using core features of cognitive function to produce error, which it does, SAT math is middle school puzzle math.

Saturday, October 29, 2011

an SAT math tutor on seeing the 'trick'

A friend forwarded me an email from a parent whose high school age child rapidly sussed out SAT math and scored somewhere in the neighborhood of 800 with minimal test prep. The dad said his child has what he sees as a kind of "insight." After reading through some SAT math tests, the child could instantly see the 'trick' involved in the questions (trick used in the nonpejorative sense of the word). The dad said, too, that he believes this form of insight isn't necessarily present in students earning physics degrees from Columbia.

An SAT math tutor I know agreed and, when I asked why, wrote this response and gave me permission to post:
I've interviewed LOTS of math/physics/computer science majors from NYU and Columbia for potential teaching gigs who couldn't make sense of relatively simple SAT problems. Almost universally, the issue is that they WAY overcomplicate things.
Steve H mentioned Richard Feynman's hobby practicing "divergent thinking" problems. I think that's probably what this dad (and this tutor) are talking about.

Blue Book

October 2, 2011
the Blue Book

Friday, October 28, 2011

update update update

Quick note: C., who has left his calculus course for statistics (thank you all for the advice), just told us the calculus teacher is giving his students an essay test. Just to review: this is a calculus class populated by kids who learned next to no precalculus last year. They're in calculus, they don't know any precalculus, and...they're taking an essay test.

Ed said, "You should have stayed in the class. You could have raised your grade."

That was a joke. We both know it's not true. If C. had stayed in the class, he would have done worse on the essay test than he did on the math test, I'll wager.

Must get Allison's Comment about charismatic teachers pulled up front.

C. loves statistics. Loves! First math class he's ever felt that way about.

What if you're already a lean, mean educating machine?

I sit on the school board of a charter school that had its per-pupil revenue cut again last year. We're under $6k per pupil, and we operate well on less money than the district schools have. Our school gives merit increases instead of the traditional step-salary schedule, and this is now the second year that we have been unable to give any raises, excepting what is mandated by the state for the teacher retirement program.

Until the state raises PPR, however, the chances of teachers getting raises are minimal.  We ask a lot of our teachers and they deliver. How much longer will they be willing to do so? The surrounding district is still paying scheduled salary increases.

However...we do have a little bit of cash, and so we voted to allot some of it to provide a bonus for the teachers who have helped the school through the last few years when we were growing. Mind you, the amount of the bonus ranged between $700 and $1,500. Hardly seems worth their hard work, yet as a Board, we want to acknowledge their dedication.

Last week, this arrived in my inbox:
Dear BOD Members,

I want to personally thank you for the bonus I will receive on October’s paycheck, as well as the hard work you put in to make XXX Schools what they are.  I recognize that money is hard to come by from the culmination of our economic situation and school growth.  Please know that I feel XXX is a fantastic place to be, and I look forward to helping us grow to be excellent in everything that we do.

Thanks again for thinking of me.
OK, so it's only one teacher, but I bet that's more personal thanks than the district board gets.

Net Price Calculator - get estimated college costs for YOUR child

After running the newly mandated Net Price Calculator for a dozen colleges using a few different scenarios, I've concluded that this can be a useful step in the initial phase of the college search process.

You can see the numbers I generated over at Cost of College.

Wednesday, October 26, 2011

Teachers' unions, explained

Making the rounds of some charter schools.
Enjoy.


New York SAT cheating scandal is expected to lead to more arrests

A former FBI chief is coming in to help clean up the SAT cheating mess.
Gaston Caperton, president of the College Board and a former governor of West Virginia, said that in addition to bringing in the former F.B.I. chief, Louis J. Freeh, as a consultant, the College Board was also considering additional safeguards over the next year, including bolstering identification requirements for students taking the SAT and taking digital photographs to ensure they are who they say they are.
Some educators think this action is long overdue and are calling for harsher penalties.
“The procedures E.T.S. uses to give the test are grossly inadequate in terms of security,” Bernard Kaplan, principal of Great Neck North, testified at the hearing. “Furthermore, E.T.S.’s response when the inevitable cheating occurs is grossly inadequate. Very simply, E.T.S. has made it very easy to cheat, very difficult to get caught.”
While the new security measures represent a change of tone for College Board and Educational Testing Service officials who previously insisted their system was adequate, some superintendents and principals said they did not go far enough. These officials have called for fingerprinting students, increasing stipends for proctors and imposing real consequences on those who cheat. Currently, if the testing service suspects cheating, the students’ scores are canceled and they are permitted to retake the test — with no notification to either their high school or colleges where they apply.
Educational Testing Service is already spending about 10% of its budget on security for College Board testing, but whatever they're doing may not be adequate. Bernard Kaplan, principal of the high school where the cheating occurred, says this about the problem:
“It is ridiculously easy to take the test for someone else” 
Many young people have fake IDs, which are commonly "being bought in bulk from vendors in China" and are "nearly undetectable by bar employees." I imagine SAT test proctors also find them hard to spot.

Related:  Student cheating – the SAT, the Internet, and Ted Kennedy

(Cross-posted at Cost of College)

Monday, October 24, 2011

Is the Pope Jewish, II: classroom technology and the children of technocrats

In a post I wrote just two weeks ago, I discussed two New York Times articles about the failure of technology in the classroom to raise test scores in Arizona, and the failure of one of the most acclaimed educational technologies, Cognitive Tutor, to raise test scores in general. I concluded by asking:
Will these recent exposés about the limitations of educational technology for subjects other than computer science have any effect whatsoever on the edtech bandwagon?...We might as well ask whether the Pope is Jewish.
Sure enough, just one week later (last Wednesday), yet another NY Times article on online education appears, this one discussing how the Munster Indiana school district has jumped on the bandwagon:
Laura Norman used to ask her seventh-grade scientists to take out their textbooks and flip to Page Such-and-Such. Now, she tells them to take out their laptops.

The day all have seen coming — traditional textbooks being replaced by interactive computer programs — arrived this year in this traditional, well-regarded school district.
Munster's technological revolution was particularly sudden:
Unlike the tentative, incremental steps of digital initiatives at many schools nationwide, Munster made an all-in leap in a few frenetic months — removing all math and science textbooks for its 2,600 students in grades 5 to 12, and providing a window into the hurdles and hiccups of such an overhaul.
But Munster isn't the first to go digital:
Schools in Mooresville, N.C., for example, started moving away from printed textbooks four years ago, and now 90 percent of their curriculum is online.
and:
Munster’s is part of a new wave of digital overhauls in the two dozen states that have historically required schools to choose textbooks from government-approved lists. Florida, Louisiana, Utah and West Virginia approved multimedia textbooks for the first time for the 2011-12 school year, and Indiana went so far as to scrap its textbook-approval process altogether, partly because, officials said, the definition of a textbook will only continue to fracture.
The cost? Munster has paid $1.1 million for infrastructure, while parents pay an annual $150 rental fee for laptops. Schools in general are "spending an estimated $2.2 billion on educational software."

The benefits? No efficacy data is cited, of course. Students, to some extent, get to work at their own rates. And then there's this:
Angela Bartolomeo’s sixth graders spent a recent Wednesday rearranging terms of equations on an interactive Smart Board and dragging-and-dropping answers in ways that chalkboards never could. (In between, a cartoon character exclaimed that “Multiplying by 1 does not change the value of a number!” in his best superhero baritone.)
And this:
Ms. Norman, the seventh-grade science teacher, is using material from Discovery Education, which on that Wednesday included videos from Discovery’s “Mythbuster” series (commercial-free), an interactive glossary and other eye candy to help students investigate whether cellphones cause cancer. When Ms. Norman told the students to take out their ear buds to watch a video, two in the back yelped, “Cool!”
And this:
“With a textbook, you can only read what’s on the pages — here you can click on things and watch videos,” said Patrick Wu, a seventh grader. “It’s more fun to use a keyboard than a pencil. And my grades are better because I’m focusing more.”
Whether students are focusing on the right things is another matter. And wouldn't it be nice if there were explanations, rather than exclamations, regarding what happens when you multiply a number by 1? The basic problem with computerized instruction, as I noted earlier, is that it almost never provides perspicuous feedback. Answers are either right, or wrong, and that's it.

Perhaps no one knows the limitations of computer software better than computer software experts. Where are these people sending their kids to school?  An article in this weekend's New York Times provides a glimpse. Focusing on Silicon valley, it describes how the chief technology officer of eBay, along with "employees of Silicon Valley giants like Google, Apple, Yahoo and Hewlett-Packard," are sending their children to the area's Waldorf school:
The school’s chief teaching tools are anything but high-tech: pens and paper, knitting needles and, occasionally, mud. Not a computer to be found. No screens at all. They are not allowed in the classroom, and the school even frowns on their use at home.
Noting that "three-quarters of the students here have parents with a strong high-tech connection," the Times observes:
Schools nationwide have rushed to supply their classrooms with computers, and many policy makers say it is foolish to do otherwise. But the contrarian point of view can be found at the epicenter of the tech economy, where some parents and educators have a message: computers and schools don’t mix.
The article quotes Waldorf parent Alan Eagle, who "holds a computer science degree from Dartmouth and works in executive communications at Google, where he has written speeches for the chairman," and who "uses an iPad and a smartphone:"
“I fundamentally reject the notion you need technology aids in grammar school... The idea that an app on an iPad can better teach my kids to read or do arithmetic, that’s ridiculous.”
Of course, there is one particular way in which computers could be highly effective teaching tools: for instruction in computer programming. But has there been a rise, or a decline, in computer programming instruction in the decades since schools began jumping on the edtech bandwagon? We might as well ask whether the Pope is Jewish.

(Cross-posted at Out In Left Field)

Sunday, October 23, 2011

Definitions, Precision, Coherence

What's missing from today's school math, and particularly, from middle school math?

Definitions, precision, and coherence.

Without a proper introduction to definitions, students don't get clarity in what a math statement is and what it is not. They don't get a sense of the abstractness of it. Without having actual definitions, they don't learn to work with definitions, so they can never learn how to use them to derive new things that are true of "all" of a given set. Without definitions, they cannot learn to REASON about mathematics because they have no basis for reasoning.

Without precision, students cannot make clear, unambiguous statements. They are not able to properly manipulate the symbols they are given, and they can't even say what their manipulations refer to and what they do not refer to. (As Wu says, in the same way that we do not ask "Is he six feet tall?" without saying who "he" is, we do not write down xyzrstuvw = a + 2b + 3cdefghijklmn without first specifying what a, b, . . . , z stand for either. This is an equality between what? Two random collections of symbols?? What does it mean??)

Without coherence, math is a series of unrelated facts designed merely to trick students. Without coherence, math is just a test of how big your working memory can be when you can never integrate any understanding together. There is no notion that the results you get follow from other results. Reasoning can't exist without coherence.

The reason SAT math feels so "tricky" to so many students is that they were never taught math in a coherent, reasoned way with definitions and precision. So the test seems to be just a set of tricks designed to "catch" you, as opposed to a test of how what you know relates to other things you already know. That is why it seems to just test working memory. That is why it seems to have so much associative interference. The reason is that you were never taught that math was reasoned, with every piece of it following from the other pieces, a seamless whole that reinforced the same truths from a zillion different directions. And you were never taught what it meant.

summer boarding school and SAT prep

I talked to some friends I hadn't seen in awhile last night. They told me that over the summer they sent one of their kids to a summer boarding program where he prepared for the SAT.

His mom said he gained 60 points on reading, 90 points on writing, and 180 points on math.

I'm pretty sure this is the program: Wolfeboro The Summer Boarding School.

Steve H on SAT math and math prep

from the comments:
SAT math tries to trick students. You could say that the tricks relate directly to whether or not they really understand math. However, when you add in the time constraints, it really relates to preparation. Is preparation the same as mastery? Yes. Mastery of the test. Is this equivalent to mastery of math or whether you will do well in college math? Not necessarily. There are better ways of determining that than with the limited material included on the SAT. Why not just require students to take the Achievement Test? Look at the AP Calculus grade.

What is it about SAT-Math that is so important? They are trying to test something other than just math knowledge. They think that these tricky questions reflect on how well you think on your feet, but what it really does is test preparation and whether you have seen these questions before. The questions don't reflect on whether you have a wide body of knowledge and skills in math.

They create problems where you have to "see" the shortcut. You get problems with hidden 3-4-5 triangles. Add a time constraint and then what do you call those problems? It's not just about math knowledge and skills. The problem has to do with trying to determine the difference between aptitude and preparation. The tricks may have some basis in meaningful math, but that's not what they are trying to test.

It reminds me of questions companies like to ask at job interviews, like "Why is a manhole cover round", and "How many golf courses are there in the US.?" Preparation can make you look like you have a great aptitude. Preparation is directly related to math knowledge, and that is important, but identifying aptitude is an arms race for something like the SAT. That's causing the tricky problems, not any desire to test a breadth and depth of math knowledge.

In Dick Feynman's books, he talks about how he spent a lot of time in high school learning about all sorts of trick, lateral thinking problems. He would challenge people to ask him questions. There is nothing like preparation to make you look like a genius, although he really didn't need help with that. It really annoyed some of his colleagues.

My son will get to calculus in his junior year and he always gets A's. He still has to prepare for SAT-Math. He can't let others, with specific SAT-Math preparation, seem like they have a better aptitude than him.
and:
They try to trick students in most questions....What bothers me the most are the shortcut problems where using standard math techniques cause you to take too much time. This is supposed to identify aptitude, but it really tests preparation for the test.

There are also the problems where using a brute force or direct counting technique works better than any applied math technique. In some cases, there is no math to apply. One question on a sample PSAT test asked for the number of positive integers less than 1000 which don't have a '7' as one of the digits. (notice - "don't have" and positive integer) This simply checks how well you work under time pressure. Nobody expects you to apply any fancy math to this problem. One of the answers was the "have" solution. This tests preparation and practice, not aptitude or math ability. There may be a correlation between the test and aptitude or math ability, but not to the resolution colleges use it to select students. At the top levels, it correlates to preparation. That's not necessarily a bad thing, but there are better ways of figuring that out.
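Steve H's PSAT example is easy to verify, for what it's worth. A quick sketch comparing brute force against the "complement" shortcut (the 728 figure is my arithmetic, not from the original question):

```python
# The PSAT-style question: how many positive integers less than 1000
# have no '7' among their digits?

# Brute force: just count them directly.
brute = sum(1 for n in range(1, 1000) if '7' not in str(n))

# The intended shortcut: pad every number to a 3-digit string 000-999.
# Each position has 9 allowed digits ({0,1,2,3,4,5,6,8,9}), giving
# 9**3 such strings; subtract 1 for "000", which is not positive.
shortcut = 9**3 - 1

print(brute, shortcut)  # 728 728
```

Which rather proves his point: under time pressure, the direct count is hopeless and the shortcut is everything, and one of the trap answers ("the 'have' solution") is waiting for anyone who forgets to take the complement.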

rat psych - "careless errors" in reading the SAT

During my year of living dangerously, doing SAT math prep off and on with C., I was chronically stunned by the number and type of "careless errors" he and I both made taking timed sections of the test. In particular, I made repeated errors of "simple" reading, particularly when I was tired or the room was hot. I made so many reading errors that when I finally took the real test, I had no way to predict my math score at all: no way to estimate how many reading errors I had -- or had not -- made.

I eventually came up with a theory of careless errors, the details of which I've forgotten at the moment. I do recall that it had to do with working memory. Arguably the SAT tests working memory above all: all 10 sections put you into working memory blowout. I experienced working memory blowout so often that I began to notice a connection. As far as I can tell, you make more careless errors when your working memory is overtaxed (and you hit the limits of working memory much more quickly when you're sleep-deprived or overheated).

I've just come across a new study that I think confirms my subjective experience:
This study resolves two long-standing debates in the field. Does our working memory function like slots, and after our four slots [emphasis added] are filled with objects we cannot take in any more; or does it function like a pool that can accept more than four objects, but as the pool fills the information about each object gets thinner? And is the capacity limit a failure of perception, or of memory? [emphasis added]

“Our study shows that both the slot and pool models are true,” says Miller. “The two hemispheres of the visual brain work like slots, but within each slot, it’s a pool. We also found that the bottleneck is not in the remembering, it is in the perceiving.” [emphasis added] That is, when the capacity for each slot is exceeded, the information does not get encoded very well. The neural recordings showed information about the objects being lost even as the monkeys were viewing them, not later as they were remembering what they had seen.
Picower: 1 Skull + 2 Brains = 4 Objects in Mind
Failures of working memory are failures of perception!

Subjectively, that's what I experienced taking practice sections; that's what it felt like. Once I hit a certain level of tiredness, or heat, or working memory blow-out, I stopped being able to read.

The same thing happens on the reading and writing sections, too. The reading and writing sections are so taxing that you reach points where you simply cannot take in what the sentence or paragraph before you says.* I'm not talking about losing the ability to answer questions about the sentence or paragraph.

I'm talking about losing the ability just to read the words on the page.

I'm a 10
rat psych: what to do about SAT math (part 1)
rat psych: what to do about SAT math (part 2)
rat psych: what to do about SAT math (part 3)
rat psych: careless reading errors on the SAT

* I say "you" because I know I am not alone in this.

rat psych - what to do about SAT math (part 3)

Your typical high school student, I presume, has spent several years setting up equations and solving for x. At least, let's hope so. I certainly did.

The SAT uses this fact to elicit many wrong answers from test-takers who have worked a problem correctly. The student gets the solution right but the answer wrong because the answer isn't x. The answer is 3x, say, or xy. I seem to recall a problem or two where the answer was -x, for god's sake, but I might be making that up.

Other times the test will give you a value for x + y, say, and you're supposed to see that you should simply insert that value some place else in the problem, et voilà: the answer they're looking for pops up.

Here's a typical problem, medium difficulty (according to the College Board):
If 4(x + y)(x - y) = 40 and (x - y) = 20, what is the value of x + y?
A kid who's had no test prep at all will likely miss this question -- either miss it outright or take too much time spotting the solution, leaving him too little time to finish the test and increasing the likelihood he'll make "careless errors" on the questions he does get to, because now he's working too fast trying to make up for the time he lost on the x + y problem.
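The intended shortcut takes one substitution: treat (x + y) as a single unknown and plug in the given value of (x - y), rather than solving for x and y separately.

```python
# 4(x + y)(x - y) = 40 with (x - y) = 20: substitute directly.
x_minus_y = 20
x_plus_y = 40 / (4 * x_minus_y)  # 4 * (x + y) * 20 = 40

print(x_plus_y)  # 0.5
```

Two lines, if you see it. The kid who doesn't see it expands the product, tries to find x and y individually, and burns minutes on an answer that was one division away.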

For what it's worth, I think using x + y as the value, instead of x or y alone, is an interesting and instructive way to write a problem. (I'm curious what math people think). It seems to me that writing problems in which x + y is the salient unit may be a way of teaching what Ron Aharoni calls the fifth fundamental operation of arithmetic:
In addition to the four classical operations, there is a fifth one that is even more fundamental and important. That is, forming a unit, taking a part of the world and declaring it to be the “whole.” This operation is at the base of much of the mathematics of elementary school. First of all, in counting, when you have another such unit you say you have “two,” and so on. The operation of multiplication is based on taking a set, declaring that this is the unit, and repeating it. The concept of a fraction starts from having a whole, from which parts are taken. The decimal system is based on gathering tens of objects into one unit called a “10,” then recursively repeating it.

The forming of a unit, and the assigning of a name to it, is something that has to be learned and stressed explicitly. I met children who, in fifth grade, knew how to find a quarter of a class of 20, but had difficulty understanding how to find “three-quarters” of the class, having missed the stage of the corresponding process of repeating a unit in multiplication.
(from "What I Learned in Elementary School" by Ron Aharoni)
Maybe I'm wrong, but it seems to me that the x + y questions test math, as opposed to obedience under pressure, which is what the find-xy questions test.

Still, there is no doubt in my mind that these questions elicit wrong answers from test takers who know the math involved, can do the math involved, and have a reasonable understanding of the math involved. Students who have spent years of their lives solving for x aren't going to break the Solve for x habit for the first time ever when they're working at breakneck speed and their eyes are bleeding from the Ella Baker passage.

Which brings me back to extinction learning. Test prep for SAT math involves spending a fair amount of time building new habits that conflict with ingrained old habits. You've been conditioned to solve for x; now you have to condition yourself not to solve for x. Also, you have to build as much speed as possible at not solving for x because you are never going to forget solve-for-x. The two impulses are inside your head, competing with each other, and the competition takes time (and probably eats up some precious working memory resources to boot).

Funny thing: during the time we spent doing SAT math prep around here, I overlearned don't solve for x to the degree that a couple of weeks before taking the real test I came across a practice problem that did ask the test-taker to solve for x. I was so surprised that I wasted several seconds reading and re-reading and re-reading again to make sure I hadn't misunderstood. You can't win.

For parents: your child needs to spend enough time not solving for x that he or she gets to be really, really fast at not solving for x.

Then he should be on the lookout for problems that say Solve for -x.


I'm a 10
rat psych: what to do about SAT math (part 1)
rat psych: what to do about SAT math (part 2)
rat psych: what to do about SAT math (part 3)
rat psych: careless reading errors on the SAT