kitchen table math, the sequel: 10/23/11 - 10/30/11

Saturday, October 29, 2011

an SAT math tutor on seeing the 'trick'

A friend forwarded me an email from a parent whose high school age child rapidly sussed out SAT math and scored somewhere in the neighborhood of 800 with minimal test prep. The dad said his child has what he sees as a kind of "insight." After reading through some SAT math tests, the child could instantly see the 'trick' involved in the questions (trick used in the nonpejorative sense of the word). The dad said, too, that he believes this form of insight isn't necessarily present in students earning physics degrees from Columbia.

An SAT math tutor I know agreed and, when I asked why, wrote this response and gave me permission to post:
I've interviewed LOTS of math/physics/computer science majors from NYU and Columbia for potential teaching gigs who couldn't make sense of relatively simple SAT problems. Almost universally, the issue is that they WAY overcomplicate things.
Steve H mentioned Richard Feynman's hobby of practicing "divergent thinking" problems. I think that's probably what this dad (and this tutor) are talking about.

Blue Book

October 2, 2011
the Blue Book

Friday, October 28, 2011

update update update

Quick note: C., who has left his calculus course for statistics (thank you all for the advice), just told us the calculus teacher is giving his students an essay test. Just to review: this is a calculus class populated by kids who learned next to no precalculus last year. They're in calculus, they don't know any precalculus, and...they're taking an essay test.

Ed said, "You should have stayed in the class. You could have raised your grade."

That was a joke. We both know it's not true. If C. had stayed in the class, he would have done worse on the essay test than he did on the math test, I'll wager.

Must get Allison's Comment about charismatic teachers pulled up front.

C. loves statistics. Loves! First math class he's ever felt that way about.

What if you're already a lean, mean educating machine?

I sit on a school board of a charter school that had its per pupil revenue cut again last year. We're under $6k per pupil, and we operate well on less money than the district schools have. Our school gives merit increases instead of the traditional step-salary schedule and this is now the second year that we have been unable to give any raises, excepting what is mandated by the state for the teacher retirement program.

Until the state raises PPR, however, the chances of teachers getting raises are minimal.  We ask a lot of our teachers and they deliver. How much longer will they be willing to do so? The surrounding district is still paying scheduled salary increases.

However...we do have a little bit of cash, and so we voted to allot some of it to provide a bonus for the teachers who have helped the school through the last few years when we were growing. Mind you, the amount of the bonus ranged between $700 and $1,500. Hardly seems worth their hard work, yet as a Board, we want to acknowledge their dedication.

Last week, this arrived in my inbox:
Dear BOD Members,

I want to personally thank you for the bonus I will receive on October’s paycheck, as well as the hard work you put in to make XXX Schools what they are.  I recognize that money is hard to come by from the culmination of our economic situation and school growth.  Please know that I feel XXX is a fantastic place to be, and I look forward to helping us grow to be excellent in everything that we do.

Thanks again for thinking of me.
OK, so it's only one teacher, but I bet that's more personal thanks than the district board gets.

Net Price Calculator - get estimated college costs for YOUR child

After running the newly mandated Net Price Calculator for a dozen colleges using a few different scenarios, I've concluded that this can be a useful step in the initial phase of the college search process.

You can see the numbers I generated over at Cost of College.

Wednesday, October 26, 2011

Teachers' unions, explained

Making the rounds of some charter schools.

New York SAT cheating scandal is expected to lead to more arrests

A former FBI chief is coming in to help clean up the SAT cheating mess.
Gaston Caperton, president of the College Board and a former governor of West Virginia, said that in addition to bringing in the former F.B.I. chief, Louis J. Freeh, as a consultant, the College Board was also considering additional safeguards over the next year, including bolstering identification requirements for students taking the SAT and taking digital photographs to ensure they are who they say they are.
Some educators think this action is long overdue and are calling for harsher penalties.
“The procedures E.T.S. uses to give the test are grossly inadequate in terms of security,” Bernard Kaplan, principal of Great Neck North, testified at the hearing. “Furthermore, E.T.S.’s response when the inevitable cheating occurs is grossly inadequate. Very simply, E.T.S. has made it very easy to cheat, very difficult to get caught.”
While the new security measures represent a change of tone for College Board and Educational Testing Service officials who previously insisted their system was adequate, some superintendents and principals said they did not go far enough. These officials have called for fingerprinting students, increasing stipends for proctors and imposing real consequences on those who cheat. Currently, if the testing service suspects cheating, the students’ scores are canceled and they are permitted to retake the test — with no notification to either their high school or colleges where they apply.
Educational Testing Service is already spending about 10% of its budget on security for College Board testing, but whatever they're doing may not be adequate.  Bernard Kaplan, principal of the high school where the cheating occurred, says this about the problem.
“It is ridiculously easy to take the test for someone else” 
Many young people have fake IDs, which are commonly  "being bought in bulk from vendors in China" and "nearly undetectable by bar employees".  I imagine SAT test proctors also find it hard to spot them.

Related:  Student cheating – the SAT, the Internet, and Ted Kennedy

(Cross-posted at Cost of College)

Monday, October 24, 2011

Is the Pope Jewish, II: classroom technology and the children of technocrats

In a post I wrote just two weeks ago, I discussed two New York Times articles about the failure of technology in the classroom to raise test scores in Arizona, and the failure of one of the most acclaimed educational technologies, Cognitive Tutor, to raise test scores in general. I concluded by asking:
Will these recent exposés about the limitations of educational technology for subjects other than computer science have any effect whatsoever on the edtech bandwagon?...We might as well ask whether the Pope is Jewish.
Sure enough, just one week later (last Wednesday), yet another NY Times article on online education appears, this one discussing how the Munster Indiana school district has jumped on the bandwagon:
Laura Norman used to ask her seventh-grade scientists to take out their textbooks and flip to Page Such-and-Such. Now, she tells them to take out their laptops.

The day all have seen coming — traditional textbooks being replaced by interactive computer programs — arrived this year in this traditional, well-regarded school district.
Munster's technological revolution was particularly sudden:
Unlike the tentative, incremental steps of digital initiatives at many schools nationwide, Munster made an all-in leap in a few frenetic months — removing all math and science textbooks for its 2,600 students in grades 5 to 12, and providing a window into the hurdles and hiccups of such an overhaul.
But Munster isn't the first to go digital:
Schools in Mooresville, N.C., for example, started moving away from printed textbooks four years ago, and now 90 percent of their curriculum is online.
Munster’s is part of a new wave of digital overhauls in the two dozen states that have historically required schools to choose textbooks from government-approved lists. Florida, Louisiana, Utah and West Virginia approved multimedia textbooks for the first time for the 2011-12 school year, and Indiana went so far as to scrap its textbook-approval process altogether, partly because, officials said, the definition of a textbook will only continue to fracture.
The cost? Munster has paid $1.1 million for infrastructure, while parents pay an annual $150 rental fee for laptops. Schools in general are "spending an estimated $2.2 billion on educational software."

The benefits? No efficacy data is cited, of course. Students, to some extent, get to work at their own rates. And then there's this:
Angela Bartolomeo’s sixth graders spent a recent Wednesday rearranging terms of equations on an interactive Smart Board and dragging-and-dropping answers in ways that chalkboards never could. (In between, a cartoon character exclaimed that “Multiplying by 1 does not change the value of a number!” in his best superhero baritone.)
And this:
Ms. Norman, the seventh-grade science teacher, is using material from Discovery Education, which on that Wednesday included videos from Discovery’s “Mythbuster” series (commercial-free), an interactive glossary and other eye candy to help students investigate whether cellphones cause cancer. When Ms. Norman told the students to take out their ear buds to watch a video, two in the back yelped, “Cool!”
And this:
“With a textbook, you can only read what’s on the pages — here you can click on things and watch videos,” said Patrick Wu, a seventh grader. “It’s more fun to use a keyboard than a pencil. And my grades are better because I’m focusing more.”
Whether students are focusing on the right things is another matter. And wouldn't it be nice if there were explanations, rather than exclamations, regarding what happens when you multiply a number by 1? The basic problem with computerized instruction, as I noted earlier, is that it almost never provides perspicuous feedback. Answers are either right, or wrong, and that's it.

Perhaps no one knows the limitations of computer software better than computer software experts. Where are these people sending their kids to school?  An article in this weekend's New York Times provides a glimpse. Focusing on Silicon valley, it describes how the chief technology officer of eBay, along with "employees of Silicon Valley giants like Google, Apple, Yahoo and Hewlett-Packard," are sending their children to the area's Waldorf school:
The school’s chief teaching tools are anything but high-tech: pens and paper, knitting needles and, occasionally, mud. Not a computer to be found. No screens at all. They are not allowed in the classroom, and the school even frowns on their use at home.
Noting that "three-quarters of the students here have parents with a strong high-tech connection," the Times observes:
Schools nationwide have rushed to supply their classrooms with computers, and many policy makers say it is foolish to do otherwise. But the contrarian point of view can be found at the epicenter of the tech economy, where some parents and educators have a message: computers and schools don’t mix.
The article quotes Waldorf parent Alan Eagle, who "holds a computer science degree from Dartmouth and works in executive communications at Google, where he has written speeches for the chairman," and who "uses an iPad and a smartphone:"
“I fundamentally reject the notion you need technology aids in grammar school... The idea that an app on an iPad can better teach my kids to read or do arithmetic, that’s ridiculous.”
Of course, there is one particular way in which computers could be highly effective teaching tools: for instruction in computer programming. But has there been a rise, or a decline, in computer programming instruction in the decades since schools began jumping on the edtech bandwagon? We might as well ask whether the pope is Jewish.

(Cross-posted at Out In Left Field)

Sunday, October 23, 2011

Definitions, Precision, Coherence

What's missing from today's school math, and particularly, from middle school math?

Definitions, precision, and coherence.

Without a proper introduction to definitions, students don't get clarity in what a math statement is and what it is not. They don't get a sense of the abstractness of it. Without having actual definitions, they don't learn to work with definitions, so they can never learn how to use them to derive new things that are true of "all" of a given set. Without definitions, they cannot learn to REASON about mathematics because they have no basis for reasoning.

Without precision, students cannot make clear, unambiguous statements. They are not able to properly manipulate the symbols they are given, and they can't even say what their manipulations refer to and what they do not refer to. (As Wu says, in the same way that we do not ask "Is he six feet tall?" without saying who "he" is, we do not write down xyzrstuvw = a + 2b + 3cdefghijklmn without first specifying what a, b, . . . , z stand for. This is an equality between what? Two random collections of symbols? What does it mean?)

Without coherence, math is a series of unrelated facts designed merely to trick students. Without coherence, math is just a test of how big your working memory can be when you can never integrate any understanding together. There is no notion that the results you get follow from other results. Reasoning can't exist without coherence.

The reason SAT math feels so "tricky" to so many students is that they were never taught math in a coherent, reasoned way, with definitions and precision. So the test seems to be just a set of tricks designed to "catch" you, as opposed to a test of how what you know relates to other things you already know. That is why it seems to just test working memory. That is why it seems to have so much associative interference. You were never taught that math was reasoned, with every piece of it following from the other pieces, a seamless whole that reinforces the same truths from a zillion different directions. And you were never taught what it meant.

summer boarding school and SAT prep

I talked to some friends I hadn't seen in a while last night. They told me that over the summer they sent one of their kids to a summer boarding program where he prepared for the SAT.

His mom said he gained 60 points on reading, 90 points on writing, and 180 points on math.

I'm pretty sure this is the program: Wolfeboro The Summer Boarding School.

Steve H on SAT math and math prep

from the comments:
SAT math tries to trick students. You could say that the tricks relate directly to whether or not they really understand math. However, when you add in the time constraints, it really relates to preparation. Is preparation the same as mastery? Yes. Mastery of the test. Is this equivalent to mastery of math or whether you will do well in college math? Not necessarily. There are better ways of determining that than with the limited material included on the SAT. Why not just require students to take the Achievement Test? Look at the AP Calculus grade.

What is it about SAT-Math that is so important? They are trying to test something other than just math knowledge. They think that these tricky questions reflect on how well you think on your feet, but what it really does is test preparation and whether you have seen these questions before. The questions don't reflect on whether you have a wide body of knowledge and skills in math.

They create problems where you have to "see" the shortcut. You get problems with hidden 3-4-5 triangles. Add a time constraint and then what do you call those problems? It's not just about math knowledge and skills. The problem has to do with trying to determine the difference between aptitude and preparation. The tricks may have some basis in meaningful math, but that's not what they are trying to test.

It reminds me of questions companies like to ask at job interviews, like "Why is a manhole cover round", and "How many golf courses are there in the US.?" Preparation can make you look like you have a great aptitude. Preparation is directly related to math knowledge, and that is important, but identifying aptitude is an arms race for something like the SAT. That's causing the tricky problems, not any desire to test a breadth and depth of math knowledge.

In Dick Feynman's books, he talks about how he spent a lot of time in high school learning about all sorts of trick, lateral thinking problems. He would challenge people to ask him questions. There is nothing like preparation to make you look like a genius, although he really didn't need help with that. It really annoyed some of his colleagues.

My son will get to calculus in his junior year and he always gets A's. He still has to prepare for SAT-Math. He can't let others, with specific SAT-Math preparation, seem like they have a better aptitude than him.
They try to trick students in most questions....What bothers me the most are the shortcut problems where using standard math techniques cause you to take too much time. This is supposed to identify aptitude, but it really tests preparation for the test.

There are also the problems where using a brute force or direct counting technique works better than any applied math technique. In some cases, there is no math to apply. One question on a sample PSAT test asked for the number of positive integers less than 1000 which don't have a '7' as one of the digits. (notice - "don't have" and positive integer) This simply checks how well you work under time pressure. Nobody expects you to apply any fancy math to this problem. One of the answers was the "have" solution. This tests preparation and practice, not aptitude or math ability. There may be a correlation between the test and aptitude or math ability, but not to the resolution colleges use it to select students. At the top levels, it correlates to preparation. That's not necessarily a bad thing, but there are better ways of figuring that out.
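Steve H's counting example is easy to check by exactly the brute force he describes. Here's a quick sketch (my own illustration, not from the test) that also surfaces the "have" distractor he mentions:

```python
# Positive integers below 1000 whose digits contain no '7':
# brute force, the way a test-taker might grind it out.
no_seven = [n for n in range(1, 1000) if '7' not in str(n)]
print(len(no_seven))        # 728  (the "don't have" answer)

# The tempting wrong answer counts the numbers that DO have a 7.
print(999 - len(no_seven))  # 271  (the "have" distractor)

# The counting shortcut: three digit slots, nine allowed digits each,
# minus the one all-zero string that isn't a positive integer.
print(9 ** 3 - 1)           # 728
```

No "fancy math" required, as Steve H says: direct counting and the slot shortcut agree, and the distractor is sitting right there for anyone who misreads "don't have."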

rat psych - "careless errors" in reading the SAT

During my year of living dangerously, doing SAT math prep off and on with C., I was chronically stunned by the number and type of "careless errors" he and I both made taking timed sections of the test. In particular, I made repeated errors of "simple" reading, particularly when I was tired or the room was hot. I made so many reading errors that when I finally took the real test, I had no way to predict my math score at all: no way to estimate how many reading errors I had -- or had not -- made.

I eventually came up with a theory of careless errors, the details of which I've forgotten at the moment. I do recall that it had to do with working memory. Arguably the SAT tests working memory above all: all 10 sections put you into working memory blowout. I experienced working memory blowout so often that I began to notice a connection. As far as I can tell, you make more careless errors when your working memory is overtaxed (and you hit the limits of working memory much more quickly when you're sleep-deprived or overheated).

I've just come across a new study that I think confirms my subjective experience:
This study resolves two long-standing debates in the field. Does our working memory function like slots, and after our four slots [emphasis added] are filled with objects we cannot take in any more; or does it function like a pool that can accept more than four objects, but as the pool fills the information about each object gets thinner? And is the capacity limit a failure of perception, or of memory? [emphasis added]

“Our study shows that both the slot and pool models are true,” says Miller. “The two hemispheres of the visual brain work like slots, but within each slot, it’s a pool. We also found that the bottleneck is not in the remembering, it is in the perceiving.” [emphasis added] That is, when the capacity for each slot is exceeded, the information does not get encoded very well. The neural recordings showed information about the objects being lost even as the monkeys were viewing them, not later as they were remembering what they had seen.
Picower: 1 Skull + 2 Brains = 4 Objects in Mind
Failures of working memory are failures of perception!

Subjectively, that's what I experienced taking practice sections; that's what it felt like. Once I hit a certain level of tiredness, or heat, or working memory blow-out, I stopped being able to read.

The same thing happens on the reading and writing sections, too. The reading and writing sections are so taxing that you reach points where you simply cannot take in what the sentence or paragraph before you says.* I'm not talking about losing the ability to answer questions about the sentence or paragraph.

I'm talking about losing the ability just to read the words on the page.

I'm a 10
rat psych: what to do about SAT math (part 1)
rat psych: what to do about SAT math (part 2)
rat psych: what to do about SAT math (part 3)
rat psych: careless reading errors on the SAT

* I say "you" because I know I am not alone in this.

rat psych - what to do about SAT math (part 3)

Your typical high school student, I presume, has spent several years setting up equations and solving for x. At least, let's hope so. I certainly did.

The SAT uses this fact to elicit many wrong answers from test-takers who have worked a problem correctly. The student gets the solution right but the answer wrong because the answer isn't x. The answer is 3x, say, or xy. I seem to recall a problem or two where the answer was -x, for god's sake, but I might be making that up.

Other times the test will give you a value for x + y, say, and you're supposed to see that you should simply insert that value some place else in the problem, et voilà: the answer they're looking for pops up.

Here's a typical problem, medium difficulty (according to the College Board):
If 4(x + y)(x - y) = 40 and (x - y) = 20, what is the value of x + y?
A kid who's had no test prep at all will likely miss this question -- either miss it outright or take too much time spotting the solution. Taking too much time leaves him too little time to finish the test, which increases the likelihood he'll make "careless errors" on the questions he does get to, because now he's working too fast trying to make up for the time he lost on the x + y problem.
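The intended shortcut is a single substitution. A sketch (my own illustration) contrasting it with the ingrained solve-for-x reflex:

```python
# Shortcut: treat (x + y) as the unit you're solving for.
# 4(x + y)(x - y) = 40 with (x - y) = 20 gives 4 * 20 * (x + y) = 40.
x_plus_y = 40 / (4 * 20)
print(x_plus_y)  # 0.5

# The solve-for-x reflex works too, but costs time:
# adding x - y = 20 and x + y = 0.5 gives 2x = 20.5.
x = (20 + 0.5) / 2   # x = 10.25
y = x - 20           # y = -9.75
print(x + y)         # 0.5 -- same answer, several steps later
```

Both routes land on 1/2; the difference is that the second one burns seconds the test doesn't give you.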

For what it's worth, I think using x + y as the value, instead of x or y alone, is an interesting and instructive way to write a problem. (I'm curious what math people think). It seems to me that writing problems in which x + y is the salient unit may be a way of teaching what Ron Aharoni calls the fifth fundamental operation of arithmetic:
In addition to the four classical operations, there is a fifth one that is even more fundamental and important. That is, forming a unit, taking a part of the world and declaring it to be the “whole.” This operation is at the base of much of the mathematics of elementary school. First of all, in counting, when you have another such unit you say you have “two,” and so on. The operation of multiplication is based on taking a set, declaring that this is the unit, and repeating it. The concept of a fraction starts from having a whole, from which parts are taken. The decimal system is based on gathering tens of objects into one unit called a “10,” then recursively repeating it.

The forming of a unit, and the assigning of a name to it, is something that has to be learned and stressed explicitly. I met children who, in fifth grade, knew how to find a quarter of a class of 20, but had difficulty understanding how to find “three-quarters” of the class, having missed the stage of the corresponding process of repeating a unit in multiplication. What I Learned in Elementary School by Ron Aharoni
Maybe I'm wrong, but it seems to me that the x+y questions test math as opposed to obedience under pressure, which is what the Find xy questions test.

Still, there is no doubt in my mind that these questions elicit wrong answers from test takers who know the math involved, can do the math involved, and have a reasonable understanding of the math involved. Students who have spent years of their lives solving for x aren't going to break the Solve for x habit for the first time ever when they're working at breakneck speed and their eyes are bleeding from the Ella Baker passage.

Which brings me back to extinction learning. Test prep for SAT math involves spending a fair amount of time building new habits that conflict with ingrained old habits. You've been conditioned to solve for x; now you have to condition yourself not to solve for x. Also, you have to build as much speed as possible at not solving for x because you are never going to forget solve-for-x. The two impulses are inside your head, competing with each other, and the competition takes time (and probably eats up some precious working memory resources to boot).

Funny thing: during the time we spent doing SAT math prep around here, I overlearned don't solve for x to the degree that a couple of weeks before taking the real test I came across a practice problem that did ask the test-taker to solve for x. I was so surprised that I wasted several seconds reading and re-reading and re-reading again to make sure I hadn't misunderstood. You can't win.

For parents: your child needs to spend enough time not solving for x that he or she gets to be really, really fast at not solving for x.

Then he should be on the lookout for problems that say Solve for -x.

I'm a 10
rat psych: what to do about SAT math (part 1)
rat psych: what to do about SAT math (part 2)
rat psych: what to do about SAT math (part 3)
rat psych: careless reading errors on the SAT