
Tuesday, November 1, 2011

math professor's experience coaching the SAT

The Effectiveness of SAT Coaching on Math SAT Scores
(pdf file) by Jack Kaplan
Chance Magazine | Volume 18, Number 2, 2005

Comment: Jack Kaplan's "A New Study of SAT Coaching" (pdf file)
By Derek C. Briggs

Here's the Fairtest write-up of Kaplan's coaching results:
Kaplan's course includes 20 hours of classroom instruction and seven to 12 hours of individual tutoring. Of the 34 previously uncoached students he worked with over six summers, 21 increased their Math scores by more than 80 points from a baseline SAT administration with 11 going up by 100 points or more. Average score increases were higher for students with lower initial scores. In a control group Kaplan studied, average scores increased by only 15 points during the same period. Thus, he concludes that "the estimated coaching effect was 63 points" on the Math component of the SAT.
Kaplan was coaching students for the version of the SAT math test given prior to the 2006 changes.
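
The write-up doesn't say exactly how the 63-point estimate was computed, so treat the following as a back-of-the-envelope reading rather than Kaplan's actual method. If the coaching effect were simply the difference in mean score gains between the coached and control groups, a 63-point effect on top of the control group's 15-point average rise would imply an average gain of roughly 78 points for the coached students. Here's a minimal Python sketch of that difference-in-gains arithmetic; the function name and the per-student gain lists are made up for illustration, not Kaplan's data.

```python
# Sketch of a simple difference-in-mean-gains estimate of a coaching effect.
# This is an assumption about how such a number *could* be formed, not a
# reconstruction of Kaplan's analysis.

def coaching_effect(coached_gains, control_gains):
    """Difference between the average score gain of coached students
    and the average score gain of uncoached (control) students."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(coached_gains) - mean(control_gains)

# Hypothetical per-student Math SAT gains, for illustration only.
coached = [100, 80, 90, 60, 70]
control = [20, 10, 0, 30, 15]

print(coaching_effect(coached, control))  # prints 65.0 for these made-up numbers
```

Under that simple reading, the reported control-group rise of 15 points plus a 63-point effect implies a coached-group average gain of about 78 points.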

6 comments:

Debbie Stier said...

Yikes. My Kaplan online course experience was terrible as well.

Cranberry said...

The comment by Derek C. Briggs is interesting.

Also intriguing that none of the coached student scores in Kaplan's charts reach 700. Increasing a student's score from 430 to 530 would be an accomplishment, but it's probably not what parents are hoping for when they pay $25K.

Avoiding score decreases seems to be a feature of Kaplan's tutoring. Only two of his coached students were flat or lost points, and only modestly (between -10 and +10), while 26 of the control students were flat or declining, with drops of up to 70 points. On the other hand, since many universities superscore the tests, avoiding decreases is not exciting.

Maybe 700 is a boundary of some sort.

Catherine Johnson said...

wait - cranberry - are you talking about Kaplan the test prep company or Kaplan the Quinnipiac guy - ? (I haven't read the Briggs article in - gosh, maybe two years. You jogged my memory today. I had never seen the full Kaplan article because it wasn't online then - haven't read yet.)

Cranberry said...

Kaplan the author of the Chance article. As I said, interesting graphs, but the sample's too small.

Anonymous said...

The online Kaplan course is a complete waste of money. And as soon as you have completed the online diagnostic test (which would be free if you had not signed up for the course), there are no refunds to be had. $150 for absolutely zero.

kcab said...

In thinking about SAT test prep and scores the other day, I realized that I've been assuming that teaching (hopefully correlated with years in school) should affect scores on the SAT or ACT. My daughter took the SAT as a middle school student, and I hope her scores rise when she takes it next as a junior! Although she did well, with one section in the 700's, the other two sections were right in the middle of the talent search pack, in the 500's. The talent search mean scores are in the high 400's & low 500's on each section. Since the talent searches pull in kids who score at the 95th percentile and up on state tests, I'd expect later SAT mean scores for the same set of kids to be somewhere in the 90th percentile and up, or at least well above the mean for juniors/seniors.

I looked for data on whether scores actually do rise from middle school to HS, and I didn't really find any. All I could turn up was anecdote on College Confidential and score differences between middle school years reported by the talent searches. For instance, last year's CTY SAT results show about a 40-point difference between 7th and 8th graders in each of reading and math (writing results aren't presented). (I'm looking at the results in CTY's document, which are annoyingly separated into male and female participants. The score difference is approximately the same for both.) Writing probably rises more with each year; Duke TIPS 7th grade score results show the average writing score as 410. (LOL, the 99th percentile for Duke TIPS writing scores extended all the way down to 600!) I found some numbers for a much smaller talent search, RMATS, that reported on 6th through 8th graders and showed ~50-point gains on each section for each year. Anyway, my point is that it looks like SAT scores go up with years in school, and that the change is not that different from some of the numbers reported for test prep.

Also wanted to say that one way the SAT shortchanges (very) highly intelligent kids is that their score doesn't have as much room to grow, so they may not be able to demonstrate how much they know. That is, a middle school kid who scores in the 700's is not going to have a 40-point gain per year of school. Whether that matters or not, I don't know.

OK, now to go find something else on which to waste time...the above probably seems totally pointless. I feel a little better though.