
Thursday, June 24, 2010

Did you celebrate National SAT Day?

Because according to this new SAT blog by Sol Lederman, it was yesterday, June 23rd. He quotes Wikipedia:
The first administration of the SAT occurred on June 23, 1926, when it was known as the Scholastic Aptitude Test.[19][20] This test, prepared by a committee headed by Princeton psychologist Carl Campbell Brigham, had sections of definitions, arithmetic, classification, artificial language, antonyms, number series, analogies, logical inference, and paragraph reading. It was administered to over 8,000 students at over 300 test centers. Men composed 60% of the test-takers. Slightly over a quarter of males and females applied to Yale University and Smith College.[20] The test was paced rather quickly, test-takers being given only a little over 90 minutes to answer 315 questions.[19]
I'm guessing that there were no SAT tutors then.

12 comments:

Bostonian said...

As the Gene Expression blog put it, the SAT is a de facto IQ test.

"SAT measures more than student performance, research shows it is also a reliable measure of IQ
Each year thousands of high school students take the Scholastic Assessment Test, or SAT, hoping to gain admission to the college of their choice. Colleges and universities use SAT scores to help project a prospective student's performance. But research shows there is more to the SAT, that it is really an intelligence test.

Meredith C. Frey and Douglas K. Detterman, researchers at Case Western Reserve University, have shown that students' SAT test scores correlate as highly as, and sometimes higher than, IQ tests correlate with each other. This is strong evidence that the SAT is a de facto intelligence test. Their findings will be published in the June issue of Psychological Science, a journal of the American Psychological Society.

While this finding may be surprising to many who take the test, it was no surprise to the researchers. The origins of the SAT can be traced back to intelligence tests that were originally given to screen entrants into the armed forces. Many who study intelligence had suspected that the SAT was an intelligence test though it seems no one had ever investigated the relationship.

The Case investigators studied the SAT for two reasons. First, they were looking for an easy way to obtain a measure of IQ for students who participate in their experiments on more basic cognitive processes. Giving an IQ test can take 30 to 90 minutes, and with a correlation between IQ and SAT scores, researchers now have a fairly accurate estimate of an individual's IQ without the need to administer a lengthy test. Second, it is useful to know the relationship between the SAT and IQ so that SAT could be used as a measure of IQ in cases where patients' IQs decline due to head injury or diseases like Alzheimer's. It is often important to know what a person's level of intellectual functioning was before the onset of the decline and many people have taken the SAT. According to the researchers, for those who have never taken an IQ test, the SAT could be used as a substitute."
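
To make the kind of SAT-to-IQ conversion the researchers describe concrete, here is a minimal Python sketch of fitting a linear mapping on paired scores. The numbers are invented for illustration; Frey and Detterman published their own conversion equations, which this does not reproduce.

import numpy as np

# Hypothetical (invented) pairs of combined SAT scores and IQ-test scores.
sat = np.array([900, 1050, 1150, 1250, 1350, 1450, 1550])
iq = np.array([97, 105, 111, 117, 123, 128, 134])

# Pearson correlation: the statistic the study reports for SAT vs. IQ tests.
r = np.corrcoef(sat, iq)[0, 1]

# Least-squares line: estimated_IQ = slope * SAT + intercept.
slope, intercept = np.polyfit(sat, iq, 1)

print(f"correlation r = {r:.3f}")
print(f"estimated IQ at SAT 1200: {slope * 1200 + intercept:.1f}")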

Anonymous said...

If the SAT correlates more closely with "IQ" than "IQ tests" correlate with each other, chances are good that there isn't quite as good a definition/measure of IQ as a lot of people pretend.

There's a huge disparity between "SAT is an IQ test" and the SAT prep classes that can fairly reliably increase scores in a short amount of time, as IQ is supposed to be inherent and not subject to dramatic changes.

The other option is that IQ - especially when used as a gatekeeper criterion for determining whether students should be allowed to take specific classes - is mostly meaningless.

Anonymous said...

"If SAT correlates closer with 'IQ' than 'IQ tests' correlate with each other, chances are good that there isn't quite as good a definition/measure of IQ as a lot of people pretend."

Quite possible. It kinda depends on how many people is "a lot." Additionally, some IQ tests are worse than others.

What *does* seem to be true is that the SAT verbal does function as a crude IQ test. Much more so than the SAT math test. I might be so snarky as to add that it isn't quite as poor a definition/measure of IQ as a lot of people pretend :-)

"...IQ is supposed to be inherent and not subject to dramatic changes."

Mostly. But this doesn't mean that all IQ tests are a perfect measure of the underlying IQ. The *test* can be gameable, even if the underlying IQ doesn't change. In which case the test is flawed, but we might not be able to fix it (or we might be able to but not want to).

"The other option is that IQ - especially when used as a gatekeeper criteria for determining whether students should be allowed to take specific classes - is mostly meaningless."

This is certainly possible. One thing I would like to see done sometime would be for MIT to admit 50 students/year with Math+Verbal SAT scores around 800/500 and see how those students do. I'd also like to see Harvard/Princeton/Yale/Stanford do the same with scores of M/V:500/500. Do this for ten years and see if these 50 students/year perform the same as the students with the much higher verbal SAT scores.

-Mark Roulo

Catherine Johnson said...

What *does* seem to be true is that the SAT verbal does function as a crude IQ test. Much more so than the SAT math test.

It's a good thing, too, seeing as how I'm Verbally outscoring my Math Self by a good 200 points.

Catherine Johnson said...

I don't remember whether I've posted this article:

"The Test Passes, Colleges Fail" by Peter D. Salins

First, Stony Brook and Albany, both research universities: over four years, at Stony Brook the average entering freshman SAT score went up 7.9 percent, to 1164, and the graduation rate rose by 10 percent; meanwhile, Albany’s average freshman SAT score increased by only 1.3 percent and its graduation rate fell by 2.7 percent, to 64 percent.

[snip]

Clearly, we find that among a group of SUNY campuses with very different missions and admissions standards, and at which the high school grade-point averages of enrolling freshmen improved by the same modest amount (about 2 percent to 4 percent), only those campuses whose incoming students’ SAT scores improved substantially saw gains in graduation rates.
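
As a quick arithmetic check on those figures (assuming the quoted percent changes are relative changes rather than percentage points, which the article doesn't specify), the implied baselines work out like this in Python:

final_sat_stony_brook = 1164          # "went up 7.9 percent, to 1164"
baseline_sat = final_sat_stony_brook / 1.079
print(round(baseline_sat))            # ~1079

final_grad_albany = 0.64              # "fell by 2.7 percent, to 64 percent"
baseline_grad = final_grad_albany / (1 - 0.027)
print(f"{baseline_grad:.1%}")         # ~65.8%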

ChemProf said...

It is also pretty well established that SAT math scores above 650 correlate with completing a math/science/engineering major, even with low verbal scores (for your experiment, you might have to specify that this excludes ESL students, who may have awful verbal SATs).

Bostonian said...

Anonymous wrote:

'There's a huge disparity between "SAT is an IQ test" and the SAT prep classes that can fairly reliably increase scores in a short amount of time, as IQ is supposed to be inherent and not subject to dramatic changes.'

Do prep classes reliably increase scores by substantial amounts? Princeton Review has been forced to abandon such claims, according to a 2010 Associated Press article, "College Board: SAT courses have little effect":

'Last year, the National Association for College Admission Counseling released a report concluding that test prep courses have minimal impact in improving SAT scores — about 10-20 points on average in mathematics and 5-10 points in critical reading. The NACAC report noted that this evidence is "contrary to the claims made by many test preparation providers of large increases of 100 points or more on the SAT."

Kathleen Steinberg, a spokeswoman for the College Board, says that on average, students who take the SAT test twice only "increase their scores by about 30 points."

She added that "the College Board does not recommend taking the SAT more than twice, as there is no evidence to indicate that taking the exam more than twice increases score performance."

Parents might also be surprised by actual average SAT scores: 501 in critical reading, 515 in math and 493 in writing, according to Steinberg. (The highest score you can get on any section is 800.)

Kaplan charged that The Princeton Review's claims of score jumps were based on comparing the results of Princeton Review "diagnostic" tests with students' self-reported scores on actual SAT exams, as opposed to actual before-and-after SAT scores.'

Bostonian said...

I think the report that the excerpted A.P. article refers to is "Preparation for College Admission Exams," by Derek C. Briggs. The press release is here.

Lisa said...

Sincerely hoping it is the verbal scores that matter and not the math, since the disparity between mine was embarrassing.

ChemProf said...

I went back and looked at my copy of "The Big Test," which is a history of the SAT. The SAT was actually designed as an IQ test (which is why it included the dread analogies). When Kaplan started teaching SAT prep classes (mostly for working-class Jews), the test's administrators thought he was teaching cheating methods, because they didn't think you could raise your score that much. However, it turns out he was mostly teaching vocabulary, since the words used on the SAT weren't familiar to his clientele.

These days, when most college-bound students know what to expect on the SAT, I'm not sure how much prep courses help versus taking other practice tests. You do need to worry about things like the Princeton Review showing improvement versus their own test, since they write the initial exam and so could make it harder than the real SAT (which is the rumor about Barron's prep book, for example).
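
A toy simulation makes the before/after problem concrete: if the "before" diagnostic runs harder than the real SAT, the apparent gain absorbs the difficulty gap. All the numbers below are invented.

import random

random.seed(0)
true_gain = 20           # assume prep genuinely adds ~20 points
difficulty_gap = 60      # assume the in-house diagnostic scores ~60 points low

abilities = [random.gauss(550, 80) for _ in range(1000)]  # true section scores
before = [a - difficulty_gap for a in abilities]   # harder in-house diagnostic
after = [a + true_gain for a in abilities]         # real SAT after prep

avg_gain = sum(x - y for x, y in zip(after, before)) / len(abilities)
print(f"apparent average gain: {avg_gain:.0f} points")  # ~80, not 20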

lgm said...

ChemProf, does a 650 mean mastery of math up to pre-calc, so the student is ready to begin the calc sequence and thus able to finish a math/sci/eng degree in a reasonable time frame?

What does the math SAT distribution look like when split by highest level course completed successfully before taking the SAT?

ChemProf said...

Since I am never sure what pre-calc really means -- it seems to vary enormously from place to place -- I doubt it. Even the new SAT only goes up to Algebra II. I don't have any specific information on where that 650 number comes from though; it is more a general wisdom thing than something I can cite.

If I had to guess, I'd say it is a marker for mastery in general. If you map percentage scores onto SAT scores, a 650 is about 80% correct (really a bit higher, as you have to remember they subtract 0.25 points for each wrong answer). To do that, you need to be able to answer essentially every question in about a minute.
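
In Python, the raw-score arithmetic behind that parenthetical looks roughly like this (assuming a 54-question math section, the pre-2016 format; the exact raw-to-scaled conversion varied by administration):

def raw_score(correct, wrong):
    # Quarter-point guessing penalty; omitted questions score zero.
    return correct - 0.25 * wrong

total = 54
correct, wrong = 44, 10
print(f"{correct / total:.0%} of questions right")     # 81%
print(f"{raw_score(correct, wrong) / total:.0%} raw")  # 77% of max raw score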

That means reading the question quickly and efficiently, and having your strategy in hand before you finish reading the question.

The sat-tutors-blog page I've linked below is a good example of what not to do. The problem is:

An integer x is multiplied by 3 and the result is decreased by 3. This result is divided by 3. Finally, that result is increased by 3. In terms of x, what is the final result?
A. x + 3
B. x - 3
C. x + 2
D. 3x - 3
E. x

The blog suggests solving this by plugging a number in for x and then checking each answer choice to see which agrees, but that's too slow! What you need to do is quickly translate the problem into algebra. Using a calculator is also not great, because typing things in slows you down. I'd say that's what the SAT is really testing for here -- speed and fluency.
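
For reference, the quick translation works out to ((3x - 3) / 3) + 3 = (x - 1) + 3 = x + 2, which is answer (C).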

http://www.sat-tutors-blog.com/2009/02/11/sat-math-when-to-plug-in-numbers-ii-w-examples/