Yeah, don't get excited by assessment. Most college assessment is crap (and I say this as I look ahead to creating my annual departmental assessment documents, which are a lot of language dressing up the fact that we gave exams in a chemistry department -- I know you are all shocked!)
We have all learned how to construct rubrics, though.
I'd prefer subject-based tests to general tests of thinking skills such as the Collegiate Learning Assessment. Why not encourage all students to take the GRE test in their major, if it is available, and collect average scores by school?
The current GRE subjects available are
Biochemistry, Cell and Molecular Biology
Biology
Chemistry
Computer Science
Literature in English
Mathematics
Physics
Psychology
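A minimal sketch of the aggregation being proposed here -- average GRE subject scores by school -- in Python, assuming a hypothetical CSV of individual results; the file name and the columns (school, subject, score) are invented for illustration:

# Toy sketch: average GRE subject scores by school.
# "gre_subject_scores.csv" and its columns (school, subject, score) are hypothetical.
import csv
from collections import defaultdict

totals = defaultdict(lambda: [0.0, 0])  # (school, subject) -> [sum of scores, count]

with open("gre_subject_scores.csv", newline="") as f:
    for row in csv.DictReader(f):
        key = (row["school"], row["subject"])
        totals[key][0] += float(row["score"])
        totals[key][1] += 1

for (school, subject), (total, count) in sorted(totals.items()):
    print(f"{school:30} {subject:25} mean={total / count:.1f}  n={count}")

The arithmetic is trivial; the hard part would be getting representative data in the first place.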
I've wondered about the seemingly small growth in "critical thinking" skills. It is possible that college is for content knowledge, and rests upon cognitive and analytical skills that have to be mastered by age 18 (in a broad sense).
No, cognitive and analytical skills don't have to be "mastered" by age 18! Those are skills which improve with use and CONTINUED addition of content knowledge.
As students get past 18, it is hoped that they have developed the skill of integrating newly learned content into the knowledge they already have. Instead of holding discrete packets of knowledge, they are able to begin to see how those packets combine and interact.
Content and practice come first; cognition, analysis, and then deeper understanding grow from those two.
Walk this way!
http://www.apt11d.com/2012/04/testing-colleges.html
After studying the online reaction to the Arum and Roksa book, I'm amazed at the different angles and interpretations. Some think this is another move towards the evils of testing and cranking out more rote cogs in a big capitalistic industrial mill. Some complain about how kids go into huge debt without seeing any economic results. Some want another excuse to bash "Neocons."
The problem with Brooks' column is that he needs to do more homework. He doesn't seem to understand the issues of K-12 education and the argument over different types of assessment.
"Some schools like Bowling Green and Portland State are doing portfolio assessments — which measure the quality of student papers and improvement over time. Some, like Worcester Polytechnic Institute and Southern Illinois University Edwardsville, use capstone assessment, creating a culminating project in which the students display their skills in a way that can be compared and measured."
WPI's measures of merit will have no correlation with those of Bowling Green. The goal is to compare schools, not students.
You could start by looking at the average incoming SAT scores. Where I taught college, I could never teach at the same level as the University I went to. You just can't flunk everyone. I remember one part-time teacher who did this. He taught no more classes. This doesn't mean, however, that the college does a good job of adding value.
Almost all colleges test and grade. Apparently these tests are flawed compared to the ones used by Arum and Roksa. That seems to be the problem. It's not an issue of whether to test or not. They are claiming that colleges are incompetent, that their testing methods are wrong, or that they grade too easily.
Assuming that you are concerned about a future career and job prospects, and assuming that you don't think that that is incompatible with a proper liberal arts education, then there are other measures that can be used. Bostonian suggests the GRE. How about a list showing where the undergraduates go to graduate school? How about numbers showing how many students end up with jobs in their field of study? How about showing their starting salaries? Show which companies prefer the graduates from each department.
For K-12, use the SAT, ACT, and/or AP scores, not the silly portfolios and senior exhibition that our state requires. Why use CCSS when you already have the SAT and Accuplacer?
"Meanwhile, according to surveys of employers, only a quarter of college graduates have the writing and thinking skills necessary to do their jobs."
What assessment will fix this problem? How do you even define the problem? How do you define this level for high school graduates versus college graduates? How do assessments translate back into curriculum and expectations? When our middle school got results back from the state showing poor "problem solving" results, the committee decided that the school should spend more time on problem solving.
"There has to be a better way to get data so schools themselves can figure out how they’re doing in comparison with their peers."
Based on what, some vague idea of learning, or based on keeping career doors open? College doors are defined by GPA and SAT, not state CCSS scores. Post-college doors are defined by things like a college/department's rank or reputation, GPA, and tests like the GRE. Assessments already exist. Why did the CCSS people have to do their own "workplace analysis" where one size fits all?
"If you’ve got a student at or applying to college, ask the administrators these questions: “How much do students here learn? How do you know?”"
How about asking for the numbers on freshman retention, graduation rate, and the percent who get jobs in their fields of study within a year of graduation?
"How about asking for the numbers on freshman retention, graduation rate, and the percent who get jobs in their fields of study within a year of graduation?"
That data we have, except for jobs in the field of study. Freshman retention and graduation rate are part of the US News ranking, for example. Placement information is really hard to get and really easy to game -- students who are successful fill out the reply forms and those who aren't don't, so it is easy to create inflated data. Look at law schools, which are now being criticized for just that, as well as for hiring graduates to work for the law school.
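To make the gaming concrete, here is a toy calculation (every number below is invented) showing how nothing more than who bothers to answer the survey can inflate a reported placement rate:

# Toy illustration of response bias in placement surveys (all numbers invented).
graduates = 1000
employed_in_field = 550            # true in-field placement rate: 55%

# Suppose employed graduates return the survey far more often than the rest.
response_rate_employed = 0.80
response_rate_other = 0.25

responded_employed = employed_in_field * response_rate_employed            # 440
responded_other = (graduates - employed_in_field) * response_rate_other    # 112.5

reported_rate = responded_employed / (responded_employed + responded_other)
print(f"True placement rate:  {employed_in_field / graduates:.0%}")   # 55%
print(f"Survey-reported rate: {reported_rate:.0%}")                   # ~80%

Nobody has to lie; the self-selection does all the work.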
But even first-to-second-year retention rate is a flawed metric. Top liberal arts colleges keep almost all of their admits, but even there women's colleges suffer because they don't have the same retention rate -- some young women decide it is the wrong environment and leave, but a student admitted to Smith or Bryn Mawr almost certainly graduates from another school if she leaves. My own school admits a lot of first-generation students, and we have quite a few who stay a year and decide they can't afford a private school. Those who leave don't necessarily drop out. A lot transfer to less expensive state schools.
That doesn't mean that freshman retention isn't a useful metric -- certainly combined with incoming SAT scores, it tells you a lot -- but it isn't quite as cut and dried as it might seem.
I know that any number considered to be important will be gamed. However, students and parents have to play the same game as the colleges; try to find some nationally-calibrated numbers and then add in their own judgment. At each level, it may be hard to tell schools apart, but you can't create one number that gives you the answer. There are already many numbers available, and I don't think Brooks' examples will help.
Then there are the differences between departments, which can be more important. If you don't know what degree you are interested in, then probably the best route is to go to a local CC or university to figure it out. It's often easier to transfer into a school than it is to get in as a freshman.
So what is Brooks' problem?
ReplyDelete"... but it’s not clear how much actual benefit they are providing. Colleges are supposed to produce learning."
They are supposed to provide a path to a career - especially if you start bringing cost/benefit into the picture. This points to a different metric - by department.
However, vague tidbits like this are thrown out.
"Meanwhile, according to surveys of employers, only a quarter of college graduates have the writing and thinking skills necessary to do their jobs."
Will they do the hard work to break this down into details? I'll wager that the problem really resides in K-12 - assuming that they ever decide to define the problem.
Our high school saw many kids with basic math problems. Was their solution to fix the problems in K-8? No. They added a skills lab to one of the algebra classes.
SteveH writes that colleges "are supposed to provide a path to a career". What would he do with history departments and history majors? The only career a B.A. in history directly prepares you for is that of history professor, and only if you get a PhD in history.
History departments should focus on teaching history well, and how a B.A. in history fits into a future career is for the history majors to figure out. And there are lots more history PhDs produced each year than history professorships.
I think student lending should be privatized, and if lenders think history majors are unemployable, they won't fund their studies.
Even in a more practical department such as computer science, I would not expect the professors to often change the curriculum to fit current IT hiring trends.
What I'm complaining about is an article that claims to speak to all students, all departments, and all situations. To say that colleges don't get the job done and then refer to vague things like surveys of what employers want is not very clear or helpful. What do they want to measure, level of learning or preparation to make employers happy?
There have always been students who go into fields with fewer opportunities; some with their eyes wide open and some without. What's new is the drive to push all students into college and the willingness for some colleges to accept almost anyone.
SteveH wrote:
ReplyDelete"What's new is the drive to push all students into college and the willingness for some colleges to accept almost anyone."
I agree this is a bad trend, but it has been growing for decades. Jackson Toby wrote a book about it, "The Lowering of Higher Education in America: Why Financial Aid Should Be Based on Student Performance", which I recommend.
chemprof wrote: Most college assessment is crap
I am DESPERATE for decent assessments ... I **think** I'm desperate for an e-assessment program for paragraphs and essays, given the study that just came out showing that software assessments of writing come up with the same scores human graders do.
Obviously, I would carry on reading my students' work, but I would **love** to have two things:
a) a software 'check' on my reading
b) a way to assign more writing & give more feedback without cloning myself
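For what it's worth, a crude sketch of the sort of surface-level software "check" I mean, in Python; a real essay-scoring engine is far more sophisticated than this, and the features, thresholds, and example sentence below are invented:

# Very crude surface-feature "check" for an essay -- not a real scoring engine.
# The filler-word list and the example text are invented for illustration.
import re

FILLERS = {"very", "really", "basically", "thing", "stuff"}

def essay_check(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())
    return {
        "word_count": len(words),
        "sentence_count": len(sentences),
        "avg_sentence_length": round(len(words) / max(len(sentences), 1), 1),
        "filler_share": round(sum(w in FILLERS for w in words) / max(len(words), 1), 3),
    }

print(essay_check("Basically, the essay was very short. Really very short stuff."))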
I have practically no way to tell if my students are getting better at writing. (That's a curriculum issue, too, obviously.)
I like the writing course I'm teaching; I like the design and the thinking behind it.
BUT I need a set of coherent, sequential exercises I can assign that will let me assess progress -- or, failing that, I need some kind of assessment system that gives me the same information.
I've been slowly but surely writing the exercises, which I base on Whimbey's work, along with Killgallon, Vande Kopple, & Kolln.
SteveH writes that colleges "are supposed to provide a path to a career". What would he do with history departments and history majors? The only career a B.A. in history directly prepares you for is that of history professor, and only if you get a PhD in history.
That's not exactly true.
As far as I can tell, (some) employers tend to see liberal arts majors as having analytical and writing skills along with some kind of 'broad' understanding of .... how we got where we are(?)
Employers tell Ed that history majors know how to think!
(Ed has always argued that's what a liberal arts major gives you: a trained mind.)
That's the great irony of education schools. They've been trying to kill the liberal arts for 100 years now, and if there is any course of study that actually **does** teach a student to THINK, arguably it's the liberal arts.
We have all learned how to construct rubrics, though.
oh, man
re: rubrics
I have to say ... I don't find rubrics particularly easy to use ... BUT, that said, I finally got my hands on the exact rubric used to score exit exams in my department (5-paragraph essays).
It's quite helpful, partly because it actually does describe, reasonably closely, how full-time faculty members score the exams.
Beyond that, though, it's a pretty decent list of what goes into a brief essay.
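Roughly, a rubric like that boils down to weighted criteria. A toy version in Python -- the criteria and weights below are made up for illustration, not the department's actual rubric:

# Hypothetical five-paragraph-essay rubric: each criterion rated 0-4, then weighted.
# Criteria and weights are invented, not an actual departmental rubric.
RUBRIC = {
    "thesis_and_focus":     0.25,
    "organization":         0.25,
    "support_and_evidence": 0.20,
    "sentence_control":     0.20,
    "mechanics":            0.10,
}

def score_essay(ratings):
    """Weighted average of 0-4 ratings, returned on the same 0-4 scale."""
    return sum(RUBRIC[criterion] * ratings[criterion] for criterion in RUBRIC)

print(score_essay({
    "thesis_and_focus": 3,
    "organization": 2,
    "support_and_evidence": 3,
    "sentence_control": 2,
    "mechanics": 4,
}))  # 2.65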
I have NO idea how to grade papers. I'm teaching a 'developmental' writing course - so what grade should I be giving?
I need some way to turn my grading into a Keller plan, which is what I'm thinking about now.
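The heart of a Keller plan is mastery gating -- a student only moves to the next unit after demonstrating mastery of the current one, with retakes until they do. A minimal sketch in Python, with the unit names and the 90% threshold as invented stand-ins:

# Minimal sketch of Keller-plan (mastery-based) progression.
# Unit names and the 90% mastery threshold are illustrative, not a real syllabus.
MASTERY_THRESHOLD = 0.90
UNITS = ["sentence combining", "paragraph structure", "thesis statements", "full essay"]

def next_unit(unit_scores):
    """Return the first unit the student hasn't mastered yet, or None if all are done."""
    for unit in UNITS:
        if unit_scores.get(unit, 0.0) < MASTERY_THRESHOLD:
            return unit
    return None

student = {"sentence combining": 0.95, "paragraph structure": 0.82}
print(next_unit(student))  # "paragraph structure" -- retake before moving on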