To teach a class called AP anything, you have to have your syllabus approved by the College Board. They want to ensure that you are covering the content that they require. This already annoys me: let my students take the test and see how they do. Their results will indicate whether I am covering the material or not.
But it's their name so it's their rules. I submit my curriculum. First try, it comes back rejected: I have not provided evidence that my course is "student centered" and that I use "guided inquiry" to develop "critical thinking skills."
So to use the AP label, you give the College Board authority to define what you teach and how you teach it.
Fortunately (as I ranted to my students), educationists use undefined buzzwords that can mean anything we want them to. So I announced that from this day forward, my class is now 100% student-centered. Can you smell the fresh clean scent? We then continued with the lesson as planned...
We cannot make "critical thinking" the goal. Whatever critical thinking may mean (and I don't really know), we had better hope that it emerges as a result of the careful work we do teaching our specific subjects. But the goal should always be to teach the subjects!
When you make "critical thinking" the goal, then someone is going to say: so math/history/French/science is not that important -- let's just teach them how to think critically. What follows from that is nearly always some inane, time-wasting idea for an "activity" that is disconnected from reality and certainly disconnected from the subject I thought we were trying to teach.
If the College Board is now simply an outpost of Teachers College, it should say so, out loud.
Is this the work of David Coleman?
And is the ACT the last man standing?
7 comments:
We are currently forcing through some changes for my HFA middle schooler (it looks like we may actually be getting some kind of direct instruction in reading and writing--in Westchester!--for him, though it still needs to be officially approved). The notes from his recent CSE meeting describe how he is adjusting poorly to the new ELA model: they are now doing a "writer's workshop" approach instead of "reader's and writer's workshop."
So they are not even pretending to teach reading to middle schoolers anymore.
It really is 100% "student-centered": they all read YA books and are supposed to write something about them. Not even a pretense of group instruction, vocab, anything. I am floored by how brazen it has become.
They also claim he "learns best" from watching movies (which apparently are shown frequently in Science and Social Studies), because he looks engaged and picks things up from them. I think he learns best from instruction, and movies are the only instruction he's currently getting.
The AP exam is still the AP exam—there is no requirement that students take an AP course in order to take and do well on the exam. The College Board started auditing AP courses, because many colleges were giving a bonus grade point to any course labeled "AP", even though many of them were in no way advanced.
Once you introduce auditing, it is hard to keep the educrats out of the process, so delivery methods start getting dictated, even if only content and level were originally intended to be monitored.
As for "critical thinking", I have a blog post about that which readers here may find interesting:
http://gasstationwithoutpumps.wordpress.com/2014/10/26/critical-thinking/
"If the College Board is now simply an outpost of Teachers College, it should say so, out loud. Is this the work of David Coleman? And is the ACT the last man standing?"
Just asking?
I have lots of questions about how the ACT and College Board are trying to reconcile their conflict between the low expectations and money trough of CC and the money trough of colleges using existing calibrations of SAT, ACT, and AP for admission and placement. They can't please both ends. CC does not define a curriculum path for the upper half of the bell curve, but somehow the full bell curve has to be magically scored and continuity provided between K-6 and high school SAT and ACT tests.
The ACT is playing the same game: it's trying to make money off providing CC testing in the lower grades while somehow defining a calibrated scoring path up to the college-level ACT. I saw their attempt at this somewhere (I will have to look it up), but the transition in scoring seemed very nonlinear. Will a middle score on the K-8 (Common Core) ACT fall in the same middle of the curve (same number) as on the college one, or will they adjust it because the distribution of lower-grade test takers is not the same as the smaller pool taking the college-level test? Or will the middle always be defined as the middle, meaning that as parents and tutors help many kids, the middle will move up? Schools and the CC will take credit for this.
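To make that calibration question concrete, here is a toy, percentile-based linking sketch in Python. The two score distributions, the sample sizes, and the crude equipercentile-style link are all assumptions for illustration; this is not ACT's actual populations or equating procedure.

```python
import numpy as np

# Toy illustration of the calibration question above: the same score can sit
# at very different points of the curve depending on which population defines
# the scale. Everything here (distributions, sizes, linking method) is an
# assumption for illustration, not ACT's actual equating procedure.

rng = np.random.default_rng(0)

# Hypothetical populations: everyone takes the K-8 test, while a smaller,
# self-selected (and stronger) group takes the college-level test.
k8_scores = rng.normal(loc=50, scale=15, size=100_000)
college_scores = rng.normal(loc=60, scale=12, size=20_000)

def equipercentile_link(score, from_dist, to_dist):
    """Map a score to the value at the same percentile of the other distribution."""
    pct = (from_dist < score).mean() * 100
    return float(np.percentile(to_dist, pct))

# A dead-middle K-8 score of 50 is only ~20th percentile among college-test
# takers, so a percentile-preserving link sends it to a different number (~60)
# rather than keeping the "middle" at the same score:
print((college_scores < 50).mean())                        # ~0.20 with these assumptions
print(equipercentile_link(50, k8_scores, college_scores))  # ~60 with these assumptions
```

Whether the middle is defined by the same number or the same percentile is exactly the choice that determines whether the "middle moves up" as more kids get coached.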
The ACT and the College Board always run the risk of losing credibility with colleges, and therefore with students. Nobody will take the ACT, SAT, or AP classes if they become less meaningful for colleges.
However, many colleges want to see fine distinctions of academics at the upper end of the bell curve. With the higher college demand, many second and third tier colleges are playing the SAT and ACT top 25% game. College admission is not completely holistic. There is the risk that colleges could dump ACT and SAT and just rely on high school GPAs and feedback on the rigor of high schools from their local admissions reps. Many schools have this knowledge already. This is what the SAT and ACT should fear. If they lose their distinctions at the upper end (they are pretty close now), then colleges will just use GPA and holistic criteria. Not only will ACT and the College Board lose the colleges, they will see their lower CC end disintegrate.
For many students, NCLB, and now CC, are meaningless. Students and parents only care about surviving K-6, getting into honors and AP classes in high school, and then doing well on the SAT or the ACT. It's not about bridging some sort of gap between CC and the SAT, it's about ignoring CC from the start.
Ultimately, college admissions drives students, not CC. Parents and students may wander in the desert for K-8, but the rules of the game are clearly laid out by the time they get to high school, and the rules do not include CC.
Does that teacher have any more details? Have the questions on the tests changed, or is this just a way to push a teaching process? One can still test for content knowledge while pushing for it to be taught in a particular way. It's one thing to play the audit game and another if the questions on the tests have changed.
The problem with trying to test critical thinking and understanding on a standardized test is that it's very difficult to do. There is too much room for confusion in the questions. I've seen this issue over the years in many of my classes. The teacher asks a leading question in the hope that students will see the key governing idea or equation. (Never mind that if only one or two "get it", the teacher is happy about the process.) Many times, however, the teacher has to keep giving more and more clues until he/she virtually gives away the answer. Do the students not understand, or does the teacher expect them to make connections that seem obvious from his/her experienced viewpoint?
I learned a lot about linear algebra in school, but realized much later, when I had to teach the class, that I understood little about linear spaces. Understanding is a funny thing, and pedagogues' anal attempts at forcing a top-down "inquiry" approach (do they no longer expect students to achieve a "discovery" or light-bulb effect?) are, ironically, a very shallow form of critical thinking.
Perhaps a teacher can, after following a student over a semester, get a feel for whether he/she understands the material, but it's not reasonable to expect to do that with one standardized test.
This is a general problem with the understanding and critical thinking crowd: the belief that one can test for critical thinking and, worse, that you can provide feedback for how to fix it. I remember one parent/teacher meeting years ago where we analyzed results from our state's NCLB yearly test. It said that our "problem solving" score went down. Their solution was to work harder on problem solving. That was the end of my involvement in these meetings. It would not be helpful for me to say that their process (school and state) is fundamentally flawed. It would provide more useful feedback to just test how students do on the skills portion of the balance that they claim to love so much. The teachers in class can make the subjective evaluation of critical thinking and how to fix it; or are they just potted plants who have to wait a year for standardized tests to give them vague feedback?
In my particular area, the questions have not changed. And it would be hard to come up with questions that specifically measure whether a student had been taught by guide-on-the-sidery or not (even if you were perverse enough to want to).
In math, you can ask questions that are constructed to require facility with a TI-89. In science, you can construct free-response items that require a little bit of experimental design. But you can't detect how that knowledge was acquired.
And they can't make the test harder. The median on the free responses is already low enough so that there is plenty of room to distinguish among the test takers. But here is something that I wish they would do: there must be plenty of schools out there whose students consistently beat that median by a large margin. Have they identified those particular schools and examined what methods they are using? Adjust for socio-economic factors and identify who is doing this better. Make your case before you require a particular method for all.
I know one of the complaints about AP is that it is all about memorizing content at a surface level. But I always assumed some amount of writing was required in the non-STEM AP courses, and that writing would be a key part of the exam. Is that true? The reason I am wondering is that one of my students, during advising, mentioned that her AP credits in English and history were not showing up correctly on her advising record. We tend not to get many students with AP coursework. In this particular case, though, I knew the student well - had her in two courses. She is an abysmal writer who can't even put a coherent sentence together, let alone a full paper. She must have passed the tests if she has credit. How? It didn't make me think well of AP in those areas.
When I took the AP English and history exams, my experience was that acing the multiple choice would offset mediocre essays to an extent. For some reason I could not wrap my brain around the type of analysis required for the AP English Language exam (although I had no trouble with the AP English Lit exam), and my practice essays (as scored by my teacher, who had been an official grader) generally came in at a 4 or 5 on the 9-point scale. But I still managed a 4 on the overall exam, because I could ace the multiple-choice section, and so the 5-worthy MC and the 3-worthy essays averaged out to a 4.
If she just had a 3 (the minimum passing score), then her essays could have been pretty bad and a really good MC section could still have made up for it. I don't know how common that is, but if she was solid on the material but crap at writing, then it's possible.
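To make the arithmetic of that trade-off concrete, here is a back-of-the-envelope sketch of a weighted composite. The section weights and cut scores are made-up placeholders for illustration, not the College Board's actual scoring tables (which vary by exam and year).

```python
# Back-of-the-envelope sketch: a strong multiple-choice section offsetting
# weak essays in a weighted composite. The weights and cut scores below are
# hypothetical placeholders, NOT actual College Board scoring tables.

def composite(mc_pct, essay_pct, mc_weight=0.5):
    """Combine section percentages into a single 0-100 composite."""
    return mc_weight * mc_pct + (1 - mc_weight) * essay_pct

def ap_grade(score, cuts=(30, 45, 60, 75)):
    """Map a 0-100 composite onto the 1-5 AP scale using made-up cut scores."""
    return 1 + sum(score >= c for c in cuts)

# Near-perfect multiple choice (~95%) paired with mediocre essays (~45%,
# roughly a 4/9 on the essay rubric) still lands on a middle grade:
print(ap_grade(composite(95, 45)))   # -> 4 under these assumptions
```

Under those assumed numbers, weak essays drag the composite down but a near-perfect MC section keeps it above a passing cut, which is consistent with the experience described above.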