I say "Thank God" because last fall my neighbor managed to track down a Teacher Wraparound Edition of Glencoe Geometry, which she's loaned to me.

I do not have a Teacher Wraparound Edition for Glencoe Algebra. I've been working without a net. In order to check C's homework — which I have to do because here in Irvington Union Free School District (per pupil funding: $22,000) teachers stop collecting and correcting homework around grade 3 — I have to work every homework set myself, check my answers against C's, then figure out which one of us is wrong when the answers don't agree, and, finally, after all that is out of the way & my book deadline has receded further into the distance, have C. re-do the problem(s) in question if he's the one who got them wrong.

I have to do all this because the school refuses to supply answers to kids or their parents.

All the answers are belong to us.

For geometry, I have the answers. (The answers, they are belong to me.)

Boy, does that save time.

I work the problems quickly and efficiently, then I check my answers in the book.

If an answer is wrong, I re-do quickly and efficiently. I can afford to be quick and efficient, because I have the answers.

"speed and accuracy": the KUMON mantra. Have I mentioned the fact that KUMON provides the parent with the answer key? The parent quickly checks her children's work, then has them re-do problems they missed.

Without the answer key, I work problems painstakingly, doubting myself at every turn, sometimes re-working because I think maybe the answer could be wrong. From time to time an answer will come out the same wrong-seeming way 3, 4 times in a row, so that's more time, and in the end I still suffer doubt.

Having to work every problem without an answer key is a huge waste of time. A FWOT, actually. As Carolyn used to say.

So glad the paid professionals who teach my kid don't have to do it.

differentiated instruction 'round Glencoe way

Back on topic. The topic being: differentiated instruction.

I think a lot of us have wondered what differentiated instruction actually is when it's taking place inside the black box.

Sure, sure, I know the concept: if you have 20 students in a class, they are taught in 20 different ways. That is the concept here in Irvington, anyway, or so I gather from the Principal's Message in the Main Street School newsletter.

But what does that look like?

What might those 20 different instructional ways be on any given day?

Tonight, looking up the answers in my neighbor's Glencoe Geometry Teacher Wraparound Edition, I noticed that the book includes, in each Lesson, a "Daily Intervention" labeled "Differentiated Instruction."

Here is the Differentiated Instruction for Lesson 1.4, Angle Measures:

Auditory/Musical A metronome is a tool used to keep a constant tempo in music. It is composed of a pendulum that swings back and forth at varying speeds. The fulcrum of the pendulum acts as a vertex of the angle through which the pendulum swings. Demonstrate this by holding two pens at an angle in one hand and tapping another pen between the first two, creating a series of "ticks."

source: Glencoe Geometry Teacher Wraparound Edition

North Carolina Edition, p. 30

ISBN 0078601789

Daily Intervention Number 2: Inferential Thinking Activity

I am going to infer from the above that:

- it is normal for a high school geometry teacher not to know what a metronome is, what it looks like, or how it works
- a differentiated instruction activity is likely to command the attention of all 20 or 30 students in the room, seeing as how it involves the teacher performing a noisy classroom demonstration
- a differentiated instruction activity accompanying a lesson on angle measures doesn't have to have anything to do with angle measures, necessarily

update from redkudu:

This is not a differentiated strategy. It is an engagement strategy. This is what I talk about when I say there is such little knowledge about the difference between the two.

That clears things up.

redkudu had mentioned this earlier and I didn't know what she meant.

She's absolutely right.

These are engagement strategies, pure and simple -- although they are labeled with the type of learner the "strategy" is supposed to address (auditory/musical; ELL; etc.)

What a mess.

update 2-13-2008: Steve H analyzes the US News & Newsweek rankings of high schools in the Comments thread.

woo hoo, Friday edition

differentiated instruction in action

can you FOIL the answers?

the Gambill method

## 32 comments:

This is not a differentiated strategy. It is an engagement strategy. This is what I talk about when I say there is such little knowledge about the difference between the two.

THANK YOU!

I remember your saying that, and I wanted to know what you meant.

I'll post more of the "Differentiated Instruction" stuff from Glencoe.

Awful.

I wrote up my thoughts on my blog:

http://www.redeemingdaisy.com/redeemingdaisy/wordpress/?p=16

The activity would be made all the more difficult in that many metronomes are now digital. No more little thing that flops back and forth. I couldn't find the old style when I was looking a few years ago, so we have a digital metronome. It hasn't any character, but it is accurate.

"Here is the Differentiated Instruction for Lesson 1.4, Angle Measures:"

What, exactly, are they teaching?

The question is whether the kids get grouped according to their learning styles. The answer is no. All kids are subjected to all of this.

You can always slow the learning process down to achieve better results (I would hope), but that's not what they are talking about with Differentiated Instruction. It's supposed to be better, not better because it's slower.

They don't do this in high school or college. What happens? Is it just a stage? No. Differentiated Instruction is ONLY used as pedagogical cover for full-inclusion. They want to pretend that the differences between kids are related to learning styles, not learning speed.

My son has perfect pitch and plays Mozart piano sonatas. I would never dream of trying to devise a contrived musical way to teach him anything. The direct method works just fine.

By the way, we have one of those pendulum metronomes. It's ancient.

"They don't do this in high school or college."

Yes, they do.

Ask redkudu.

The Freshman Honors English class here, recently, gave students the "creative option" of writing a 2-page paper instead of a 5-page paper (I believe it was 5) and drawing an illustration.

My friend's son opted to draw the illustration. After he'd finished he (or his mom) discovered that, lo and behold, he had drawn his illustration on the wrong sized paper.

He'd drawn it on drawing paper; the requirement was poster board. (My friend says it's harder in some way to draw in a smaller format - do I have that right? At any rate, she used the term "philistines" to describe a class requirement that her son draw BIG instead of SMALL.)

So my friend sent her son to school equipped with poster board & crayons for labeling.

This is Honors English, freshman year, in a high school eduwonk ranks in the top 100 of the country.

They are also being required to compare books to movies.

They compared Catcher in the Rye to a contemporary teen movie.

apples and oranges

Ed says that the obsession with interdisciplinary teaching results in a focus on "non-comparison comparisons."

The problem with ability grouping in high school is that we are expected to do it, but given no useful information about our students to use in doing so. When a student comes to me I have no information whatsoever about their reading level or ability, but I am supposed to intuit what their obstacles are, effectively place them, and then devise effective differentiation strategies from a host of techniques presented by other teachers in my school - techniques they claim are differentiation strategies but which are really engagement strategies, because they, too, have no idea.

"...we are expected to do it..."

The teacher? Are you talking about differentiation within one class?

I thought that the tracks or phases took care of that. Our high school has a regular geometry class and an honors geometry class. I would assume that the teacher would not have to differentiate within a class.

If you are talking about engagement strategies rather than content differentiation, I can see, but that shouldn't require separation or placement of a student, should it?

>>The teacher? Are you talking about differentiation within one class?<<

Yes. It's the new push this year. Teachers should differentiate in the classroom for the students in that class. That's where the problem lies, and where we keep getting these well-meaning teachers sharing all their "differentiation" techniques which are really engagement tools. Because, as is so often the case, with new directives and expectations comes zero money to train teachers in the very strategies they are expected to use. So every teacher who has ever had success with things like "Calculus-thenics" or something involving coloring thinks they've developed a fantastic differentiated instruction technique because a) all the students will do it, and b) all the students can be successful at it, supposedly.

Well, of course they can. They learned coloring in kindergarten, and every year since.

Our high school is heading towards doing away with separate Honors track courses.

They're pretty close now.

AP courses are open enrollment & have been for quite awhile; that's why we rank so highly on the two high school rankings.

My district has so badly abused the notion of ability grouping and tracking that, absent major reforms in transparency, flexibility, and the creation of a parent veto or override, I would vote in favor of eliminating ability grouping at the high school.

There's a huge amount of wildly mixed-ability grouping in high schools.

"AP courses are open enrollment & have been for quite awhile; that's why we rank so highly on the two high school rankings."

What percent take the AP test? Do they cover the proper material? Can they legally call them AP classes? For open enrollment, they would either have to dumb down the material or flunk kids.

Our high school states explicitly the minimum grade you have to get in the prerequisite course. Up to now, my only concern has been the quality of the teaching, not the content. I'll have to start asking other parents around here.

Irvington AP scores for spring 2007:

Number advanced placement courses offered: 19

Number of students taking AP tests: 410

Number passing: 266

Number enrolled in AP courses: 600

Percent of AP students taking test: 68%

Percent passing: 65%

Percent of all AP enrollments resulting in a passing score: 44%

NOTE: These figures are for "enrollments" in AP courses, not actual student number. If a student takes 3 AP courses, he or she is counted 3 times.
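The percentages above follow directly from the raw counts; here's a quick sanity check (remembering that "enrollments" count a student once per AP course):

```python
# Recomputing the Irvington AP percentages from the raw counts quoted above.
enrollments = 600   # AP course enrollments (a student taking 3 courses counts 3 times)
test_takers = 410   # AP tests taken
passers = 266       # tests passed

print(round(100 * test_takers / enrollments))  # percent of AP enrollments taking the test
print(round(100 * passers / test_takers))      # percent of tests taken that were passed
print(round(100 * passers / enrollments))      # percent of all enrollments passing
```

This prints 68, 65, and 44, matching the figures above.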

Here are our figures for AP calculus, from an email from the math chair (class size: approximately 150):

Currently, we have 14 in BC Calculus and 19 in AB Calculus. Last year, we had 13 in BC Calculus and 100% of the students passed the AP exam. We had 16 in AB Calculus and 56% of the students passed the AP exam.

Here's what Betsy Newmark, who teaches AP History in a college prep charter school in NC, had to say about our scores:

I would have to know about your district - what type of population you have. Why are so many opting not to take the tests - are they just signing up for the honors point. Why are so many failing of the ones who take it? Do they have any screening for who can sign up for an AP class?

Is this a poor district with lots of kids from poor socio-economic backgrounds? If so, then this doesn't seem that surprising. If it's mostly a middle class district with kids with educated parents, this does seem pretty low.

[snip]

At our charter school which is basically middle class, almost all the kids who take AP classes, take the tests. I've only had a handful in the past 5 years not take the exams out of a few hundred kids. And, overall, we have over 95% pass.

[snip]

Almost every kid takes at least one AP class - probably close to 85%. But then we're selling ourselves as a college prep school so the type of parent who would have a kid taking AP chooses to send their kids to our school. That's why we rank highly on Newsweek's numbers - because we have such a high participation rate. That's not the average for most schools...........

I've corresponded with Andy Rotherham, who did the U.S. News ranking that puts Irvington High School in the Top 100 high schools in the country. He says our school, with these scores, could definitely make it into the Top 100.

I'd like to point out the obvious, which is that our high school is educating kids who've just spent 3 years in our middle school.

On the other hand, our high school, many parents believe, has been allowed to demote teachers who aren't working out at the high school to the middle school. Assuming that is true (I believe that it is), the high school has been reaping what it sowed to some degree.

(Of course, under our current superintendent, star teachers at the high school can also be demoted to the middle school for reasons unknown.)

This is for the US News high school ranking. (By the way, 10 states either did not supply the numbers or did not supply the right numbers to be included. So it should be called the best high schools in 40 states.)

The ranking consists of three parts. The first and second parts look at state standardized test scores. The first part judges how well a school does compared to other schools in their SES category. The second part judges how well the minority students do as a group by themselves. If you "pass" these first two tests, you go to the third test which awards gold or silver medal status based on an examination of AP test data.

This third part is broken into two sections. The first is based on simple participation numbers in AP classes. This is weighted only 25 percent, but could hurt you in the second ("quality-adjusted") section which looks at how many pass (3 or higher) the AP test.

"For the college readiness index [ed. third part], the quality-adjusted AP participation rates were weighted 75 percent in the calculation, and 25 percent of the weight was placed on the simple AP participation rate. Only schools that had values greater than 20 in their college readiness index scored high enough to meet this criterion for gold medal selection. The minimum of 20 was used since it represents what it would take to have a 'critical mass' of students gaining access to college-level coursework."

Both of these numbers are divided by the total number of kids in the 12th grade. They don't look at what percent of the kids taking AP classes pass the AP test.

So, this formula encourages high schools to have open enrollment, but they can't slack off on the material because it is heavily weighted on how many pass the AP test. It doesn't hurt the school if you take an AP course and fail.
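A minimal sketch of the college readiness index as the quoted methodology describes it. The function name and numbers are hypothetical, and the reading of "quality-adjusted participation" as passers divided by seniors is my assumption, not US News's published definition:

```python
# Hedged sketch of the US News college readiness index described above.
# Assumption (mine): both rates are percentages of the senior class, and the
# quality-adjusted rate is AP passers divided by seniors.

def college_readiness(test_takers, passers, seniors):
    simple_rate = 100 * test_takers / seniors   # simple AP participation rate (25% weight)
    quality_adjusted = 100 * passers / seniors  # quality-adjusted rate (75% weight)
    return 0.75 * quality_adjusted + 0.25 * simple_rate

# Hypothetical school: 200 seniors, 90 test-takers, 50 passers.
index = college_readiness(test_takers=90, passers=50, seniors=200)
print(index)        # 30.0
print(index > 20)   # True: clears the stated gold-medal threshold of 20
```

Note how the formula rewards sheer participation even when pass rates are low, which is the incentive problem discussed below.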

I find it interesting that they focus only on 12th graders and they don't care how many AP classes each student takes. If you take two AP classes and pass only one test, then that doesn't hurt the school. Only you.

I like the idea of open enrollment unless the school gives you no real alternative. A 25 percent weighting factor is still valuable if there is no downside for the school. Our high school has both college prep level courses AND AP courses. This formula might cause them to try to eliminate the college prep course and give you no choice but to go into the AP course.

The problem with AP courses is that they aren't (shouldn't be) necessary to go to any college for any major. In fact, many colleges won't give you advanced placement, and if offered, you should probably decline. Being prepared for college shouldn't mean advanced placement, but that's what this formula uses.

Unfortunately, the AP courses and tracks in high schools define the only rigorous paths to college. If you aren't on those tracks, you're nowhere. From this standpoint, I disagree with the US News formula. The first two parts are based on some minimal state test hurdles to see if you can go on to the AP test round, and the AP round looks only at the top students. There is a really big gap in the middle of students who would like to go to college. We don't know how they are doing.

I could argue that the kids who pass the AP test do so in spite of what the school does. You would have to interview all students to see what they (and their parents) did over the years to get to that point. What about all of the "middle" kids? The ones who ended up in check-book math because the schools really screwed up in K-8.

There is no incentive for schools to fix that. The incentive is to place each student into at least one AP track where they can get a '3'. Perhaps this is good for some students who will focus on being good at something, like music or art. This doesn't fix what I call the "middle", or the all-or-nothing problem, but it does focus the school on providing many different avenues for success.

There are a couple of things to note here.

1.

First of all, Irvington High School tells you all you need to know about the two high school rankings.

The point of these rankings is to force public high schools to allow minority students to take AP courses.

Jay Mathews says that he decided to rank schools based on "inputs" (open enrollment), not "outputs" (passing scores on AP tests) because of a study showing that kids who took AP courses and failed the test still did better in college than kids who took no AP courses. (He doesn't use the terms "inputs" and "outputs.")

I don't know whether that study looked at kids who had taken AP courses under an open enrollment policy. I need to find out.

Andy Rotherham, I believe, was trying to correct a bit for the pure input approach; his ranking requires that schools make it through an initial judgment that kids in the school are doing better than students in SES-matched schools.

After that the criteria had to do with open enrollment and with number of students passing just one AP test.

2.

Look at those math scores.

In this extremely high-SES community, with many, many parents holding advanced degrees, we have only 14 kids making it all the way to BC calculus. (My neighbor says there were only 12 kids in the previous year's class.)

I can tell you from experience that those kids are mathematically gifted. Maybe not all of them, but close to. Just counting off the top of my head now, in Chris' class there are at least 6 "bulletproof" math brains. They ace tests even when the material has never been taught; we know this because we've had the experience.

So: BC calculus is taught to the mathematically gifted or the near-gifted and they pass the AP test.

That hardy little band of 19 kids who signed up for AB calculus fared badly. 44% failed the AP test, and we have no idea whether any of these kids got 4s or 5s, or whether they all squeaked by with 3s.

This is C's group, and these are very, very bright kids. I'm sorry to sound boastful -- I wouldn't do it normally, but C's ISEE score on reading comprehension pretty much sealed the deal as far as telling us what his native ability is.

Also, that little group of 19 kids doesn't exhaust the sum total of brainy kids in the district. This is a self-selected population; the kids are very smart. That little group of 19 represents the kids who haven't given up -- or whose parents haven't given up. There are many, many more kids in the student body who are capable of taking AB calculus, and who will need to take calculus when they get to college 1 year later.

These AP calculus scores are the endpoint of an Irvington School District mathematics education.

If you're gifted, you do fine.

If you're intelligent, motivated, and hard working you don't.

3.

Pundits and reformers don't listen to parents any more than educators do.

I admire Jay Mathews and Andy Rotherham; I read what they have to say; I learn from their columns & white papers.

But leaving school reform up to Jay Mathews and Andy Rotherham isn't an option.

"They don't look at what percent of the kids taking AP classes pass the AP test."

Right.

I haven't had time to read the material on Rotherham's ranking -- what do you make of Irvington's position in the Top 100?

I'm fairly sure that we had no black students in the class in question. (But don't take this as gospel. I need to check.)

I assume that passing an AP test keeps a school honest, but we don't know how much of the credit belongs to the school and how much belongs to the student. There is a huge incentive to get all students to take one AP course as a senior.

If I were a high school administrator and wanted to play the game, I would offer as many different kinds of AP classes as possible (with open enrollment) and make sure they really couldn't take them before 12th grade. I would discourage kids from taking many AP courses, because I really need them to pass just one. I wouldn't care to "fix" the math curriculum if I could get that poor math student on an AP Music path. And I get a 25% bonus for each AP student no matter whether they take the test or flunk.

"we don't know how much of the credit belongs to the school and how much belongs to the student"

A friend tells me she estimates as many as 90% of the AP kids are being tutored.

One of the AP teachers opens the year by announcing to the class: "I don't do test prep."

My friend's kid spent the year cramming for the AP test online & now loathes the subject.

oops - I misspoke above.

44% of the kids in AB calculus failed the test; of the 56% who passed we don't know how many earned 3s, 4s, and 5s.

"Jay Mathews says that he decided to rank schools based on "inputs" (open enrollment), not "outputs" (passing scores on AP tests) because of a study showing that kids who took AP courses and failed the test still did better in college than kids who took no AP courses."

But then you have the incentive to force kids into AP classes when another class would be more appropriate. You can't let statistics drive policy. I get the feeling that many who want to solve the problems of education don't like details.

"I find it interesting that they focus only on 12th graders and they don't care how many AP classes each student takes. If you take two AP classes and pass only one test, then that doesn't hurt the school. Only you."

Interesting.

Our district, once kids reach 6th grade, functions as a kind of war of attrition against ambitious students and their parents.

ALTHOUGH -- I just had the most amazing conversation with a mom of one of our beleaguered accelerated math students. Her son has been getting Cs and Ds on tests.

She told me she tried to get her son moved to the regular track and the school refused!

That's the first time I have EVER heard of our school refusing to move a kid down. (This was last school year; this boy is now earning quarter grades of C. You can't continue in the accelerated track with grades of C; she was right to try to move him down.)

I could hardly believe my ears.

In any case, I wonder whether, by senior year, we have fewer kids taking AP courses than we do in the junior year.

That would be interesting to know.

"If I were a high school administrator and wanted to play the game, I would offer as many different kinds of AP classes as possible (with open enrollment)"

That's what Ed said.

What galled me about the U.S. News ranking is that it appeared not long after we got word here that every single black and Hispanic student in the district failed the 8th grade ELA & math tests last school year (2006-2007).

I was the only parent to speak up about this publicly, which made me even more of a target than I already was. Some of you may remember the hostile Comment left on the "Irvington thread" to the effect that black and Hispanic scores "have nothing to do with you." I'm quite sure that comment was left by an Irvington teacher. A middle school teacher, I suspect.

So: I was the one person publicly protesting the fact that every black and Hispanic student in the 8th grade failed the tests.

Then Andy Rotherham comes out with his "Gold" ranking for Irvington and we're deluged with terrific press coverage, allowing our administrators to take their victory laps around the track, modestly disavowing "credit" for this great achievement, etc.

When I emailed our 8th grade scores to Rotherham, he took a look but didn't seem bothered by the fact that he'd just given the U.S. News Seal of Approval to a wealthy white district where black students don't appear to be doing very well.

Nor did he seem bothered by the fact that he had just undercut and discredited the one parent trying to do what Rotherham is trying to do, which is to push schools to assume responsibility for student achievement.

I'm going to indulge myself in a poor-me complaint.

Andy Rotherham travels in elite circles, issues white papers, works in the White House, blah, blah, blah.

I've tackled a small school district in a tiny town while having 3 kids enrolled in the schools.

When I say we are the "boots on the ground," I'm not kidding.

Folks like Rotherham ought to have some concern for folks like me.

Change isn't going to come from the top.

"But then you have the incentive to force kids into AP classes when another class would be more appropriate."

This is where the socioeconomics of the town come into play.

95% of Irvington students enroll in college. 10% of the student population, almost by definition, has to be classified SPED, which means that half our SPED kids enroll in college.

In other words, effectively every student in the high school plans to attend college.

You don't have to force kids to enroll in AP here.

Our district's problem is always the exact opposite: how to keep kids out of accelerated & Honors courses.

"So: I was the one person publicly protesting the fact that every black and Hispanic student in the 8th grade failed the tests."

But what happens in high school? Part two of the US News formula is supposed to account for this, but we just have to take their word for it. I don't see the formula. My guess is that it has to do with meeting the state's definition of proficiency and absolutely nothing to do with being prepared for college. So, between the state's very low idea of proficiency and a passing grade on any one AP test, we have nothing. Toss a bone to the poor and minority students, but give the gold medal out to schools with the most talented kids or active parents.

I have a lot of experience creating formulas that presume to measure the value of a system. (measures of merit) In this case, the available data is slim and they make too many assumptions. Since it must be a complicated-looking formula with lots of variables and weighting functions (fudge factors), it has to be scientific and accurate. No.

I create these sorts of formulas for engineering design. They are useful tools for learning about relationships or relative value, but they should not be used for absolute design. It sounds contradictory, but it isn't.

Formulas, like that of US News, might give you some general or relative understanding of a problem, but the danger comes when you use that formula to drive real changes to the system. It's an analysis tool, not a design tool. One of my formulas is very useful as a learning tool, but top designers don't use it. The real world is not that simple.

The sort of analysis done by US News is incredibly simplistic and ignores many variables. Of course they are limited by the data that is available, but that's no excuse for declaring that it can define the top 100 high schools in the US (40 states). There is no justification for then working backwards from the formula to construct an optimum school. The optimum you find might be a small bump compared to the huge optimum you might find if you do something quite different.

In terms of measures of merit and finding optimum solutions, the problem is that you will find a bad local optimum, not the huge global optimum.

In terms of the value of interdisciplinary work, the ed school world needs a huge dose of math and engineering.

"Formulas, like that of US News, might give you some general or relative understanding of a problem, but the danger comes when you use that formula to drive real changes to the system. It's an analysis tool, not a design tool. One of my formulas is very useful as a learning tool, but top designers don't use it. The real world is not that simple."

I've got to get your comments up front.

Here's a question I have.

Do you feel that the US News & Newsweek formulas serve as a useful analysis?

Can readers of these magazines conclude anything useful about these schools apart from the fact that the student population is affluent and almost entirely white?

And: at what level do these rankings serve as useful analyses?

In other words, say you're a parent looking to move to a district with good schools.

Could you base a decision on these rankings?

"In other words, say you're a parent looking to move to a district with good schools. Could you base a decision on these rankings?"

"Base a decision"? No. Is it of no value? No, but it's all relative. Schools think it's important because it's good PR. Our high school has a reference to it on its home page, and it's only a silver medal!

Over the long run, schools can take advantage of the formula. Anytime you have an extremely important formula that condenses a lot of information down into a single number, it's open to gamesmanship. But, the more that schools play the game, the less useful is the formula. I've seen cases where it's a constant arms race between those who want a formula to reflect reality and those who want to beat the formula.

If a school pushes all students to one AP class or another whether or not they are properly prepared, then that might improve the score, but the education might not be better. Or, it could be a false or local optimum and they completely miss a much larger global optimum that uses a different approach.

Important formulas generally force a trend towards one particular solution. Uncertainties in the formula can hide other solutions that might offer much better results. Instead of reflecting reality, they drive reality.

Think of a formula that tries to represent a topographic map. You want to find the highest point on the map but all you can do is plug in your latitude and longitude into a black box that will give you a height. You keep doing this and try to search for the highest point. If your second point gives a lower height, you turn around and go in the other direction and check the height.

This is called non-linear optimization and I have many books that discuss solutions to this problem. If you know derivative information, you can search faster. If not, you can find slopes numerically.

The problem is that you might find a mountain peak, but it's the shortest peak of the mountain range. Another problem is that the black box might not represent reality very well. There might not be a mountain peak there at all.

Another, more subtle issue is that (due to uncertainty) the very highest peak location is no better than a location that is 10 percent lower. If you think of a long mountain range, it might be much easier to climb to the slightly lower end of the mountain range than it is to climb the other end where the absolute highest point is located. In other words, an easier optimum to a problem (within 10 percent) might be located far, far away.

Rather than push all high school kids onto at least one AP track in high school, the easier approach might be to fix the problems in K-6 before they get large. They won't find this solution if they are focused on a formula that uses only high school data.
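The black-box search described above can be sketched in a few lines. This is a toy illustration with a made-up, one-dimensional "mountain range" (all names and the height function are mine, invented for the example):

```python
# Toy sketch of the "hot and cold" black-box search described above.

def height(x):
    # A made-up mountain range with two peaks: a short one near x = 1
    # (height 3) and the true summit near x = 4 (height 6).
    return max(3 - (x - 1) ** 2, 6 - (x - 4) ** 2, 0)

def hill_climb(x, step=0.1):
    # Greedy search: move in whichever direction raises the height;
    # stop when neither direction helps.
    while True:
        if height(x + step) > height(x):
            x += step
        elif height(x - step) > height(x):
            x -= step
        else:
            return x

# Starting near the short peak, the search gets stuck there (a local
# optimum) and never finds the higher summit near x = 4.
print(round(hill_climb(0.0), 1))  # stops near 1.0
print(round(hill_climb(3.0), 1))  # stops near 4.0
```

The answer you get depends entirely on where you start, which is exactly the local-versus-global optimum problem: a school that starts by optimizing AP numbers climbs that peak and never discovers that fixing K-6 might be a much higher one.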

"Think of a formula that tries to represent a topographic map. You want to find the highest point on the map but all you can do is plug in your latitude and longitude into a black box that will give you a height. You keep doing this and try to search for the highest point. If your second point gives a lower height, you turn around and go in the other direction and check the height."

Damn!

I don't understand this.

And I want to.

So...you're talking about a situation where the "searcher" really knows nothing at all about mountain peaks, where they are, where the highest peak might be, etc.

The only way to look for that peak is to plug longitude and latitude into the black box and see what comes out.

The only useful feedback you get is when the black box produces a "miss" because you've entered a longitude and latitude that correspond to a lower altitude.

Then you decide, in effect, that you've gotten "colder."

OK, here's a question.

Is the "hot-and-cold" game an example of non-linear optimization?

"Another problem is that the black box might not represent reality very well. There might not be a mountain peak there at all."

This is my question when it comes to school rankings.

In the case of the US News formula, it's not completely a black box. You can see the formula (up to a point) and know what the variables are. But you don't necessarily know how to change the underlying variables or causes to find a better solution. It can be a hot/cold type of search. The formula tends to have you focus on the variables in the formula, like the number of kids taking AP classes, rather than poor K-6 math curricula.

Non-linear optimization is used with formulas that reduce complex systems with lots of variables into one number or ranking. With simple functions, like parabolas, you can take the derivative (slope) of the function and set it to zero. This gives either a maximum or a minimum point. With more complex formulas, like the one used by US News, you can't do this. There are too many variables.
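The parabola case can be checked numerically. This is a toy example (the function and names are mine, not from the comment):

```python
# For a parabola, the slope really is zero at the extreme point.
# Toy example: f(x) = -(x - 2)^2 + 5, which peaks at x = 2.

def f(x):
    return -(x - 2) ** 2 + 5

def slope(x, h=1e-6):
    # Central-difference numerical derivative.
    return (f(x + h) - f(x - h)) / (2 * h)

print(abs(slope(2.0)) < 1e-6)          # True: the slope vanishes at x = 2
print(f(2.0) >= max(f(1.0), f(3.0)))   # True: x = 2 really is the maximum
```

With one variable you can solve for the peak directly; with a formula like the US News ranking, where many variables interact, there's no such shortcut, hence the guess-and-check searching described above.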

The formula is based partly on how many kids pass one AP test. How do you improve that? Guess and check? Or, maybe you just force many kids to take some sort of AP test just to get the 25% bonus for showing up at the gate. Your rating goes up, but not the education.

If you focus on the formula, then you will miss all sorts of other possibilities that might not pay off right away. The formula reflects the problem, but it's not the problem schools have to solve. If they solve the real problem, the formula will take care of itself. You can do things to fool the formula, but that's not what they should be doing.

Post a Comment