kitchen table math, the sequel: more fun with numbers

Monday, April 21, 2008

more fun with numbers

In our non-degree professional development programs at Harvard University, I have taken to routinely asking the assembled administrators and teachers how many of them have taken a basic course on educational measurement. In an audience of 50 to 100 participants, the usual count is two or three. These people are usually “ringers”—they are typically assistant superintendents for measurement and evaluation. That is, they run the testing operation in their school systems. Now, imagine what the state of health care would be if practicing physicians didn’t know how to read EKGs, EEGs or chest x-rays, didn’t know how to interpret a basic blood analysis, or didn’t know anything about the test-retest reliability of these simple diagnostic measures. Imagine what it would be like if your basic family practitioner in a health maintenance organization didn’t know how to interpret a piece of current medical research questioning the validity of the standard test for colorectal cancer. Imagine what it would be like to be a practitioner in a health care organization in which every piece of evidence required for patient care came from a standard test of morbidity and mortality administered once a year in the organization. The organization you are imagining is a school system.

Leadership as the Practice of Improvement (pdf file)
Richard F. Elmore

This probably accounts for the existence of books with titles like Getting Excited about Data, Second Edition: Combining People, Passion, and Proof to Maximize Student Achievement by Edie L. Holcomb.

This text is actually written in language that educators can understand, even if they aren't especially data-savvy. The guidelines for incorporating data for school improvement are actually practical and easy to replicate. A great guide for joining the data-driven school improvement movement!
5 stars

In theory, my district is using data to inform instruction. We have a data warehouse.

Thus far, however, the data is apparently showing that all learning problems in the gen-ed population can be attributed to student failure to Seek Extra Help. Either that, or Weak Inferential Thinking.

Which is pretty much what all learning problems were attributed to before we had a data warehouse, as far as I know.

more fun with numbers
data-driven instruction redux
data-driven loops & noise

9 comments:

Anonymous said...

Don't even get me started on this one! I'm in a district that places heavy emphasis on being "data driven". In spite of this emphasis, teachers don't have access to much of the data. They lack either the hardware, the privileges, or the timeliness to make it accessible and relevant. This is all before we get to the training issues.

Worse, let's say your data tells you that Johnny is in the sixth grade and can't add; there is no system in place to do anything about it. Sure, you can try to get him to stay late for help, or you can differentiate in class (at the expense of what he's supposed to be current with). But there is no way (especially with 50% of your kids in this condition) to get Johnny remediated.

As long as curriculum fills every inch of available space, teachers aren't going to use objective data to replace subjective data when neither can be used as a force for change.

Catherine Johnson said...

Very interesting --- I'm getting this up front --

I have a question, if you've time to answer.

How much training do teachers and administrators need in order to use "data" to "drive" instruction?

My sense is that the DI folks don't need much; they seem to do short, off-the-cuff assessments to which they can react more or less on the fly. (I assume the curriculum designers use a bit more sophisticated analysis, but maybe not...)

Still, it's obvious that when you get to the level of State Tests you need some people in charge who actually know a thing or two about psychometrics.

Another issue around here: our district bases math & science placement in accelerated and Honors courses on tests that have no validity testing whatsoever. The ELA & Social Studies folks use writing samples. Parents never see the samples, never see a rubric; no evidence is given of whether middle school writing samples do or do not predict success in freshman courses, etc.

SteveH said...

Many teachers give tests at the beginning of the year to do their own assessment. That should be plenty of data to start with. However, that doesn't mean that the teacher has any time or help to fix the problems. Data has to have a bus if it's going to drive you anywhere.

PaulaV said...

Since it is state test time, you'll see the teachers jump through hoops to get children to simply pass the test. We have SOL clubs after school. Mind you, this is only a month before the test. What I find repulsive is the fact that so many children are put through the wringer at test time.

In fact, a friend of mine knows her son has problems in reading comprehension. She told the teacher at the beginning of the year. Now that it is test time, the teacher wants her son to stay after school to get help. My friend said no because the assessment should have been ongoing. Also, the results from last year's SOL clearly showed he had trouble in reading.

It is sink or swim.

palisadesk said...

How much training do teachers and administrators need in order to use "data" to "drive" instruction?


Pretty well ZILCH is what they get. Most people in education, including administrators and the “expert teams” who come in to help schools do better when they are performing poorly on the tests, know very little about psychometrics or statistics. They cannot explain concepts like validity, reliability, correlation coefficient, standard deviation, effect size, meta-analysis, norm-referenced, criterion-based, classical testing theory, Item Response analysis, and on and on. No one needs to be an expert on statistical analysis, but without an understanding of basic concepts in measurement, people are spinning their wheels and comparing apples to apoplexy.

The only reason I do know something about these topics is that I went to one of the apparently few graduate programs that actually required training in statistics (through the university math department, not the ed school) and then some instructional design and psychometrics followed by practica in using the information in student assessment and program planning. I haven’t come across anyone else at my level (classroom teacher) who has ever learned any of this. Heck, I haven’t come across any of my superiors who have learned any of this. The school psychs have, but that’s their field.

Not all “data” is worth anything – some holistically scored tests are highly subjective and the results lack inter-rater reliability and do not correlate well with more objective assessments.

DI implementations use data in a tight and highly organized way. That's a whole topic in itself. I wouldn't call it "on the fly", though; there are established protocols and procedures in place.
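For readers who haven't run into the term, here is a minimal sketch (mine, not anything from the post or from any district's actual system) of what "inter-rater reliability" means in palisadesk's point above: two hypothetical raters score the same ten writing samples on a made-up 1-4 holistic rubric, and Cohen's kappa measures how much they agree beyond what chance alone would produce.

```python
# Illustrative only: the raters, scores, and rubric are all hypothetical.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance (1.0 = perfect)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    # Chance agreement: probability both raters land on the same category at
    # random, given how often each rater actually used each score.
    expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Two hypothetical raters holistically scoring the same ten essays (1-4 scale).
rater_a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater_b = [2, 3, 4, 2, 2, 1, 3, 3, 2, 4]

print(round(cohens_kappa(rater_a, rater_b), 2))  # ~0.01: barely better than chance
```

A kappa this close to zero means the two readers agree hardly more often than they would by guessing, so any placement decision built on either reader's score is built mostly on noise, which is the point about holistically scored tests.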

Catherine Johnson said...

They cannot explain concepts like validity, reliability, correlation coefficient, standard deviation, effect size, meta-analysis, norm-referenced, criterion-based, classical testing theory, Item Response analysis, and on and on.

This is the stuff that makes me crazy.

We are CONSTANTLY, now, in my district, being shown our child's "numbers." These numbers, invariably, are used to prove your child's inability to take Earth Science or Honors Biology and on and on and on.

Kids are being given numerical grades, on a scale of 1 to 100, for class discussion.

They're being given numerical grades for classroom "behavior."

How do you get a grade of "81" on classroom behavior?

How is that computed?

Or, as my friend the statistician once put it, "Where did that 1 come from?"

This is another "don't get me started."

A kid I know didn't get into an Honors class he is MORE than able to handle. Apparently he did poorly on the "assessment test."

Well, one of the items he missed was a simple equation in one variable. He miscalculated. His mom said, "He knows how to solve equations."

The department chair said, "But he got it wrong."

And that's that.

Bam.

Gotcha.

Department chairs here treat placement as if it were a horse race. Sure it's a photo finish; sure, he can solve equations.

BUT HE GOT THAT PROBLEM WRONG ON OUR TEST SO IT'S OVER.

These are the people running the science program.

Catherine Johnson said...

Assessment here is a HUGE problem. Huge.

Ed talked to a woman who heads the history department at a private school in Manhattan. She told him that assessment is a major, major problem for all of the new teachers they hire.

Catherine Johnson said...

The only reason I do know something about these topics is that I went to one of the apparently few graduate programs that actually required training in statistics (through the university math department, not the ed school) and then some instructional design and psychometrics followed by practica in using the information in student assessment and program planning. I haven’t come across anyone else at my level (classroom teacher) who has ever learned any of this. Heck, I haven’t come across any of my superiors who have learned any of this.

This is making me crazy.

We have department chairs referring to "data" as if it were God's truth.

When I talked to one department chair, I discovered she didn't know the difference between a norm-referenced test & a criterion-referenced test (one compares a child to other test-takers; the other measures mastery of a fixed standard). I had to explain it.

These are the people making decisions about our kids' education --- which wouldn't bother me if they erred on the side of letting the kids take more advanced courses.

Instead ALL of the errors are made on the side of keeping kids out of advanced courses.

Across the board, in every case I've seen, the error is made against the child.

Catherine Johnson said...

If you ever have a moment to tell us more about the DI use of data that would be great.

I got the impression that some of their data-taking could be described as "on the fly" from the article on motivation... (and from the fact that I've read Engelmann, several times, advising schools to use very short, quick assessments. Which I realize isn't the same thing as "on the fly.")