Monday, November 15, 2010

21st century skills

One of the goals of No Child Left Behind is to increase the availability of data. Part of the implicit model underlying No Child Left Behind is that with improved information, parents will recognize good and bad schools. Principals will identify good and bad teachers. District administrators will identify weak and strong principals, and state administrators will recognize struggling school districts. Armed with this information, parents will choose with their feet, and the other actors will undertake the necessary reforms to improve education.

As an empirical economist I am, of course, sympathetic to the use of data, and as a school board member I pushed for more thorough evaluation of our programs. But the gap between the rhetoric and the ability to use education data effectively is large.

Few school districts have the resources to analyze statistical data in even remotely sophisticated ways. In the early days of the Massachusetts Comprehensive Assessment System (MCAS) tests, I visited the Assistant Superintendent for Curriculum and Instruction who was anxious to use the testing data to help Brookline address its achievement gap. The state Department of Education had provided each district with a CD with the complete results of each student’s MCAS test. In principle, it would be possible to pinpoint the exact questions on which the gap was greatest. The problem was that no one in the central administrative offices could figure out how to read the CD. I loaded the CD onto my laptop and quickly ascertained that the file could be read with Excel. Shortly thereafter, our Assistant Superintendent attended a meeting of her counterparts from the western (generally affluent) suburbs of Boston and discovered that Brookline was the only system that had succeeded in reading the CD. Districts have become somewhat more savvy about using data. A younger generation of administrators has more experience with computers, but relatively few would be able to link student report cards generated by the school district with SAT scores and the state tests.

Principals, district administrators, and even state-level administrators generally begin their careers as teachers, and relatively few teachers have strong backgrounds in statistical reasoning. In my experience, the people who rise to senior administrative positions in public education are smart. They understand in a general sense that estimates come with standard errors attached, but faced with a report that last year 43 percent and this year 56 percent of black students in fourth grade were proficient in math, few could tell you whether, with 75 students each year, the change was statistically significant.
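
[A quick aside, not from Lang's paper: one standard way to check that kind of change is a two-proportion z-test, which takes only a few lines in Python using exactly the numbers in that sentence.]

```python
from math import sqrt

# 43% proficient last year vs. 56% this year, roughly 75 students tested each year
n1, n2 = 75, 75
p1, p2 = 0.43, 0.56

# Pooled proficiency rate under the null hypothesis of no real change
p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se

print(f"z = {z:.2f}")  # about 1.6 -- short of the usual 1.96 cutoff at the 5% level
```

[On these numbers, the apparent jump would not clear the conventional significance bar.]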

When I stepped down from the school board, one of my colleagues joked that they could all go back to treating correlation as causality. In education policy settings, one repeatedly hears statements like: “Students who take Algebra II in eighth grade meet the proficiency standard in grade ten. We must require all students to take Algebra II in eighth grade.” “Students taking math curriculum A and curriculum B get similar math SAT scores. The curricula are equally good.” “Students who are retained in grade continue to fall further behind. Retention is a bad policy.”

School administrators may understand at some level that they are only looking at  correlations, but almost none have the training to address the issue of causality, and faced with a correlation, they will often interpret it causally in the absence of evidence to the contrary. The capacity to address causality, weaknesses of various measures, and other strengths and weaknesses of statistics is very limited. The Public Schools of Brookline recently recruited for a Director of Data Management and Evaluation. Although school board members generally are not (and should not be) involved in personnel decisions other than those involving the Superintendent, in this specific case the Superintendent asked me to participate in the candidate interviews. Many of the candidates held or had held similar positions in other districts. I asked each candidate how we could decide whether a math curriculum used by some, but not all, of our students was effective. Many of the candidates did not think of this question in statistical terms at all. Only one addressed the issue of selection—and we hired him.

Measurement Matters: Perspectives on Education Policy from an Economist and School Board Member (pdf file)
by Kevin Lang
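
Lang's interview question is worth lingering on. A bare comparison of average scores between students who did and did not use a curriculum confounds the curriculum's effect with whoever was selected into it. Here is a minimal sketch of the distinction in Python; the file name, column names, and data are hypothetical, and adjusting for a prior score is only the crudest response to the selection problem, not the candidate's (or Lang's) actual method.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical layout: one row per student, with a prior-year score,
# a 0/1 indicator for whether the student used the curriculum in question,
# and a current-year score.
df = pd.read_csv("students.csv")  # columns: prior_score, used_curriculum, score

# Naive comparison: the raw difference in mean scores. If stronger (or weaker)
# students were steered into the curriculum, this mixes selection with effect.
naive = smf.ols("score ~ used_curriculum", data=df).fit()

# Adjusting for prior achievement strips out selection that operates through
# observed prior scores -- but not selection on anything unobserved.
adjusted = smf.ols("score ~ used_curriculum + prior_score", data=df).fit()

print(naive.params["used_curriculum"], adjusted.params["used_curriculum"])
```

Even the adjusted estimate handles only selection on things you can measure, which is presumably the point the one successful candidate understood.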

Apparently they don't cover Excel in ed school.

16 comments:

  1. The context, of course, is affluent suburban schools buying thousands and thousands of dollars of "technology" on grounds that this is the 21st century, and the 21st century will be all about technology.

    In the past decade affluent schools around these parts (Westchester County) have all invented a tenured, unionized "Technology Coordinator" position, too.

    Our technology coordinator, a math teacher, teaches teachers how to use their SMART Boards in one-on-one sessions.

    We have a class size ratio of 1:1 for teachers learning technology.

  2. The problem was that no one in the central administrative offices could figure out how to read the CD. I loaded the CD onto my laptop and quickly ascertained that the file could be read with Excel.


    MWAAAAAHAHAHAHA!

    OK, it'd be funny if it wasn't also frightening.

  3. One, ONE! thought of the problem in statistical terms? I'm gobsmacked. I rather thought that was their job.

  4. Why should we care about opening such files? All they contain are facts. We don't need them, because we understand the concepts.

    Scoff if you will, but ed school research shows that facts aren't important to a progressive educator. Ed schools show us how to rise above facts. They've discovered that the fundamental truths of Progressive Theory can't be grasped until you release your grip on facts.

  5. The issue in our town is that they are looking at the wrong data. They are looking at how to improve differentiated instruction and Everyday Math, not how to replace them. They are looking at relative numbers. At best, the calibration is to the low state cutoff. Data doesn't tell you anything about assumptions.


    I read another article last week on how modern businesses want kids with 21st century skills; kids who could think and adapt. I realized that this was absolutely false, and it's been false for ages. Businesses only care about the skills you have. If you are lucky enough, they will pay to send you to training and help you manage your career path. Actually, they are managing the solutions to their own needs.

    After I got out of college, I remember spending a lot of time framing my resume in terms of accomplishments, not creating a list of classes I took or programming languages I knew. I learned very quickly that the first thing they looked at was the list of specific skills I had, and they asked me questions to find out exactly how good I was in each. Of course they cared whether or not I was a sociopath, but they were trying to fill a slot that required specific content knowledge and skills. Unfortunately, if you have the skills, then the sociopath issue may be overlooked.

    Once you have the job, it's still all about content and skills. I remember one desirable project I got on simply because I knew FORTRAN very well. Many companies have yearly reviews designed to develop their workers. These are all built around developing specific skills and content knowledge. The smart employee will use these reviews to manage the long-term development of skills related to where the new technology is heading. The company has its own best interests in mind, not yours. You don't want to become an expert in VAX VMS and find that you've ended up in a dead-end job on a legacy computer system. The business does NOT care about potential or critical thinking. They care about, and will pay for, content knowledge and skills in high-demand areas. The assumption is that you already have the brain power to do what they want. You've proven it by demonstrating your content knowledge and skills.

    If you are careful and see the signs, you can work your way into new areas and have the company pay to send you to training. If not, you can end up overpaid and on a dead-end technology path. You are then a prime candidate for being laid off. It doesn't matter how much critical thinking ability you have. The company can hire a younger person (with a lower salary) who has been exposed to the new technology.

    Companies also want employees who are committed 24/7 to their jobs. There is a real anti-age bias in technology jobs. Many see older people as those with too many outside influences. They are also not up-to-date on the latest technology. It doesn't matter that many of these people have the most experience and critical thinking abilities. So, when people talk about 21st century skills, they really mean the ability to follow the new skills and content. It also means not showing anything on your resume that's more than 10 years old, because it will show them how old you are. 21st century skills have nothing to do with understanding and critical thinking.

  6. It is nice that the issue of statistical significance is being raised publicly. However, it still makes no sense that 'closing the gap' is the issue that is publicized. The fact is that in almost every school, every subgroup (by race, among non-brain-damaged children) has not met 100% proficiency even on the basic standards that NY tests in grades 3-8.

    As a parent, I have found it obvious that the school day is not providing all of the lessons needed to achieve proficiency for each student in my district. Shucks, here math isn't taught daily in preK to 5, which really shows the lack of priority. These people can run statistics all they want, but until each student has the info and skills necessary for success, success won't happen. Every day has to be meaningful and advance the child toward the goal, whether or not that child qualifies for RtI.

  7. Here, I've just visited half a dozen schools that "teach" math and language arts for 2 hours a day, starting in K. By 1st grade, that is up to 2 1/2 hours. And yet, what they teach is anti-learning, because they don't teach skills--they don't teach phonics, they don't teach spelling, they don't teach capitalization, they don't teach basic grammar, and they don't teach math facts. They read on their own endlessly, and write about their feelings endlessly, and the errors are taken as showing the author's authenticity. In math, they puzzle through measuring the circumference of a pumpkin or tossing a coin.

    Yet it is not obvious that the school day is failing to provide the lessons needed for proficiency, because they spend so much darn time on these subjects, and the lack of results only becomes clear much later.

  8. It's obvious to those who are looking, but they are being cast as elitists and thieves if they speak up. (Elitists for asking for work that fits the child's instructional level but is too high for the class, thieves for wanting money spent on materials for honors classes as well as remedial ones.)

    In my area, the state dept of ed's people did tell the principals (at the summer regional conference) that they needed to ensure that the ENTIRE curriculum was taught, not just parts.

  9. "And yet, what they teach is anti-learning, because they don't teach skills-- ..."

    That must have been very frustrating to see over and over again. Do you find any basis for change? Are they open to any input?

    When they look at target state math standards for each grade, do they not see specific skills that have to be mastered? Are they that averse to approaching the problem from a skills-first direction? Deep down, are they really deciding that skills are not important; that they only give lip service to "balance"?

    Around here, I get the feeling that K-6 schools use full inclusion as a "Get out of jail free" card ... as long as they talk the talk and claim that differentiated instruction is a work in progress.

  10. --Around here, I get the feeling that K-6 schools use full inclusion as a "Get out of jail free" card ... as long as they talk the talk and claim that differentiated instruction is a work in progress.

    Steve, this is the truth. This was the hard core center of every place I visited, no matter what covered over it.

    Every school I visited-- private, public, charter, Catholic, didn't matter --ALL of them said that all classroom instruction *and assignments* are differentiated *to meet students where they are at*.

    There was no sense that every child is taught up to a minimum level at all, in any of the schools I visited. So math standards be darned. Really, with every principal, and all but one dean, I was incapable of getting them to even admit there was a logical conflict between their claim of teaching to the standards and their touting of differentiated instruction. When I tried, they would say "well, if they are in that much need of differentiation, that's what special ed is for".

    (Somewhere else I comment on what "standards" there are for spec ed.)

    Only one middle school (grades 5 - 8) dean of a private school (grades K to 12 in total) was willing to admit the depth of his problems to me re: where the supposedly equivalent classroom instruction with the same texts had led by grade 5. The irony was that this dean didn't EVEN KNOW WHAT A STANDARDS-BASED school was. No, I'm not kidding. But he was at least firmly in reality when it came to what was happening in 5th and 6th grade: "We don't ability group until grade 6," he said, "because we have no choice."

    That's right--if they had their druthers, they STILL wouldn't ability group. But by 6th they have to. Why? Because the 5th grade math teacher is totally overwhelmed by the disparity in mastery among the incoming 5th graders, every single year. He personally remediated every student who needed it, because the private school assumes all tutoring responsibility. But it wasn't working at all, and he couldn't bring up the bottom enough. So by 6th, there were 3 ability groups.

    Again--this was in a K-12 school--where they should be able to have a coherent curriculum and inputs and outputs, and the ability to push on lower year teachers and tell them what MUST be done.

    But everywhere else, full inclusion means all you have to do is move the child epsilon above where they are when they arrive. Standards be darned.

  11. There was one other principal who claimed that they used the state test results for individual students to find out what the student needed, and claimed the diff. instr. would then be targeted in that way--however, the state assessments are so low and so weak in math (I don't know the standards in reading/writing) that it wouldn't help. Here, the state math test has been rewritten to involve "explain how you arrived at your answer; explain another way you could have gotten it", etc. And of course, the tests are artificially low, so even though they claim to be testing skill X, the guess-n-check method for n-2 yields the answer.

  12. Allison wrote:
    >Yet it is not obvious that the school day is failing to provide the lessons needed for proficiency, because they spend so much darn time on these subjects, and the lack of results only becomes clear much later.

    Allison, that seems to me a crucial point that, unfortunately, makes a lot of statistical studies in education largely irrelevant (consider the well-known problems with Head Start when viewed through short-term vs. long-term studies).

    I’m currently helping a friend’s daughter who is taking calculus in eleventh grade (not that unusual in the IB school she is at). The girl is quite bright and diligent, and had generally done well in math in the past, but is having some difficulty in calculus.

    The basic problem is that it just takes longer than two, four, or nine months to grasp the ideas of calculus. She and her classmates should have been exposed to the basic concepts of calculus a year or two ago, as early as algebra I (this really can be done – see Sawyer’s “What Is Calculus About?”), to give them time to start mulling the concepts over and start absorbing the general ideas.

    But that would not have shown up as positive results at the end of the year on their Algebra I or Algebra II or Geometry evaluations, and so the teachers did not do it.

    Presumably, the whole point of education is to produce an educated (or, at least, educable) person at age 18 or 22. Yearly evaluations that do not take into account long-term, multi-year results are missing the whole point.

    Dave Miller in Sacramento

    P.S. Sorry for being out of touch for so long – stuff’s been happening. Hope you are doing well.

  13. "Here, the state math test has been rewritten to involve 'explain how you arrived at your answer..."

    Our state tests are the same. Instead of asking students directly to add fractions, they use more involved, multi-part problems. Then, they somehow break out scores for categories like problem solving and numeracy. I have been unable to track down any explanation of how they do this.

    Our schools get the test data broken into all sorts of categories and teachers (and some parents!) sit around and decide what to do to get better scores. When they once saw low numbers for problem solving, the committee decided that the schools needed to spend more time on ... problem solving.

  14. Dave! Glad you're not dead!

    (That's a problem with the web--when someone disappears, you can't be sure they are still alive!)

    Not only do yearly evals miss the point, but as you noted, yearly evals can be gamed at the expense of actually educating someone for the long haul.

    Worse, though, few people have any idea what the long haul should look like anymore. By the time they see the failure, it's so far from the source that they can't trace it back.

    Dave, wrt the girl you mention, are you sure the issue is the lack of introducing the calc concepts earlier, rather than just a weak curriculum of no depth even in what they did pretend to teach?

    Here's what I'm seeing over and over again: parents are beginning to recognize the complete horizontal slope in difficulty in their children's math curricula between grades 2-7. That is, the difficulty does not increase measurably. They know this is wrong. In a desperate attempt to get rigor into their children's math program, parents then demand acceleration for their children. But accelerating into algebra, geo, trig, and calc earlier and earlier is not the same as providing DEPTH in the curriculum, so the students still lack the mathematical maturity they should have about numbers, fractions, rates, decimals, concepts in algebra, trig, etc. When they hit calc, this weakness becomes apparent, but not before, because calc is where you really have to start putting it all together. You can't get by with perfect procedural fluency and flawed reasoning at this level, even though you could in nearly everything up to this point.

  15. Allison wrote to me:
    > Dave, wrt the girl you mention, are you sure the issue is the lack of introducing the calc concepts earlier, rather than just a weak curriculum of no depth even in what they did pretend to teach?

    Well, I helped her work some problems last week, and she just needed a small amount of nudging – she did basically know the underlying math, but was not quite sure how to use it in calculus.

    In more detail, she seemed to me to have three problems:

    1) The class and the textbook are moving *very* fast. They are using the text by Larson et al., and some of the assigned problems, rather than focusing on the core calculus ideas, require both calculus and a *lot* of algebra. The girl basically knows how to do this, but she is sort of overwhelmed. (The teacher is moving fast partly because he wants the kids prepared for the AP test, which they will be taking well before the end of the school year.)

    For example, one of the problems was to find where the function x * sqrt (x*x + 3) is concave upward and where it is concave downward. That may be a fair problem for the semester final – but it is a bit overwhelming when you have just learned the chain rule, the product rule, the geometric significance of the second derivative, etc.
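
    For reference, the sign analysis for that one problem works out to

    \[
    f(x) = x\sqrt{x^{2}+3}, \qquad
    f'(x) = \frac{2x^{2}+3}{\sqrt{x^{2}+3}}, \qquad
    f''(x) = \frac{x\,(2x^{2}+9)}{(x^{2}+3)^{3/2}},
    \]

    so the sign of f'' is just the sign of x: concave downward for x < 0, concave upward for x > 0, with an inflection point at the origin. Compact once you see it, but getting there takes the chain rule, the product rule, and a fair amount of algebraic simplification.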

    2) She actually does know the relevant calculus up to this point in the book – they’ve already had the chain rule and product rule for differentiation, Rolle’s theorem, etc. But she’s still uncertain about a lot of this: ask her a fairly difficult but discrete question, and she probably knows the right answer, but she is not yet sure enough to confidently chain a lot of those right answers together to solve a complex problem.

    3) She (and, I think, most of her classmates) have the intuition to get some things correct without knowing why they are correct and without having been taught formally that they are correct. For example, some of the problems involved looking at inequalities (in the first and second derivatives – obvious things about increasing/decreasing functions, points of inflection, etc.). She correctly understood that to find where a function is positive or negative, you look at points where it is zero *and* at discontinuities and then check one point in each of the resulting intervals (we’re in single variable calculus, of course), but she does not know why this is true. (This goes back to our friend Adrian’s insistence that you cannot learn calculus without learning real analysis. He and I have been sparring about this for some time, but he does have a point.)
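
    For what it's worth, the fact she is using without proof is essentially the Intermediate Value Theorem:

    \[
    f \ \text{continuous on an interval } I \ \text{and} \ f(x) \neq 0 \ \text{for all } x \in I
    \;\Longrightarrow\;
    f > 0 \ \text{on all of } I \ \text{or} \ f < 0 \ \text{on all of } I,
    \]

    because a sign change inside the interval would force a zero somewhere in between. So marking the zeros and the discontinuities and then checking one point per remaining interval really does settle the sign everywhere.
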
    (cont.)

  16. (cont.)

    I was actually fairly impressed with her in that she is not simply floundering around not knowing what is happening, nor is she just trying to demand “How do I get the answer?” without the patience to actually understand what she is doing. But it is just too much for her (or almost anyone) to fully absorb in the time frame she has.

    I’m giving her two main pieces of advice:
    A) She should recognize that she does basically understand the stuff and not panic.
    B) She needs to read ahead, see what is coming at her in the next few months, and prepare a bit so that she is not bowled over.

    I’m particularly concerned about the fundamental theorem of calculus, the various tricks for integration (substitution, integration by parts), and even the basic notations for integration – she did not know what an integral sign was until I showed her. When I explained the difference between the notation for definite vs. indefinite integral, she did immediately, and correctly, point out the connection with the notation used to indicate a function evaluated at a particular point, but she had no inkling of anything at all about integrals until I mentioned it to her.
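
    Concretely, the parallel she spotted is the one between

    \[
    \int_{a}^{b} f(x)\,dx = F(x)\Big|_{a}^{b} = F(b) - F(a)
    \qquad \text{and} \qquad
    \int f(x)\,dx = F(x) + C,
    \]

    where F is any antiderivative of f: the evaluation bar on the definite integral is exactly the "evaluate at these particular points" notation she already knew.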

    She’s bright, but no one has given her a hint at all as to what is coming at her.

    If anyone has any additional ideas as to how to help the kid, let me know. She is an A student who is getting Bs and Cs so far in calculus. I hate to see a kid who is bright and diligent and actually does basically understand the material nonetheless feeling that she is not measuring up.

    Allison also wrote:
    > Here's what I'm seeing over and over again: parents are beginning to recognize the complete horizontal slope in difficulty in their children's math curricula between grades 2-7. That is, the difficulty does not increase measurably.

    Yeah, I agree. I think there is a middle ground here – i.e., do not force even bright kids to take the AP calculus test in sixth grade (!), but do give at least a taste, as early as possible, of more advanced concepts.

    More broadly, starting to grasp the basic concepts and use them in simple problems has to precede by some significant time the achievement of full mastery of those concepts and the use of multiple abstract concepts in very challenging problems.

    Of course, as you and I can attest, the experience that my friend’s daughter is enduring might prepare her psychologically for life at MIT or Caltech!

    Dave
