Essay-Grading Software Offers Professors a Break
By JOHN MARKOFF
Published: April 4, 2013 | New York Times
I'm actually in favor of essay-grading software, in theory. I've been interested in automated essay scoring ever since reading Richard Hudson's paper Measuring Maturity in Writing (which I need to re-read, so nothing more on that at the moment):
Abstract

The chapter reviews the anglophone research literature on the 'formal' differences (identifiable in terms of grammatical or lexical patterns) between relatively mature and relatively immature writing (where maturity can be defined in terms of independent characteristics including the writer's age and examiners' gradings of quality). The measures involve aspects of vocabulary as well as both broad and detailed patterns of syntax. In vocabulary, maturity correlates not only with familiar measures of lexical diversity, sophistication and density, but also with 'nouniness' (not to be confused with 'nominality'), the proportion of word tokens that are nouns. In syntax, it correlates not only with broad measures such as T-unit length and subordination (versus coordination), but also with the use of more specific patterns such as apposition. At present these measures are empirically grounded but have no satisfactory theoretical explanation, but we can be sure that the eventual explanation will involve mental growth in at least two areas: working memory capacity and knowledge of language.

Maturity of writing, in this sense, can be measured by software, and I would be using automated scoring software myself if I could buy it on Amazon. EdX says it's giving the software away free to 'institutions' (does that leave out individuals?), so I'll have to see if my department might throw its hat in the ring.
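For concreteness, here's a rough sketch of what the simplest of those measures might look like in code: a type-token ratio for lexical diversity and Hudson's 'nouniness' (the share of word tokens that are nouns). It assumes NLTK's off-the-shelf tokenizer and part-of-speech tagger, and it's nothing like a real essay-scoring system, which would also need T-unit length, subordination, apposition, and the rest.

```python
# Sketch of two surface measures from Hudson's abstract:
# lexical diversity (type-token ratio) and 'nouniness'
# (proportion of word tokens that are nouns).
# Assumes NLTK's stock tokenizer and POS tagger; the exact
# resource names below vary slightly across NLTK versions.

import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

def maturity_measures(text: str) -> dict:
    # Keep only alphabetic word tokens (drop punctuation and numbers).
    tokens = [t for t in nltk.word_tokenize(text) if t.isalpha()]
    if not tokens:
        return {"type_token_ratio": 0.0, "nouniness": 0.0}
    # Penn Treebank tags: noun tags all start with 'NN' (NN, NNS, NNP, NNPS).
    tagged = nltk.pos_tag(tokens)
    nouns = sum(1 for _, tag in tagged if tag.startswith("NN"))
    return {
        "type_token_ratio": len({t.lower() for t in tokens}) / len(tokens),
        "nouniness": nouns / len(tokens),
    }

print(maturity_measures("The committee's rejection of the proposal surprised nobody."))
```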
That said, a lot of this is nonsense:
Anant Agarwal, an electrical engineer who is president of EdX, predicted that the instant-grading software would be a useful pedagogical tool, enabling students to take tests and write essays over and over and improve the quality of their answers. He said the technology would offer distinct advantages over the traditional classroom system, where students often wait days or weeks for grades.
[snip]
“It allows students to get immediate feedback on their work, so that learning turns into a game, with students naturally gravitating toward resubmitting the work until they get it right,” said Daphne Koller, a computer scientist and a founder of Coursera.
[snip]
“One of our focuses is to help kids learn how to think critically,” said Victor Vuchic, a program officer at the Hewlett Foundation. “It’s probably impossible to do that with multiple-choice tests. The challenge is that this requires human graders, and so they cost a lot more and they take a lot more time.”
None of these things is going to happen. Students aren't going to write essay responses "over and over again"; if they do write essay responses over and over again, it's not going to feel like a fun game; and nobody's going to learn to think critically from automated essay scoring software.

Oy.