“edX, a nonprofit enterprise founded by Harvard and the Massachusetts Institute of Technology, will release automated software that uses artificial intelligence to grade student essays and short written answers.” John Markoff, New York Times
There has been much discussion this week among educators about the idea of robo-grading, or machine grading, prompted by the New York Times article Essay-Grading Software Offers Professors a Break, from which the quote above is excerpted. To date, over 1,000 comments have been posted to the article, most vehemently opposing the idea of automated grading. Quite by coincidence, I posted an article on this blog, Four Reasons Why Students Need Instructor Feedback in Online Courses, that emphasizes the value of instructor feedback specifically in online courses, and I stressed why MOOCs won't cut it.
My argument is that undergraduate students need constructive, specific feedback to develop their writing and critical thinking skills, and that a massive course such as a MOOC cannot provide it. My view contrasts starkly with that of the president of edX, Dr. Agarwal, who is convinced that students can learn from, and develop writing skills in, a MOOC setting with feedback via automated grading. It is the immediate feedback that is useful, Agarwal states, because students are able to “take tests and write essays over and over and improve the quality of their answers” (Markoff, 2013). Hmmm. While I do agree that immediate feedback supports the conditions required for learning, I don’t see students being motivated to rewrite an essay again and again.
How Does Automated Grading Affect Student Motivation?
In response to the NYT article, Elijah Mayfield, founder of LightSIDE Labs and developer of a computer program that uses “machine learning to automatically assess written text,” wrote a post for e-Literate discounting the claims outlined in the NYT article. His post generated over 50 comments, mostly from university professors opposing the robo-grader concept. I have minimal experience with machine grading, and my comments on Mayfield’s post took a different (perhaps less informed) approach, focusing more on the conditions of learning. My concerns center on students’ perception of automated grading and their willingness to consider it valuable, as well as its effect on student motivation and thus on potential learning. Two of my recent posts, here and here, reference research studies that support explanatory and constructive feedback from instructors.
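For readers curious about what “machine learning to automatically assess written text” can look like under the hood, here is a minimal, purely illustrative sketch: extract simple text features (TF-IDF) from essays that humans have already scored, fit a regression model, and use it to score a new essay instantly. The essays and scores below are invented, and real systems such as LightSIDE use far richer features and models; this is only a toy to make the concept concrete.

```python
# Toy sketch of automated essay scoring: TF-IDF features + ridge regression.
# All essays and scores here are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge

# Training data: essays with hypothetical human-assigned scores (scale of 1-5)
essays = [
    "The causes of the war were complex and rooted in economic rivalry.",
    "War is bad. People die. The end.",
    "Economic rivalry, alliance systems, and nationalism all contributed.",
    "I like history class because it is interesting.",
]
human_scores = [5.0, 2.0, 5.0, 2.5]

# Learn word-weight features from the scored essays, then fit a regressor
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(essays)
model = Ridge(alpha=1.0).fit(X, human_scores)

# "Grade" a new essay immediately -- the instant feedback a robo-grader offers
new_essay = ["Alliance systems and nationalism drove the conflict."]
predicted = model.predict(vectorizer.transform(new_essay))
print(f"Predicted score: {predicted[0]:.2f}")
```

Note what the sketch makes obvious: the model rewards surface word patterns it has seen before, not reasoning; a human reader judging the argument itself is doing something quite different.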
Below is the comment I posted in response to Mayfield’s post Six Ways the edX Announcement Gets Automated Essay Grading Wrong on e-Literate.
Thank you, Elijah, for this in-depth post. Questions I have: how do students perceive machine grading? And how much research has been done on its impact on learning performance and motivation?
I wonder what the implications are (or will be) for students’ motivation and for the quality of their effort and work. Students spend time writing essays, some more than others, and for students to know that a real person will not be reading their essay could affect many processes. My teenagers have been exposed to automated grading periodically at their high school, and they both strongly dislike it (despise is a more fitting term). They discount its value completely. I predict that teenagers and young college students will not be receptive to this type of grading. Why should they spend hours researching, writing, and re-writing an essay when they know no one (a real person) will even read it? Even more so in a MOOC that is not for credit: why on earth would you write an essay for an automated grader?
For large-scale classes, as you discuss in your post, peer grading would be a far more valuable exercise and learning experience for students than machine grading. Two studies I have read show 20 to 25% grade inflation with peer grading, but the learning for both sides, peer and student, is far more meaningful in my opinion.
I am all for technological advancements, yet at some point are we not going too far, and when will that be? (A rhetorical question). However, I do look forward to reading further and learning more about this method. Thank you for the thought-provoking post. Debbie
Response from Elijah Mayfield:
Debbie – There are mixed results in the literature, but most of all they point to a negative impression from students if they’re working purely alone, even if writing skill does go up. However, if automated technology is being used in a collaborative setting, scaffolding the interaction, we see almost the opposite effect – compared to a control it increases student satisfaction with the learning experience, and their own self-efficacy, even if the learning gains on top of that collaborative process are modest…
Mayfield’s response is fair and honest, and I appreciate his willingness to engage in discussion with readers who commented and expressed skepticism, if not criticism, of his program. I encourage readers who are interested in learning more about the topic to read the post and the discussion that follows it.
Let’s Think about This More…
I want to learn more about the idea of machine grading, and am eager to review feedback from students after edX implements the grading software that Agarwal speaks of in the NYT article. Though I remain skeptical, I’m keeping my mind open. As mentioned, I am most concerned about its implications for student motivation, and the potential long-term effects on learning should machine grading become the norm. There is an emotional side to this story: the idea of students making personal connections and feeling that their writing is of value when writing for a real person. Can the joy of writing be fostered when writing for a machine?
- Six Ways the edX Announcement Gets Automated Essay Grading Wrong, Elijah Mayfield, (2013), e-Literate
- Robot Eyes as Good as Humans when Grading Essays, Melissa Block, (2012), NPR
- Tossing Sabots into the Automated Essay Grading Machine, Audrey Watters, (2012)
- Four Reasons Why Students Need Instructor Feedback in Online Courses, Online Learning Insights
- Better Tests, More Writing, Deeper Learning, (2012), GettingSmarter.com
- Essay-Grading Software Offers Professors a Break, John Markoff, (2013), New York Times
Image credit: Mike Licht, NotionsCapital.com’s photostream (Flickr)