Giving Feedback to Students: Instructor vs. Machine

“edX, a nonprofit enterprise founded by Harvard and the Massachusetts Institute of Technology, will release automated software that uses artificial intelligence to grade student essays and short written answers.”  John Markoff, New York Times


There has been much discussion this week among educators about the idea of robo-grading, or machine grading, prompted by the New York Times article Essay Grading Software Gives Professors a Break, from which the quote above is an excerpt. To date, over 1,000 comments have been posted to the article, most vehemently opposing the idea of automated grading. Quite by coincidence, I posted an article on this blog, Four Reasons Why we Need Instructor Feedback in Online Courses, that emphasizes the value of instructor feedback specifically in online courses, and I stressed why MOOCs won’t cut it.

My argument is that undergraduate students need constructive and specific feedback to develop their writing and critical thinking skills, and a massive course such as a MOOC cannot provide it. My view contrasts starkly with that of the president of edX, Dr. Anant Agarwal. Agarwal is convinced that students can learn from, and develop writing skills in, a MOOC setting with feedback via automated grading. It is the immediate feedback that is useful, Agarwal states, noting that students are able to “take tests and write essays over and over and improve the quality of their answers” (Markoff, 2013). Hmmm—while I do agree that immediate feedback supports the conditions required for learning, I don’t see students being motivated to rewrite an essay again and again.

How Does Automated Grading Affect Student Motivation?

In response to the NYT article, Elijah Mayfield, founder of LightSIDE Labs, developed a computer program that uses “machine learning to automatically assess written text“. Mayfield wrote a post for e-Literate discounting the claims outlined in the NYT article, which generated over 50 comments, mostly from university professors opposing the robo-grader concept. I have minimal experience with machine grading, and my comments on Mayfield’s post took a different (perhaps less informed) approach, focusing more on the conditions of learning. My concerns center on students’ perception of automated grading and their willingness to consider it valuable, as well as its effect on student motivation and, in turn, on potential learning. Two of my recent posts, here and here, reference research studies that support explanatory and constructive feedback from instructors.

Below is the comment I posted in response to Mayfield’s post Six Ways the edX Announcement Gets Automated Essay Grading Wrong on e-Literate.

Thank you Elijah for this in-depth post. Questions I have: how do students perceive machine grading? And how much research has been done on its impact on learning performance and motivation?

I wonder what the implications are (or will be) for students’ motivation, and the quality of their effort and work. Students spend time writing essays, some more than others, yet knowing that a real person will not be reading their essay could affect many processes. My teenagers have been exposed to automated grading periodically at their high school and they both strongly dislike it (despise is a more fitting term). They discount its value completely. I predict that teenagers and young college students will not be receptive to this type of grading. Why should they spend hours researching, writing and re-writing an essay when they know no one (a real person) will even read it? Even more so in a MOOC that is not for credit: why on earth would you write an essay for an automated grader?

For large-scale classes, as you discuss in your post, peer grading would be a far more valuable exercise and learning experience for students than machine grading. Two studies I have read show that there is 20 to 25% grade inflation with peer grading, but the learning for both sides, reviewer and writer, is far more meaningful in my opinion.

I am all for technological advancements, yet at some point are we not going too far, and when will that be? (A rhetorical question.) However, I do look forward to reading further and learning more about this method. Thank you for the thought-provoking post. Debbie

Response from Elijah Mayfield:

Debbie – There are mixed results in the literature, but most of all they point to a negative impression from students if they’re working purely alone, even if writing skill does go up. However, if automated technology is being used in a collaborative setting, scaffolding the interaction, we see almost the opposite effect – compared to a control it increases student satisfaction with the learning experience, and their own self-efficacy, even if the learning gains on top of that collaborative process are modest…

Mayfield’s response is fair and honest, and I appreciate his willingness to engage in discussion with readers who commented and expressed skepticism, if not criticism, of his program. I encourage readers who are interested in learning more about the topic to read the post and the discussion that follows it.

Let’s Think about This More…

I want to learn more about the idea of machine grading, and am eager to review feedback from students after edX implements the grading software that Agarwal speaks of in the NYT article. Though I remain skeptical, I’m keeping my mind open. As mentioned, I am most concerned about its implications for student motivation, and the potential long-term effects on learning should machine grading become the norm. There is an emotional side to this story: students making personal connections and feeling that their writing is of value when writing for a real person. Can the joy of writing be fostered when writing for a machine?


Image credit: Mike Licht’s photostream (Flickr)

‘Speaking to Students’ with Audio Feedback in Online Courses

In this post I’ll share how to give meaningful and constructive feedback to students on assignments, presentations, and other work by using voice-recorded files.

Research suggests that students want specific and detailed feedback from their instructors (Balaji & Chakrabarti, 2010). Who wouldn’t? It is disappointing for students to receive few or no comments from their instructor after investing hours researching and writing a paper. Even more disconcerting to some students is receiving a below-par grade with little explanation or constructive feedback—in online classes even more so, given the lack of personal contact. This is why voice feedback is much appreciated in online classes; most students welcome this type of response. I had one professor in grad school who provided audio feedback on our assignments, which I appreciated and looked forward to very much, even when it was not all positive. Not only did five minutes of feedback pack a lot of punch, but it felt personal, and I found myself putting extra effort into all assignments in his class. This is not to say written feedback is not valued, but voice is particularly impactful in our text-based world. Given the value to students, the time-saving benefits to instructors, and the new tools that are easy to use, I suggest all instructors consider giving feedback to students using recorded voice files. I’ll share how here.

Results of ‘Explanatory Feedback’ Study at Duke University
I won’t elaborate here about the value of feedback, as I delved into this topic in my last post. However, before I identify feedback tools and methods, I would like to share a study published recently in the Journal of Educational Psychology regarding the transfer of learning that occurs with certain types of feedback (Butler, Godbole & Marsh, 2012). The study examined the transfer of learning that occurred (or did not) with three types of feedback: 1) correct answer, 2) no feedback, and 3) explanatory feedback. Learning was measured at three levels: recognition, recall, and application. Results demonstrated that both correct-answer and explanatory feedback supported recall, but explanatory feedback enabled learners to comprehend the concepts more deeply and to apply the knowledge to new contexts. Next we cover how to give explanatory feedback that is rich and detailed, and goes beyond the robo-grader.

The Method to Giving Audio Feedback
There may be a learning curve to providing this type of feedback to students, but it’s a short one. One might feel self-conscious at first, but after one or two recordings it becomes far more comfortable (Using Audio Feedback Case Study, 2010, YouTube). And the voice recording does not need to be polished or perfect—pauses are okay, according to a professor at the University of New South Wales who describes the method and tool he uses in the case study video [Learning to Teach Online series]. Though the tool he uses in the video is outdated, the method is not. In this scenario, the professor provides feedback in a voice recording after he reviews the student assignment; the assignment is usually read online or onscreen—no hard copies. The professor makes a few notes while reviewing the student’s work, records his feedback immediately, and sends it to the student.

The Tools for Providing Audio Feedback
What better way to provide personalized feedback than with audio? I’ve reviewed two tools below; there are several more, but the ones here are easy to use — record and send.

1) Voice Recorder App on Smartphone. There are many free apps available. I chose Voice Record Pro from the iTunes store, as it has a 4.5/5 rating. It’s easy to use: I simply open the app, hit record, stop when I’m done, and send.

Before sending the file, I can listen to it, delete it, or save it to Dropbox or SkyDrive [other options available]. When ready, with one click it can be sent to the student. The file is in MP4 format, which the student can download and then listen to. Easy. A copy of the file is saved, though I suggest emailing yourself a copy in order to archive it accordingly. There are also other options available for editing and/or changing the file format.

2) Evernote—an excellent, free app that is a favorite of mine. It does much more than provide audio feedback, but I’ll focus on using it for audio feedback in this post. One of the educators I follow on Twitter, a professor, introduced me to Evernote in a Tweet where she explained how she discovered using the app to record audio feedback for students that could be sent via email. Brilliant! The professor wrote about Evernote on her blog here.

I’ve also included screen shots of how to record a note in three easy steps.

Three Steps to Audio Feedback with Evernote:

Step 1: After creating a new note for a student (the sample student here is Nina), click the mic icon, as highlighted in the screenshot.

Step Two:

Step 2: Once you click the mic button you can record, then click ‘save’. You can also pause during the recording process and resume again.

Step Three:

Step 3: This is what the message will look like after the voice recording is saved. Then click on the arrow in the top right-hand corner.

After clicking the arrow, there are choices (see image); in this case, email note is the method. Note the other options available that can be used for alternative instructional methods, e.g. sending a recorded message [reminder or announcement] to the class via Twitter by using the class #hashtag (if you have one).

Audio feedback is an excellent way to connect with students and provide feedback that is both constructive and meaningful, and it can promote intellectual development and critical thinking. For those readers who are instructors, I do suggest giving this method a try. You’ll see how easy it is, and how much students appreciate it.

If you have methods that have worked for you, or comments on audio feedback that might benefit others, please share!


Butler, A. C., Godbole, N., & Marsh, E. J. (2012). Explanation feedback is better than correct answer feedback for promoting transfer of learning. Journal of Educational Psychology. doi: 10.1037/a0031026

What Students really think about Online Learning

What do students really think about online learning—do they love it? Hate it? Numerous reports and articles about online learning provide data on enrollment rates, perceived learning and more. But it’s the unedited, raw comments of actual students, as I wrote about in my last post, that are invaluable to course instructors, designers and online educators. In this post I’ll share student comments, along with suggestions for supporting students in light of their feedback.

Below is a collection of [select] student feedback from anonymous submissions to online surveys given at the end of fully online, for-credit courses. Responses are unedited; they are telling of how students really feel. Feedback is grouped into four categories: 1) interaction/learning community, 2) technical, 3) course design/structure, and 4) learning environment.

Overview of Program
Below you’ll find the most representative responses from 115 student feedback forms out of a possible 236 students (49% response rate). Also note, feedback is from a small program—15 general education courses, for credit, with video lectures as the main content delivery method. The LMS platform is Moodle. The responses below are to the open-ended questions ‘what did you like least about the course’ and ‘what did you like best about the course’.

Learning Environment

  • “..the online environment, definitely will be taking courses in person for [next] semesters” [response to ‘what you like least…’]
  • “The online environment, I tried it but I will most definitely be taking courses in person from now on. I have found that I struggle with time management and would benefit from scheduled class meetings.”
  • “it was a little rough at the beginning to understand the instructions. A little more clarity at the beginning would be helpful.”
  • “I would benefit a lot more from scheduled class meetings, personally struggled with time management.”
  • “I liked the course because it was easier to complete all of the work. I was able to do everything on my own time with a deadline at the end of the week. I also liked the weekly discussion boards.”
  • “that i could take it at my own pace”
  • “I really appreciated this course being offered in the online format. It allowed me to fit it into my schedule easily…”

Interaction/learning community

  • “I hate the class discussions, but I understand its to make it more interactive but I wish they weren’t there. They could just be assignments.”
  • “I loved the personal classroom feel of the videos and the message board posting assignments.”
  • “The lack of ability to interact.”
  • “..Well it is online. So I wish I had more live interaction with students even if was view a Skype class time.”
  • “It was an efficient alternative to taking the class physically.”
  • “The group project. I chose this course because I don’t have time, working, school, and running a household to work with other people. I work best alone and was not able to participate much.”

Course Design/Structure

  • “I would want the flow of the course to be more smooth and logically structured rather than separated by blocks and topics that must be covered. Also, I wish some questions on quizzes were more clear and not misleading.”
  • “Some of the lectures were long, but that’s just me and my attention span.”

Technical Issues

  • “I wish I could have been able to download the videos so I could watch more of them in the time I had. It would have been nice if I could put them on the mobile device.”
  • “I enjoyed watching the videos at my convenience but I wish I could have downloaded them since the local weather often interfered with my ability to stream the videos.”
  • “…please make more of the videos downloadable for mobile devices like Kindle fire, androids, iPad, etc.”
  • “….online program is good, but honestly and with all respect, not great. I was not able to get any mobile application (iPad) to work ….that is a major setback in that many online students take courses because of the flexibility it offers”


Key Takeaways

  • Students are demanding mobile learning options.
  • Time management appears to be a factor for those struggling with online learning.
  • Numerous students mention the flexibility of online learning as a positive factor.
  • The majority of students want interaction and personal connection.
  • Effective course design is needed for clarity of instructions and ease of navigation within the course environment (being able to find resources and instructions easily).

Related Post:
Click here for previous post, How unfavorable student feedback improves online courses, which provides tips and resources for creating student feedback surveys.

How [unfavorable] student feedback improves Online Courses

I look forward to the end of each course session, when I get to analyze student comments and criticisms – which I do with great zeal. No doubt like anyone else, I enjoy reading the positive responses, which can be uplifting and encouraging. But it’s the constructive feedback I dissect to determine how we can improve our courses. I’ve learned that a key factor in gathering ‘good’ feedback is developing a ‘good’ feedback form – one that is customized to the online experience. The questionnaire also needs to be modified on a consistent basis as courses evolve and change – online courses are not static, nor should the questionnaire be. In this post I’ll share the questions we’ve used to solicit ‘good’ feedback and how it has helped improve our courses, and I’ll close with a list of resources for creating effective feedback forms in online learning environments.

Mid-course vs. End-of-course Evaluations
Two types of evaluation are most commonly used for gathering student feedback: the mid-course tune-up, which is formative in nature and allows the instructor to make adjustments to improve the remaining weeks of the course, and the end-of-course evaluation [summative], which is the focus of this post. That said, mid-course feedback can be most helpful for instructors, as it provides an opportunity to ‘tweak’ their interaction with students and/or make minor adjustments to the course design.

Key features of mid-course evaluations are that they are delivered midway through the course, are brief [no more than 4 questions], use open-ended questions, and are anonymous.

We’ve yet to implement mid-course evaluations, though when I was a graduate student I completed mid-course ‘tune-ups’ at the halfway point of a given course. Questionnaires were anonymous and brief, consisting of three open-ended questions:

  1. What is working well for you in the course?
  2. What could improve this course and make learning more effective [reasonable and feasible for the current course]?
  3. Do you have any other suggestions for this current or future course?

Stanford University’s Center for Teaching and Learning has some good resources for midterm student feedback.

Purpose of Student Feedback: to Improve Online Learning
Given that online instruction is somewhat new to our institution and to our course instructors, getting [anonymous] feedback from students, the end users, is most helpful. Using a feedback form designed for a face-to-face class did not work (this is what our program started with), hence

… we include questions to solicit feedback about factors unique to the online experience: clarity of instructions, student perception of instructor involvement and presence, access to resources and help when needed, peer interaction…

The ‘best’ questions that get the worst feedback…
We want feedback that is critical, yet helpful and useful for improving our course design, course quality and students’ learning experience. Currently we are working with a student feedback form that consists of 18 questions: 15 mostly quantitative in nature, using Likert scales and other nominal rating methods, and 3 open-ended questions. Below is a selection of the questions that have proved most helpful.

Questions about Course Design – using a 5-point Likert scale

  • The coursework that I needed to complete for each module was easy to locate and clearly described. (‘1’ not at all easy and ‘5’ very easy to find).
  • Rate the ‘user friendliness’ of the course home page. (‘1’ is not at all user friendly and ‘5’ very user friendly).
  • The quizzes, exams, tests, and/or assignments were fair assessments of the learning objectives described for the course. (Likert scale)
  • The video lectures (audio and/or video) were effective in communicating the course concepts and content. (‘1’ not at all effective and ‘5’ very effective).
  • I spent the following number of hours on average each week on course activities, including reading, working on assignments, watching lectures and taking exams/quizzes. (ranges of hours, e.g. 10 to 12, etc. – 4 options)

Perceived Learning by Student / Academic Rigor / Instruction

  • How much did you learn in this course? [choice of four responses]
  • I was challenged to think critically in this course. (‘1’ is not at all challenged and ‘5’ very challenged).
  • The course instructor’s involvement in the course through class discussion forums, individual communication or professor news board posts was… (Likert scale)
  • The instructor provided meaningful and timely feedback on assignments.

Open Ended Questions
It is the open-ended questions that have proven to provide the most valuable feedback. The student comments have provided insight into how students perceive online learning, the challenges they face, and why they like it or don’t like it. Several students have provided suggestions that made us ask, ‘why didn’t we think of that?’ Three simple, straightforward questions:

  • What did you like best about the course?
  • What did you like least about the course?
  • Please provide any comments or suggestions you think might improve ____ ‘s course and/or the online program.

What we’ve changed as a result of Feedback
Each session we’ve modified our program and made additions to it. Highlights include:

  • Added a module schedule – in response to student requests for having all due dates in one place: one web page within the course home page
  • Created student orientation activities which include a questionnaire covering key information students need to know to navigate/complete the course
  • Reworded and clarified instructions for assignments, weekly activities
  • Added rubrics – students asked for clearer grading guidelines, expectations
  • Worked with faculty to create enhanced discussion forum questions that stimulate students to use higher-order thinking skills
  • Faculty now add weekly updates to the professor news board on the course home page, in response to students wanting increased professor involvement
  • We are currently working on a time management self-study program for students – numerous student comments mentioned challenges with managing their time

Each session we find something new to improve, and our decision to focus on any given area depends upon several factors: the number of student comments, faculty input, technological advances [we just added a mobile version of our lecture videos and want feedback from students] and feasibility. It’s an ongoing, dynamic process, just as learning is, and we’re trying to keep up.

To create your own survey, try Survey Monkey.

Sample student questionnaire from University of Wisconsin, click here

Helpful page on Vanderbilt’s site: sample questions, how to analyze feedback, improve courses and more, click here

More about Likert Scales, click here