I look forward to the end of each course session, when I get to analyze student comments and criticisms, which I do with great zeal. No doubt like anyone else, I enjoy reading the positive responses, which can be uplifting and encouraging. But it's the constructive feedback I dissect to determine how we can improve our courses. I've learned that a key factor in gathering 'good' feedback is developing a 'good' feedback form, one customized to the online experience. The questionnaire also needs to be revised regularly as courses evolve and change; online courses are not static, nor should the questionnaire be. In this post I'll share questions we've used to solicit 'good' feedback and how they've helped improve our courses, closing with a list of resources for creating effective feedback forms in online learning environments.
Mid-course vs. End-of-course Evaluations
Two types of evaluation are most commonly used for gathering student feedback: the mid-course tune-up, which is formative in nature and allows the instructor to make adjustments to improve the remaining weeks of the course, and the end-of-course evaluation [summative], which is the focus of this post. That said, mid-course feedback can be most helpful for instructors, as it provides an opportunity to 'tweak' their interaction with students and/or make minor adjustments to the course design.
Key features of mid-course evaluations are that they are delivered midway through the course, are brief [no more than 4 questions], use open-ended questions, and are anonymous.
We’ve yet to implement mid-course evaluations, though when I was a graduate student I completed mid-course ‘tune-ups’ at the halfway point of a given course. The questionnaires were anonymous and brief, consisting of three open-ended questions:
- What is working well for you in the course?
- What could improve this course and make learning more effective [reasonable and feasible for the current course]?
- Do you have any other suggestions for this current or future course?
Stanford University’s Center for Teaching and Learning has some good resources for midterm student feedback.
Purpose of Student Feedback – to improve Online Learning
Given that online instruction is somewhat new to our institution and to our course instructors, getting [anonymous] feedback from students, the end users, is most helpful. Using a feedback form designed for a face-to-face class did not work (this is what our program started with), so we focused on questions that solicit feedback about factors unique to the online experience: clarity of instructions, student perception of instructor involvement and presence, access to resources and help when needed, and peer interaction.
The ‘best’ questions that get the worst feedback…
We want feedback that is critical yet helpful, and useful for improving our course design, course quality and students’ learning experience. Currently we are working with a student feedback form consisting of 18 questions: 15 mostly quantitative in nature, using Likert scales and other nominal rating methods, and 3 open-ended. Below is a selection of the questions that have proved most helpful.
Questions about Course Design – using a 5-point Likert Scale
- The coursework that I needed to complete for each module was easy to locate and clearly described. (‘1’ = not at all easy to find, ‘5’ = very easy to find)
- Rate the ‘user friendliness’ of the course home page. (‘1’ = not at all user friendly, ‘5’ = very user friendly)
- The quizzes, exams, tests, and/or assignments were fair assessments of the learning objectives described for the course. (Likert scale)
- The video lectures (audio and/or video) were effective in communicating the course concepts and content. (‘1’ = not at all effective, ‘5’ = very effective)
- I spent the following number of hours on average each week on course activities, including reading, working on assignments, watching lectures and taking exams/quizzes. (ranges of hours, e.g. 10 to 12; 4 options)
Perceived Learning / Academic Rigor / Instruction
- How much did you learn in this course? [choice of four responses]
- I was challenged to think critically in this course. (‘1’ = not at all challenged, ‘5’ = very challenged)
- The course instructor’s involvement in the course through class discussion forums, individual communication or professor news board posts was… (Likert scale)
- The instructor provided meaningful and timely feedback on assignments.
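Once responses to Likert items like those above are collected, even a quick tally can surface trouble spots before reading every comment. A minimal sketch in Python, using made-up ratings for a single hypothetical item (the averages and the 'top-two box' share, the proportion of students answering 4 or 5, are common ways to summarize this kind of data):

```python
from collections import Counter
from statistics import mean

# Hypothetical responses to one 5-point Likert item (1 = low, 5 = high)
responses = [4, 5, 3, 4, 5, 2, 4, 5, 4, 3]

counts = Counter(responses)          # how many students chose each rating
avg = mean(responses)                # overall average rating

# Share of students who rated the item 4 or 5 ("top-two box")
top_two = sum(1 for r in responses if r >= 4) / len(responses)

print(f"Average rating: {avg:.1f}")          # → 3.9
print(f"Top-two-box share: {top_two:.0%}")   # → 70%
for rating in range(1, 6):
    print(f"  {rating}: {counts.get(rating, 0)} responses")
```

A recurring average below the midpoint, or a low top-two-box share, flags an item worth pairing with the open-ended comments for explanation.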
Open-Ended Questions
It is the open-ended questions that have proven to provide the most valuable feedback. Student comments have offered insight into how they perceive online learning, the challenges they face, and why they like it, or don’t. Several students have provided suggestions that made us ask, ‘why didn’t we think of that?’. Three simple, straightforward questions:
- What did you like best about the course?
- What did you like least about the course?
- Please provide any comments or suggestions you think might improve ____’s course and/or the online program.
What we’ve changed as a result of Feedback
Each session we’ve made changes and additions to our program. Highlights include:
- Added a module schedule, in response to student requests to have all due dates in one place: a single web page within the course home page
- Created student orientation activities which include a questionnaire covering key information students need to know to navigate/complete the course
- Reworded and clarified instructions for assignments, weekly activities
- Added rubrics – students asked for clearer grading guidelines, expectations
- Worked with faculty to create enhanced discussion forum questions that stimulate students to use higher-order thinking skills
- Faculty now add weekly updates to the professor news board on the course home page, in response to students wanting increased professor involvement
- We are currently working on a time management self-study program for students; numerous student comments mentioned challenges with managing their time
Each session we find something new to improve, and our decision to focus on any given area depends on several factors: the number of student comments, faculty input, technological advances [we just added a mobile version of our lecture videos and want feedback from students] and feasibility. It’s a dynamic, ongoing process, just as learning is, and we’re trying to keep up.
- To create your own survey: Survey Monkey
- Sample student questionnaire from the University of Wisconsin
- Helpful page on Vanderbilt’s site: sample questions, how to analyze feedback, improve courses and more
- More about Likert scales