Tag Archives: MOOC pedagogy

How (Not) to Design a MOOC: Course Design Scenarios From Four xMOOCs

This post examines four MOOCs that I completed as a student and then debriefed from a course design perspective. I share insights into what worked and what didn't, with the goal of helping educators create better online learning experiences.

I recently completed two MOOCs on the edX platform that are part of a mini-series on education policy. The courses are prime examples of how higher education institutions misuse the MOOC format by relying on traditional teaching methods that end up falling flat. I debrief the two MOOCs from a course design perspective and share why they were subpar and uninspiring. I also describe two other MOOCs that provided exemplary learning experiences. Together, the two pairs of MOOCs provide instructive examples of contrasting course design approaches.

This post follows “How to Make Bad Discussion Questions Better: Using a Case Study of an edX MOOC”, which examined the first MOOC of the mini-series, “Saving Schools: History, Politics, and Policy in U.S. Education”. In that post I used actual discussion questions from the MOOC’s forums as examples of how not to write questions to foster student discussion, then rewrote the questions in better and best formats that would be more likely to encourage meaningful dialogue.

The second edX MOOC, “Saving Schools: History, Politics and Policy in U.S. Education: Teacher Policy”, wrapped up this week (December 4). Both MOOCs followed an identical course structure: recorded video lectures that relied on an interview format featuring one (sometimes two) faculty member(s), two assigned readings per week (from the same source), one discussion question each week, and a final exam. This format is typical of xMOOCs, one that tries to mimic the in-class experience.

Screen shot of instructions for the final assignment, a digital artifact, in E-learning and Digital Cultures. My digital artifact created for that assignment appears at the end of this post.

Exemplary MOOCs
The other two MOOCs used a non-traditional design approach. They took advantage of what the MOOC format could offer by acknowledging its uniqueness and providing content from a variety of sources outside the MOOC platform. They also utilized a range of assessment methods and included social media that encouraged interaction. Both MOOCs, Introduction to Sociology and E-learning and Digital Cultures (from Coursera), inspired and promoted thought. The learner was viewed as a contributor, not a recipient.


Introduction video of Professor Duneier for his course on Coursera (2012). Duneier later pulled the course from Coursera over concerns about licensing his course for other institutions’ use.

E-learning and Digital Cultures featured YouTube videos rather than lecture videos to demonstrate course concepts, along with articles, mostly from academic journals. The learning experience closely resembled a cMOOC (the original MOOC format developed by Downes and Siemens), one that leverages sources on the web, shares student blogs and views students as contributors. Introduction to Sociology featured two video formats. One featured Professor Duneier, not lecturing but sitting in an armchair (above), talking and sharing course-related experiences; he acknowledged learners (some by name) and encouraged student interactivity. The other was live (and recorded) on Google’s Hangout platform, with eight students and Duneier leading a seminar discussion.

Course Design Shortcomings of the edX MOOCs
The purpose of the following discussion about the edX MOOCs is not to criticize the course designers or faculty, but to consider the MOOCs as learning opportunities. Doing so aligns with one of the goals of edX, to use the platform to advance teaching and learning.

Learning/instructional methods: The MOOCs relied upon mostly traditional methods of instruction: lectures and multiple-choice assessments. Content was instructor-centered, limited to lectures (featuring the faculty member), textbook readings (from a book written by the same faculty member), and articles from one source, Education Next, of which the same faculty member is editor-in-chief.

  • The edX MOOCs would benefit from the inclusion of open resources, with links to outside sources showing various perspectives, as well as social media platforms where students could engage with content experts or static content, share content sources, and/or contribute their own content creations (blog posts, etc.)
  • Learning was confined to a virtual, walled classroom inside the MOOC platform.

MOOCs that provide focus and structure for students, by including goals or focus questions, allow students to shape and customize their own learning accordingly.

Course Objectives: There were no learning goals outlined for the MOOCs. There didn’t appear to be a focus for each week, or guiding questions to provide structure. Granted, learners should create their own learning objectives when working within a MOOC, though a stated focus or general goals for the course allow learners to establish and shape their own learning goals. E-learning and Digital Cultures provided an overview of the course that outlined the focus for each unit of study, and each week included focus questions to consider.

Rigor: Course rigor was low, which was disappointing given that the institution behind the MOOC was Harvard. It’s worth noting that at edX’s launch in 2012, the Provost of MIT at the time, L. Rafael Reif, emphasized the rigor and quality of courses on edX’s platform: ”(edX courses need) not to be considered MIT Lite or Harvard Lite. It’s the same content” (MIT News). Yet the discussion questions (as outlined in my first post), the biased readings, the lectures and the application activities for students did not add up to a rigorous learning experience that encouraged critical thinking. Several factors may have contributed. Suffice it to say that the course design team would have benefited from someone with a high level of expertise in effective course design principles, knowledge of learning theories and instructional methods.

Content: As mentioned, the majority of the content was limited to the faculty member’s lectures, two or three chapters of a book authored by the same faculty member, and essays from one source.

  • Biased resources did not encourage learners to consider multiple perspectives, though in the second MOOC course facilitators made an effort to incorporate other perspectives in the discussion forums.
  • Lecture videos were long, typically 12 to 15 minutes. Research on MOOC videos suggests the ideal length is 4 to 6 minutes (Guo, 2013).
  • Content was repetitive. Content from the readings was also included in the lectures, and frequently two interviews in the same lecture covered the same content.
  • Delivery methods were repetitive and uninspiring.
  • Content came across as telling, not interactive.

Application activities: There were few activities for learners to engage in except for the discussion forums. Unfortunately, the questions in the first MOOC did not encourage robust discussion, though they improved in the second course. There were two or three multiple-choice questions after each video, several of which could be considered common knowledge; I could have answered the majority of them without watching the videos.


Screen shot of a forum discussion question from the MOOC “Saving Schools: History, Politics and Policy in U.S. Education”: a close-ended question, and one not likely to stimulate thoughtful discussion. In my previous post, “How to Make Bad Discussion Questions Better”, I provide examples of more effective question formats.

Conclusion
The two pairs of MOOCs illustrate how varied approaches to MOOC course design significantly impact engagement levels, perceptions and learning outcomes. The edX MOOCs examined here, typical of the majority of MOOCs, relied upon learning methods that failed to leverage the benefits of an open platform and failed to view students as knowledge sources and contributors. Over time the MOOC format will no doubt settle into something quite different from what we’re experiencing now: a format that will find its purpose, engage learners and build bodies of knowledge that benefit all.


MOOC Design Tips: Maximizing the Value of Video Lectures

“Which kinds of videos lead to the best student learning outcomes in a MOOC?”
How Video Production Affects Student Engagement: An Empirical Study of MOOC Videos (Guo, Kim & Rubin, 2014)

An excellent question that design teams and instructors of MOOCs want answered: which kinds of videos lead to the best student learning outcomes in a MOOC? According to a recent study conducted by researchers for the edX MOOC platform, this was the most pressing question posed by the course design teams working with its partner institutions. Given that most MOOCs offered by higher education institutions through platforms such as edX, iVersity, or Coursera use video lectures as the primary content delivery method, it is a critical question that preoccupies many if not most MOOC instructional design teams. Adding to this need-to-know element is the fact that video production is most often the highest cost associated with MOOC production; it can range from a few hundred dollars up to many thousands. This post suggests how institutions can use resources effectively in the video production process, with the primary goal of supporting students’ learning outcomes.

The report released by edX last week gives design teams some concrete data to examine. Below I’ve emphasized the recommendations and practical application points from the paper for readers who might be part of a design team for a MOOC or online course, or who have an interest in producing instructional videos. There are limitations to the study, outlined in the paper, though the depth of the analysis does provide data worthy of consideration.

The report, the first of its kind according to the authors Guo, Kim & Rubin, analyzes students’ engagement with lecture videos, drawing on data extracted from over 6.9 million video-watching sessions across four edX courses. Student engagement is defined in the study by:

  1. Engagement time: the length of time that a student spends on a video, the same metric used by YouTube, though the researchers acknowledged the limitation of assessing engagement from this one-dimensional perspective.
  2. Question/problem attempt: almost one-third of the videos across the four courses featured an assessment problem directly following the video, usually a multiple-choice question designed to check a student’s understanding of the video’s contents. “We record whether a student attempted the follow-up problem within 30 minutes after watching a video.” (A rough sketch of how these two metrics might be computed follows below.)
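To make the two metrics concrete, here is a minimal sketch, in Python, of how they might be computed from raw watch-session records. The WatchSession fields and function names are my own assumptions for illustration; the paper does not publish its data schema or code.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical watch-session record; the field names are assumptions for
# illustration, not the schema used by Guo, Kim & Rubin.
@dataclass
class WatchSession:
    student_id: str
    video_id: str
    seconds_watched: float                        # time spent watching in this session
    minutes_to_problem_attempt: Optional[float]   # None if no follow-up attempt was made

def engagement_time(sessions):
    """Metric 1: total seconds each (student, video) pair spent watching."""
    totals = {}
    for s in sessions:
        key = (s.student_id, s.video_id)
        totals[key] = totals.get(key, 0.0) + s.seconds_watched
    return totals

def attempted_within_30_minutes(sessions):
    """Metric 2: whether the follow-up problem was attempted within 30 minutes
    of any watching session for that (student, video) pair."""
    attempted = {}
    for s in sessions:
        key = (s.student_id, s.video_id)
        within_30 = (s.minutes_to_problem_attempt is not None
                     and s.minutes_to_problem_attempt <= 30)
        attempted[key] = attempted.get(key, False) or within_30
    return attempted
```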

Video Types for MOOCs
Lectures are divided into two primary types for the study, [which mirror most MOOCs]: 1) lecture videos for content delivery, presented by an instructor/professor (‘talking head’ is the term used in the paper), and 2) tutorial/demonstration videos, step-by-step problem-solving walk-throughs common in computer science courses, courses featuring mathematical concepts, or science courses featuring lab demonstrations.

Video Production Format
For analysis purposes, researchers coded the videos examined in the study using six primary video production formats, which I’ve summarized below, along with production styles not mentioned in the study.

1) Lecture-Style Video Formats:

  • Instructor(s) with/without Presentation Slides: features instructor(s) lecturing, with or without PowerPoint slides inserted throughout, with instructor voice-over while a slide is displayed
  • Office Setting: close-up shots of the instructor filmed at his or her office, typically instructor speaks directly to camera
  • Classroom Setting: video captured from a live classroom lecture
  • Production Studio Setting: instructor recorded in a studio with no audience, typically speaking to the camera

2) Tutorial/Demonstration Video Formats:

  • Video Screencast: the instructor demonstrating a concept, e.g. writing code in a text editor or at a command-line prompt (in the case of computer science courses), or working in a spreadsheet or document
  • Instructor Drawing Freehand on a Digital Tablet: using a software program, a style popularized by Khan Academy videos

Other Formats not mentioned in the study:

  • Instructor interviewing another expert or guest speaker
  • Instructor delivering lecture in another setting related to the course (though not always), for example an ecologist giving lecture at the beach, an art historian in a museum, etc.
  • Panel Discussion of experts on specific course-related topic

Which format to use? The primary factors that determine which format to use are the objectives of the MOOC or course and the course content. The course design team typically selects the video formats during the course design phase, when the instructional strategy is created: the format of each video is chosen, the content for each is selected, and related student activities or assessments are determined.

The second factor determining which format to employ is the amount of resources (dollars) available for video production. This determines right off the bat which tool, program or hardware will be used for the video production. It is important to note that the amount of resources invested in video production does not scale with how much students learn or with MOOC completion rates. For example, I completed a course on Canvas Network, Statistics in Education for Mere Mortals (my course review here). The course featured video lectures and tutorials, all created by the instructor using low-budget technology. Lectures appeared to be filmed on the instructor’s laptop using a webcam (PowerPoint slides were added, so there was some editing). Each module featured a tutorial, a screencast where the instructor demonstrated the application of various formulas to a data set. I found the professor, Lloyd Rieber, encouraging and personable; he also delivered the content concisely in the lecture videos and tutorials. Interestingly, the course completion rate was over 10%, higher than typical MOOC completion rates, which are usually below 7%.

Key Findings of Study

  • Shorter videos are more engaging. Student engagement levels drop sharply after 6 minutes
  • Engagement patterns differ between the two video types; engagement was higher with the lecture-style (‘talking head’) videos, which researchers suggest is due to their more “intimate and personal feel”
  • Several MOOC instructors interviewed for the study felt more comfortable with the classroom lecture format; however, this format did not translate well online, even with extensive editing in a production studio
  • For tutorial/demonstration videos, the Khan-style format, where the instructor draws on a tablet and narrates, was found to engage students more effectively than screencasts. A contributing factor: the instructor’s ability to situate themselves “on the same level” as the student
  • Video producers and edX design teams determined that pre-production planning had the largest impact on the engagement effect of the videos. Researchers used a data set within the study to test this idea

Practical Recommendations for Course Design Teams

  1. Identify the type and format for each video lecture, using the course objectives, module breakdown and budget as a guide. Plan each lecture for the MOOC format and its potential students. Consider copyright terms for images used in videos and slides, and select appropriately licensed images during the planning phase
  2. Invest in the pre-production planning phase. Segment course content into chunks, using six minutes per video as a guideline. Identify the purpose of each video lecture and the key content points to deliver within each. Write a script for each [lecture video format] and have the instructor practice before filming, which reduces filming and editing time
  3. For tutorial/demonstration videos, introduce motion and continuous visual flow, along with extemporaneous speaking, so that students can follow the instructor’s thought process. Complete a basic outline of the video beforehand rather than a full script to be read word-for-word
  4. Provide a more personal feel to videos. Try filming in an informal setting (such as the instructor’s office) where he or she can make good eye contact; it often costs less and might be more effective than a professional studio. Coach instructors to use humour and personal stories, and to convey enthusiasm where possible

Closing Thoughts
MOOCs are here to stay, which makes studies like this one valuable for helping educators be more effective through course design. This study brings us closer to answering the question: which kinds of videos lead to the best student learning outcomes in a MOOC? Yet it’s only a start; there is still much more to be done in understanding how students learn in massive courses, and how institutions can invest their resources more effectively to increase student learning outcomes.


MOOC ‘Jam’: Highlights from a Jam on Digital Pedagogy

This post includes takeaways from a ‘MOOC Jam’, a synchronous online discussion about digital pedagogy that I participated in with a group of educators.


‘Jam’ by John Wardell (Flickr)

“What is a Jam?  A Jam is an asynchronous, typed, online discussion designed to work around your schedule. The goal of a Jam is to gain perspective and solicit ideas that inform the community. After the Jam is over, you can read the exchange and the posted resources, which will remain available for several weeks.” MOOC Jam II, Digital Pedagogy   (The three threads of asynchronous discussions in this jam are: 1) Competencies for teaching online, 2) Developing Faculty Competencies, and 3) Learner Analytics for Faculty)


Screen shot of participants online during the Jam, participating in the threaded discussions (partial view of participants)

This past Tuesday, I participated in a MOOC Pedagogy Jam via the website Momentum, a platform created for stakeholders to discuss critical issues related to education, sponsored by the Bill and Melinda Gates Foundation. The purpose of the platform is to provide a space to host online events about topics related to online education, with the ultimate goal of the Jams being ‘to gain perspective and solicit ideas that inform the community’. I participated in the first MOOC Jam this past November; that topic, “Peer Review of a Framework for MOOCs”, hosted by George Siemens, focused on the design of the MOOC Framework. Siemens, creator of the Framework, sought input from the community of participants.

The topic of this Jam [which turned out to be more of a synchronous discussion] was digital pedagogy, divided into the three threaded discussions mentioned above. Each discussion featured a moderator responsible for responding to participants and furthering the discussion, and another moderator summarizing key themes of the discussion each hour. I chose to participate in ‘Competencies for teaching online: describing effective pedagogy’ given its description: “An exchange on how information is delivered to students, how they are engaged as active learners and community is built and how learning is assessed”.


Screen shot of one of the three threaded discussions of the Jam held on the Momentum platform

Digital Pedagogy: Themes and Highlights
Following are my insights from the discussion on digital pedagogy; I’ve also included comments from other participants (Jam II, Momentum, Digital Pedagogy).

The discussion was rich with ideas and insights, and provided a glimpse into the issues and challenges of online instruction. Though the title of the Jam featured ‘MOOCs’, much input from contributors pertained to closed online courses, which made for an interesting discussion by highlighting one of the primary challenges in online education: applying appropriate pedagogical methods, which vary depending upon the learners, the delivery method and the goals of the course.

Themes:

1)  Part of the discussion was devoted to the contrast and challenges between learner-directed and instructor-directed learning. The fact that much discussion focused on this issue highlights one of the challenges with MOOCs: a MOOC, due to its scale and format, lends itself to learner-directed study. It’s not surprising, then, that MOOCs attract learners who already know how to learn and are motivated and educated. Several Jam participants discussed methods to get learners involved in learning and how to encourage students to engage and participate [typically in the context of closed online classes].

“I’ve done something similar to engage students in action research with me. I was teaching Web Development and was not happy with the development framework we were using. So as a class we researched the pros and cons of various frameworks and decided as a class (with my approval) which one to use. This worked well – they had “buy in” as we used to say.  Beyond that, I set basic specifications as to what they were to include in their work product, but allowed them to choose the subject matter (content). I also had to give approval before they began coding.”

I see the above challenge highlighting two opportunities: 1) to provide support to students in learning how to be self-directed, and 2) to provide skill development for educators and course designers in being flexible and adapting instructional strategies: assessing learners and the learning context, creating appropriate learning experiences, and implementing pedagogical methods that match the learning needs.

One Jam participant shared an initiative that his institution recently started for its students: a program designed to address much of what was discussed here.

“California State University, Monterey Bay, is creating an online training module for training in baseline skills in web technologies for collaboration and other soft skills, such as team working relationships. Selecting appropriate pedagogy again depends upon an analysis of the learners — goes back to careful and thrustful planning”

2)  Considerable discussion focused on how to get students to interact, collaborate and engage with peers in online classes, and what the instructor’s role is in facilitating group formation, participation and learner engagement. Though this theme is similar to the one mentioned above, interesting thoughts on group formation and collaboration emerged: should groups be encouraged, facilitated or left to form spontaneously? And if so, how? This relates to the motivation of the learner, which is quite different when students are in for-credit classes versus ‘free’ and open classes [MOOCs], where participation is driven by interest and a desire to learn and is essentially self-directed.

“But a key trade-off when you have non-static groups, as Michaelsen, Fink et al have looked at is that you lose the crucial accountability factor and or the time to form constructive group norms/roles etc. — this then leads to the ‘freeloaders’ issue that gives groupwork such a bad rep.  There is the challenge — in a MOOC context, can you establish stable, productive learning groups with accountability, positive norms, roles etc to really activate the engagement, peer learning and other benefits of group learning?”

The comment above is interesting—is it really possible or desirable in a MOOC environment that the responsibility for group accountability and productivity rests with the instructor?

3)  The session wrapped up with discussion focused on supporting learners and helping them learn in a MOOC format. The question appears to be: how can this be accomplished? Is it through course design, or while the course is live, via course facilitators? Or do we need to teach students how to learn in a MOOC?

“I think one of the goals for a MOOC is enabling learners to make connections, share, collaborate and learn from one another. Rather than thinking about self-directed or facilitator directed maybe we need to think about how we can create ways that encourage learners to support one another?”

“I am very interested in how learners can and do support one another’s learning in MOOCs. Do you have some thoughts in mind about the answer to this question? What can we build into the design that supports and encourages peer-peer learning?”

Closing Thoughts
Discussions like those within this Jam create excellent opportunities to get the issues and challenges facing education, specifically online education, out in the open. They also help stakeholders identify what needs to be discussed and explored within their own institutions. There are commonalities across institutions when it comes to online education, and ironically the very barriers affecting these issues exist within institutions at all levels. Fortunately there is progress: many institutions are experimenting, collaborating and striving to adapt to cultural shifts and increase access while still providing high-quality, relevant education. Are there similar discussions happening within your institution?

A MOOC Quality Scorecard Applied to a Coursera Course

In this post I review a recently completed Coursera course using a quality scorecard approach to measure and quantify five key dimensions of the course.

I’m in the final week of a Coursera MOOC, Sports & Society, that for the most part has been lackluster and disappointing. I expected a university-level course that would provide learning and perspectives on the social and cultural dimensions that affect sports participation and perceptions across different cultures. It missed the mark. Granted, the majority of Coursera courses are ‘lite’ versions of the real thing; few mimic the workload or rigor of their face-to-face counterparts, which is fine [even preferred when taken for personal development] given the courses are not promoted as such.

A lite version does not mean that meaningful and deep learning cannot occur, though. I’ve completed two other Coursera courses, Introduction to Sociology and E-learning and Digital Cultures; both provided rich learning with scholarly materials, challenging assignments and opportunities to gain knowledge beyond the course site. The Sports & Society course wasn’t completely inadequate; in fact some things were done well. But for all the effort and resources that went into the course, it missed the mark quite significantly in terms of providing conditions for meaningful learning to occur. I see an opportunity to share with readers what contributed to a mediocre MOOC learning experience. To provide an illustrative framework for this review, I’ve created a MOOC quality scorecard that is [loosely] based on a quality scorecard approach and my course design experience.

Course Overview
I have a keen interest in the topic of sports and society, specifically youth and college sports and their effects on youth development: the emotional, physical, and educational dimensions. I won’t get into details here, but I write about this topic on another one of my blogs, school and sports, and research the topic frequently. Coursera’s Introduction to Sociology expanded my interest in the topic, which has led me to several excellent resources on sports sociology.

To put this course review into context, the course description for Sports & Society follows. I want to emphasize that there were no objectives, goals or purposes outlined for the course, which made defining the scope of the course, as well as determining my own learning goals, a challenge.

Course Description: “Sports play a giant role in contemporary society worldwide.  But few of us pause to think about the larger questions of money, politics, race, sex, culture, and commercialization that surround sports everywhere. This course draws on the tools of anthropology, sociology, history, and other disciplines to give you new perspectives on the games we watch and play. We will focus on both popular sports like soccer (or “football,” as anyone outside America calls it), basketball, and baseball, and lesser-known ones like mountain-climbing and fishing.”   

MOOC Quality Scorecard Review: Coursera, Sports & Society

Criterion Scoring
0 points = Not Observed
1 point = Insufficiently Observed
2 points = Moderate Use
3 points = Meets Criterion Completely

1)  Quality of Instructional Methods: Sports & Society:  Score: 1/3

Criteria for scoring
♦ Course includes objectives that provide direction for course ♦ instructor demonstrates expertise in subject area, presents topics that relate to course objectives in cohesive manner ♦ course environment encourages student to make connections/construct new knowledge and/or demonstrate knowledge ♦ instructor presents alternative viewpoints ♦ students encouraged to draw upon personal experience and encouraged to apply/reflect on course concepts.

My Comments
The course felt disjointed; topics were unrelated and did not appear to move towards an objective or goal. It was a passive learning experience, with little encouragement or opportunity to apply content, reflect, or share content and resources. The course was instructor-focused; content was primarily delivered via the professor lecturing. The instructor did not go into deep analysis of principles, apply theoretical constructs or provide different perspectives on topics; for example, he neglected to mention other institutions in society and the role and influence each plays in sports, e.g. governments, the International Olympic Committee, the NCAA (in the US), etc. In another instance, the instructor presented a narrow point of view on the topic of the Business of Sports, suggesting that big corporations are solely responsible for the commercialization of professional and college sports.


Sports & Society, Coursera

2) Quality, Depth & Breadth of Course Materials: Sports & Society: Score: 2/3

Criteria for scoring
♦ Variety of content sources with breadth of perspectives ♦ content goes beyond professor video lecture ♦ numerous links provided to open source content with variety/breadth ♦ students encouraged to contribute to course content and/or have access to venue within course site to share/discuss findings ♦ readings include scholarly sources with peer-reviewed papers via downloadable PDF format and/or open access resources ♦ current content included to provide relevant context/perspective for topic.

My Comments
Video lectures were informative, though they primarily featured the professor lecturing. Some references were made to the readings. Weekly readings varied, with the average number of pages assigned between 15 and 30. It might have been helpful to have optional readings each week for students wanting to delve deeper into a topic (there were two or three). Two of the weeks featured selected chapters from a non-fiction book, one of them a book the professor authored. A link was provided for those interested in purchasing the book at a discounted price for Coursera students, which comes across as a tactic to sell the book, more so when it is the only reading for the given week. Only two or three readings were from scholarly sources or peer-reviewed journals (the week 6 readings, for example, included a chapter from a non-fiction book and an optional reading). Few materials encouraged deep analysis or described theoretical principles of the topic.

3) Interaction: Student & Social Engagement: Sports & Society: Score: 2/3

Criteria for scoring
♦ Students are provided a venue within course site to interact with other students ♦ forums are monitored to ensure discussion is respectful, non-threatening and safe ♦ Twitter hashtag created for course ♦ students encouraged [not mandated] to engage with other students via platforms outside of course site ♦ instructor/ TA engages within forum discussions that are specific to course topics to promote higher order thinking ♦ live events [Google hangout/webinar] for students to watch/engage via real-time comments, i.e. professor with guest, groups of students discussing reading etc.

My Comments
Four Google Hangout events were scheduled, though two were cancelled due to technical difficulties. The two that did work featured guest speakers and select class members for discussion, and allowed students to post live comments while watching. These were very good. Forums had few students commenting and contributing, but this made for easy-to-follow discussion threads. The instructor did get involved in some discussion forums. A Twitter hashtag was not assigned to the course to promote interaction or sharing. This is a missed opportunity, as Twitter is an excellent platform for engaging students throughout the course: sharing content, promoting blog posts, encouraging chats and connections, etc.

4) Activities & Assessments: Sports & Society: Score: 1/3

Criteria for scoring
♦ Learning activities prompt students to share their learning ♦ activities leverage international perspectives of student body ♦ instructions for assignment include descriptions of the purpose and rationale i.e. why learners are doing the assignment/activity ♦ assessments provide further opportunity to learn ♦ assessments encourage students to find information ♦ assessment(s) align with course outcomes [as per certificate].

My Comments
There was little encouragement or opportunity to apply content, reflect, or share. Assessments were five-question quizzes, two per week: one on the assigned reading and the other on lecture content. Neither assessments nor activities promoted higher-order thinking skills such as analysis, evaluation or critical thinking.

5) Interface of Course Site / Instructional Design: Sports & Society: Score: 3/3

Criteria for scoring
♦ ‘Start-here’ section [orientation] included for introduction to course and site itself ♦ course instructions and requirements for each week/module are detailed ♦ clear expectations for assignment completion, peer review process and/or quizzes/tests outlined and accessible ♦ clear guidelines for certificate requirements as applicable ♦ instructional materials are accessible and easy to use ♦ links to technical support.

Comments
Very good interface on Coursera site, easy to navigate. Includes a ‘start-here’ orientation page. Instructions and expectations concise and thorough. Schedule and due dates page was helpful to clarify dates.  Technical support available.

MOOC Scorecard for Sports & Society: Total 9/15
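For readers who want to adapt this scorecard to their own course reviews, here is a minimal sketch, in Python, of how the dimension scores above tally to the total. The dimension names and scores are taken from this review, and the 0 to 3 scale from the criterion scoring; the code itself is simply my illustration, not part of any existing scorecard tool.

```python
# Dimension scores from this review, each on the 0-3 criterion scale defined above.
MAX_PER_DIMENSION = 3

scores = {
    "Quality of Instructional Methods": 1,
    "Quality, Depth & Breadth of Course Materials": 2,
    "Interaction: Student & Social Engagement": 2,
    "Activities & Assessments": 1,
    "Interface of Course Site / Instructional Design": 3,
}

total = sum(scores.values())               # 9
maximum = MAX_PER_DIMENSION * len(scores)  # 15
print(f"MOOC Scorecard for Sports & Society: {total}/{maximum}")  # prints 9/15
```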

I’ll close with this: I appreciate that Coursera provided this course for free. I also appreciate the time that the professor put into this course; his work and effort were no doubt considerable. The score above is based upon my personal viewpoint. The purpose of this post is to provide insight into what could improve the course, and to give other educators the chance to learn from one student’s perspective.
