Three Actors that Contribute to Student Success in Online Courses: The Institution, Instructor and Student

This post examines three actors that are essential to student success in online courses: 1) the institution, 2) the instructor, and 3) the student.


Actors Contributing to Student Success in Online Courses

What contributes to student success in a course delivered online? To consider the question from a different perspective, one can pose it this way: who is ultimately responsible when students are not successful—when they fail the course, for instance? Is it the student, for not having the discipline for online learning? The instructor, for not providing support? Or the institution, for not providing services to support the online student? These are questions worth examining at a philosophical level, though in this post I examine select behaviours and strategies associated with the three actors involved in the process of students learning online: 1) the institution, 2) the instructor, and 3) the learner.

What Contributes to Student Success?
Before examining the three actors' roles in the learning process, it's helpful to identify the factors contributing to student success in online environments, including the skill set required. It's also instructive to acknowledge the underlying expectation that students enrolling in online courses are self-directed and capable of managing the tasks associated with online studies. Yet research and feedback from educators reveal something quite different: many students are unprepared to learn online, lack the basic skills, and are not capable of assuming responsibility for their learning. Online course work requires that students use a range of skills, including accessing resources, people and content within a network, and the analytic and synthesis skills needed to distill relevant information from an abundance of information and resources (Kop, Fournier, & Mak). Though as mentioned, it's not uncommon to find that students lack some, if not many, of these skills.

Not only are students often unprepared, but institutions often fail to prepare faculty and instructors for online facilitation. A starting point in boosting student success is identifying the behaviours associated with each of the three actors.

1) The Institution: Student Support Services
One characteristic of institutions offering successful online programs is their ability to support the unique needs of distance students through a student support services function. As online programs evolve and mature, we now have numerous programs to examine and study. Though each is unique, there is a common theme—a focus on students that acknowledges their diverse needs and the challenges of studying online. Below are select examples.

Services for online students need to be customized and re-tooled from those provided to traditional students. Services should include technical support, academic advising, online community programs and clubs, library services and career planning. Some institutions have gone further and developed programs that offer personalized academic support; SUNY Empire State College, for example, offers a peer tutor program. This program is unique: it's not a subject matter coaching program, but a mentoring program where the goal is for tutors to help students identify and implement strategies that promote independence, active learning and motivation.


“Creating College Success” from Rio Salado College,  an Award Winning Program

Rio Salado College developed an orientation program, "Creating College Success". It's a one-credit course delivered fully online. The goal of Rio Salado's program is similar to that of Empire State's: student self-sufficiency in academic environments. Penn State World Campus, one of the first universities to deliver online degrees, has a comprehensive roster of services for virtual students. One service that all institutions should consider is offering extended hours for technical and academic help via email, phone, or instant messaging.

Western Governors University offers not only academic and technical support, but also wellness services through its Well Connect program, where students can call a toll-free number any time of day or night for support, including personal counseling, legal and debt counseling, new-parent transition support and more.

2) The Instructor:  Course Design and Instructor Support 
Two areas fall under the instructor's purview: 1) course design, and 2) instructional support.

Course design plays a significant role in students' potential for learning online, given that students engage with course content, the instructor and peers through the course platform. The way course content is presented on the course site, how the instructions for assignments or activities are written, even the structure and order of the tabs on the course home page (the course interface) all affect how students engage with the course and, potentially, what they learn. Professor Robin Smith, author of "Conquering the Content: A Step-by-Step Guide to Online Course Design" (2008), describes course design this way:

“Design features incorporated in [the] system course development and the learning guide, will create an environment in which students are confident of their pathway, and the only challenge is the course content, not the navigation of the course or figuring out what must be done in order to complete the course…this focus on course design, will free you [instructor] up to spend the semester teaching and interacting with students rather than answering questions about course navigation or specific directions about assignments.”

The instructor's role in online courses will vary depending upon the nature of the course, but more importantly, instructor behaviours will be a function of students' educational background and their skill level in the areas mentioned above (collaboration skills, technical skills, etc.). Ideally, the instructor assesses where students are when they enter the course through involvement in discussion forums, course introductions, synchronous activities and other exchanges that allow the instructor to get to know students. Instructors can also review student work early in the course to provide detailed feedback, challenge the student, and suggest external writing support as needed.

The goal is for instructors to focus on challenging students academically via feedback and interaction, both individually and as a class. Support for technical, research, or basic academic skills should be provided by the institution via support services. Institutions should also offer professional development courses, workshops or resources to support online instructors and faculty in course development and instruction.

3) The Student
The student is ultimately responsible for his or her success in the learning process; it is up to him or her to leverage the resources of the institution and the support of the instructor. There is an effective tool, however: a learner readiness questionnaire, which many institutions make available on their websites, and which identifies the skills and tools students will need to be successful with their online studies. Also, giving the responsibility for learning to students is another method to encourage success—letting students know they are ultimately responsible.

Below are links to several learner readiness questionnaires provided by various institutions; one is licensed under a Creative Commons ShareAlike license, which makes it available for anyone to use.

In a follow-up post I review tools and resources available on the web that support the development of the skill set students need for online learning. Readers may also find a previous post, Five-Step Strategy for Student Success with Online Learning, helpful—it outlines behaviours associated with successful outcomes for online students.

Conclusion
Supporting student success in online course work begins with the institution—ideally with a strategic plan that includes a system for provision of administrative services, academic counseling, and support specific to online students, as well as professional development and comprehensive resources for faculty and instructors teaching online. Yet to maximize the value of the support offered by the institution and instructor, the learner needs to own the learning, and know the responsibility for success ultimately rests with him or her.


How to Make Group Work Collaborative in Online Courses: Four Strategies

“CL (collaborative learning) occurs when small groups of students help each other to learn. CL is sometimes misunderstood. It is not having students talk to each other, either face-to-face or in a computer conference, while they do their individual assignments. It is not having them do the task individually and then have those who finish first help those who have not yet finished. And it is certainly not having one or a few students do all the work, while the others append their names to the report (Klemm, W.R., 1994).” (Laal & Laal, 2012).


Group Collaboration

Providing interactive learning opportunities in online courses is frequently cited as a best practice by institutions offering distance education—Penn State, the University of Illinois and Grand Rapids Community College are three of many examples. Yet I know from experience on both sides, as a student and educator, the challenges of functioning within and facilitating collaborative learning activities—group work especially. In theory, collaborative learning is a sound idea given the numerous studies that suggest the benefits of students learning from and with each other by sharing ideas and perspectives:

…Samuel Totten (1991) who claims that: The shared learning gives learners an opportunity to engage in discussion, take responsibility for their own learning, and thus become critical thinkers. (Laal & Laal, 2012)

And:

Palloff and Pratt (2005) suggest that online courses that are rich with student interactivity facilitate the development of critical thinking skills, better learning, socialized intelligence, and reflection.  (Zygouris-Coe, 2012)

Yet all too often students' experiences in small virtual groups contrived for the purpose of creating group learning experiences result in frustration and even resentment. It's no wonder educators often question whether group work is worth the aggravation. Is student collaboration really necessary for learning? And if it is, how can it be successful?

This post aims to offer support and resources for readers looking for answers to these questions; I incorporate research from four recent papers, specifically on group work and collaboration in online learning environments, that sheds light on the realities of contrived collaborative activities for students. One in particular, "Seven problems of online group learning (and their solutions)" (Roberts & McInnerney, 2007), provides practical and helpful suggestions for course designers developing group activities and for instructors facilitating group work. Another, "Collaborative learning: what is it?" (Laal & Laal, 2012), is particularly helpful and applicable to educators; it clarifies what collaborative learning looks like and describes in detail the required elements.

Group Work for Closed Courses, Not MOOCs
This post outlines essential conditions for group work in online learning environments and suggests four strategies that home in on the key components needed to create collaborative activities specific to closed online courses, not MOOCs. In my experience with Massive Open Online Courses, it is neither possible nor desirable for instructors to require or mandate class activities where students collaborate in small groups. Collaboration in MOOCs is ideally student-driven, in keeping with the pedagogy of massive courses. In small, closed and for-credit online courses, the pedagogical approach is different—it requires involvement of the instructor, a more structured learning environment, and activities that support the specific learning objectives typically associated with for-credit courses.

Learning Theory and Demand Behind Group Work
Before discussing practical strategies, it's worth examining how group work became an accepted practice in education. The idea that students need to work together to learn stems from several learning theorists, including Piaget, Dewey and Bruner. The premise of their theories is that learning is active, and that knowledge is constructed through interaction with the environment (constructivism). Building on the constructivist premise is social learning, where learning happens through active engagement with others (Vygotsky). Yet the concept of students needing to work in groups to learn is not the only driver of group work in online spaces. The other is the idea that students of today require a unique skill set to work, engage and collaborate as global (and digital) citizens. Businesses also demand that employees be team players, have excellent communication skills that include working virtually in teams, and be proficient with digital platforms. Recently the Wall Street Journal featured an article about companies that seek employees who are able to collaborate with colleagues anywhere in the world, often without ever meeting in person (Rubenfire, 2014). These factors contribute to the perceived need to provide learning opportunities for online students that involve small groups.

Group Work: Cooperation versus Collaboration
Two concepts frequently used interchangeably when discussing group work are cooperation and collaboration. Yet each concept is distinct; each suggests a different level of learning in practice. I suggest that both exist on a continuum of student interaction in online environments, with students 'discussing' a topic (in a forum, for instance) on one end, and 'collaborating' on the other—where students work and learn as a team, creating, for example, a final product interdependently that represents their knowledge construction. In their paper, Laal & Laal define each:

  • Cooperation is a structure of interaction designed to facilitate the accomplishment of a specific end product or goal through people working together in groups;
  • Collaboration is a philosophy of interaction and personal lifestyle where individuals are responsible for their actions, including learning and respect the abilities and contributions of their peers. (2012, p. 494).

In most instances, group work in online courses is cooperative at best. Small group exchanges within online courses were examined and discussed in the paper "How much "group" is there in online group work?", where students' interactions were categorized as: 1) parallel, 2) associative and 3) cooperative interactions (Lowes, 2014, p. 4). Only one group of the five examined approached the higher level of cooperation. However, there are methods and strategies educators (and their institutions) can implement to move students along the continuum of group learning towards the collaborative. Several conditions necessary for cooperative and collaborative learning are identified in the literature referenced in this post—summarized below.

Required Conditions for Cooperative and/or Collaborative Learning in Closed Online Learning Environments

  • Dialogue amongst students is a fundamental component of the group activity; assignments should be designed to encourage discussion and brainstorming (asynchronous and synchronous) rather than a division of labour. One paper suggests that group assignments be constructed for "positive interdependence", where each group member's contribution is unique and indispensable (Lowes, 2014, p. 12), though examples are not given
  • Understanding of the purpose of the activity—achieved by communicating to students why group work is necessary, e.g. sharing how the project aligns to the learning goals, how students will benefit
  • Access to digital platform(s) and tools that support online collaboration—for discussion, creation of final product, etc. e.g. Google Docs, Google Hangouts
  • Support for students unfamiliar with collaboration platform & tools
  • Guidelines that outline: student expectations, netiquette, procedures to deal with absent group member(s), assessment methods, examples of collaborative exchanges between students, team roles, etc.
  • Instructor (and institution) efforts aimed at developing and supporting student skill set for cooperation, collaboration and working in teams
  • Instructor involvement to address non-contributing group members, group challenges, etc.
  • Inclusion of an assessment mechanism on two levels—group and individual
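
As a simple illustration of the last condition above, here is a minimal sketch of one way an instructor might blend a group mark with an individual mark into a single grade. The 60/40 weighting and the sample scores are illustrative assumptions, not values taken from the papers cited in this post.

```python
# Minimal sketch: blending group-level and individual-level assessment.
# The 60/40 weighting and the sample scores are illustrative assumptions.

GROUP_WEIGHT = 0.6
INDIVIDUAL_WEIGHT = 0.4

def blended_grade(group_score: float, individual_score: float) -> float:
    """Return a final mark out of 100 from a group mark and an individual mark."""
    return GROUP_WEIGHT * group_score + INDIVIDUAL_WEIGHT * individual_score

# Example: a team project marked 85/100, with an individual contribution
# (peer review plus instructor observation of the group's discussion board)
# marked 70/100.
print(blended_grade(85, 70))  # 79.0
```

The weighting itself is a design choice: a heavier group weight rewards interdependence, while a heavier individual weight reassures students worried about non-contributing teammates.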

Four Strategies for Instructors (and Institutions) That Support Online Group Work

1. Design a group assignment that is complex and challenges students to apply and discuss course content, using multiple perspectives to solve a problem or develop a solution. Include expectations, purpose and clear instructions about how students can collaborate and provide feedback to each other. (Lowes, 2014, p. 12)

2. Model and support the development of collaborative skills:
  • Develop collaborative learning protocols and establish clear expectations about student and instructor roles
  • Promote student self-monitoring of learning through progress reports, feedback, discussion forums, and virtual student-instructor conferences
  • Cover the skills required at the beginning of the course
An extensive list of ideas appears in "Collaborative Learning in an Online Teacher Education Course: Lessons Learned" (Zygouris-Coe, 2012, p. 339).

3. Facilitate and be involved in group activities. Closely monitor group discussion boards to gauge student involvement at the beginning of group work, and contact students who are not participating early in the group process. Collect ongoing data on student progress.

4. Make the assessment criteria explicit. "Several effective solutions may be employed to do exactly as Webb suggests, that is, to measure group productivity and to measure the individual students' abilities within the group. Exactly which of the solutions is the most appropriate will depend upon the circumstances." (Roberts & McInnerney, 2007, p. 263).

Closing
There is no formula for creating effective group learning opportunities in closed online courses, yet there are shared experiences from educators and academics that provide a starting point as outlined in this post. I encourage readers to share their own experiences, ideas and suggestions for facilitating group interactivity either here with other readers, on other social media platforms or with colleagues. What works and what doesn’t?

References:

Laal, M. & Laal, M. (2012). Collaborative learning: What is it? Procedia - Social and Behavioral Sciences, 31, 491–495. Retrieved from http://www.sciencedirect.com/science/article/pii/S1877042811030217

Lowes, S. (2014). How much “group” is there in online group work? Journal Of Asynchronous Learning Networks, 18(1). Retrieved from http://jaln.sloanconsortium.org/index.php/jaln/article/view/373/82

Roberts, T. S. & McInnerney, J. M. (2007). Seven problems of online group learning (and their solutions). Educational Technology and Society 10(4): 257-268. Retrieved from http://www.ifets.info/journals/10_4/22.pdf

Zygouris-Coe, V. (2012). Proceedings from ICICTE 2012: Collaborative Learning in an Online Teacher Education Course: Lessons Learned. Rhodes, Greece. Retrieved from http://www.icicte.org/Proceedings2012/Papers/08-4-Zygouris-Coe.pdf

Three (BIG) Barriers to Student Participation in xMOOCs

This post outlines three barriers that can deter, discourage and/or intimidate students from participating in xMOOCs (MOOCs offered on platforms associated with higher education institutions, e.g. Coursera, iVersity, edX).


“Construction Barriers”  Photo by Lyn Topinka

The xMOOC model that emerged in 2012 has not changed much in 2014, with completion and participation rates just as low as they were when concrete data on completion rates appeared in 2013 (Parr, 2013). Though there are a variety of factors that contribute to low completion rates, I suggest that three barriers contribute significantly to low student participation and completion: 1) technology, 2) poor usability and course design, and 3) anonymity. These are barriers that deter, discourage and in some cases intimidate students. Also, in some instances, barriers one and two are potential barriers in closed, online classes (such as those offered as for-credit courses at public and private institutions).

To illustrate the first two barriers, technology and poor usability and course design, below is a selection of screenshots featuring actual student comments and questions (names obscured) taken from several MOOCs offered on Coursera. The comments are representative of typical experiences and frustrations of students participating in MOOCs. In some instances, the examples included are similar to frustrations students experience in closed online courses, which I've encountered when working with faculty in online course design, and as a lead curriculum developer for online programs at a private university. I close by discussing the third barrier, anonymity in online learning, specifically in MOOCs.

1) Technology 
Examples below feature student challenges with accessing course content and engaging in events due to bandwidth and internet access limitations.


File size is a common problem in MOOCs and small courses


Connectivity issues are common due to bandwidth, and even limitations of the devices used


Restricted access to certain sites in some countries

The examples below feature students' frustrations with applications (discussion forums, etc.) within the MOOC platform itself that put up barriers to student participation and engagement, for example: i) discussion forums (the volume of student posts and their organization), and ii) synchronous events offered via Google Hangouts or other platforms, which often fail due to technical glitches or students' lack of technical ability.


Discussion forums often become unwieldy; though more common in MOOCs, this also happens in small, closed online courses


Discussion forums in most MOOC platforms have options for 'subscribing', where participants receive alerts of new posts within that particular forum, though not all students are familiar with this setting and don't know how to turn the notification emails off (or on). It's helpful to provide participants with instructions on how to do so (among other features) in an orientation or introduction to the course


Forums with large numbers of participants can be overwhelming, to the point that there is little opportunity for reflection or deep discussion. In closed online classes it helps to have focused discussion questions per thread, and, if there are more than 20 participants, to break the class into smaller groups


Google Hangouts and other platforms used for synchronous events are not immune to technological glitches. A practice run prior to the event helps (though even then problems occur), and having a back-up plan is recommended

2) Poor Usability and Course Design
Usability refers to how effectively students can navigate, interact and engage with the course interface, find the content they need, determine what they need to do to engage, etc. How user-friendly the course is (or is not) is a function of how the course content and pages are organized: what is featured on the course home page, for example, where the course announcements show up, even how the course tabs appear in the navigation menu. Usability falls under the umbrella of course design; it is a component with its own principles and guidelines that impacts students' overall course experience and learning outcomes in online spaces. Usability adds another layer of complexity to designing learning experiences, mostly due to the newness of online platforms as delivery mediums for education.

Course design is a broad and deep topic that I can't address adequately in this post, but below are some examples representative of issues that frustrate students and can hinder learning, issues that have to do with how an assignment's instructions are worded, presented to the student, or even designed in the first place.


Instructions for student assignments or activities need to be written with exceptional clarity. This means expanding on details, including examples, and reinforcing instructions and expectations via course announcements or live sessions while the course is in session. Another issue is the use of consistent terminology throughout the course. In the example above, 'thread' and 'post' were used interchangeably, when in fact they mean different things, thus confusing the students.


Another example of student confusion when there is inconsistency or conflicting information in the course


Frequently, it is student assignments that generate the most confusion among students in virtual formats, often due to unclear instructions, or to requirements that students use technical applications (e.g. to create a digital artifact) that they are not familiar with

3) Anonymity
A view on participants posting anonymously within a MOOC from iVersity:

“MOOCs offer an environment that may engage introverts. Online anonymity can make students comfortable expressing themselves in forums…participating in course conversations online may give students confidence to contribute in traditional classrooms and work environments.” iVersity blog post on Anonymity

I disagree with iVersity's position, and with Coursera and edX, which both allow anonymous posting within discussion forums. Anonymity does not contribute to effective online learning communities such as MOOCs; it runs counter to the premise of learning within a community, where the idea is that learners actively engage, and learn with and from each other. Several papers have identified the benefits of learning communities in distributed (online) learning environments (Dede, 2004), with some emphasizing the value of communities in MOOCs especially (Kop, Fournier & Mak, 2006). What is consistent in the research is the idea of trust and a set of common values or goals among learners.

The type of support structure that would engage learners in critical learning on an open network should be based on the creation of a place or community where people feel comfortable, trusted and valued, and where people can access and interact with resources and each other. (Kop, Fournier & Mak, 2006, p. 88)

Learning in a virtual community, where students go outside of their comfort zone, are challenged to consider alternative perspectives, and build a personal learning network, for example, requires a level of rapport, familiarity and trust between classmates and instructors. This sense of community can and does happen in small, closed learning environments and in cMOOC learning communities, but experiencing a sense of community in xMOOCs is far more difficult to accomplish, with many variables making it so; anonymity is just one. With this learning approach (and others) assumed by MOOC platform providers, I see xMOOCs destined to become static resources posted on the web—open courseware such as MIT OCW.

Closing Thoughts
Learning in xMOOCs is far more complex than what the MOOC platforms seem able to address. Low completion rates are just one metric showing how students' views of MOOCs are at odds with the expectations of the MOOC providers. The three barriers discussed here (technology, usability and anonymity) are just one piece of a bigger problem that MOOC platform providers will need to address if they are interested in creating communities of learning where students actively engage, contribute and learn.


What Marshall McLuhan’s ‘Global Village’ Tells Us About Education Technology in 2014

The Global Village: Transformations in World Life and Media in the 21st Century (1989), published posthumously, is one of Marshall McLuhan's best works. It's quite remarkable how this book, published over twenty years ago, provides the reader with a contemplative perspective on the role of technology in 2014. While reading, I found myself thinking about educational technology quite differently—thinking more about the effects, nuances, and implications technology has beyond education: effects on relationships, on learning (and teaching) in the context of our culture, and long-term implications for society in general. The book is about far more than education; it delves into technology and its influence on communication patterns, family structures, and entertainment. The book prompts reflection and forward thinking at the same time.

McLuhan was a Canadian philosopher and educator known for his work on communication theory, and was considered a public intellectual of his time. His work on media theory is still studied today. The Global Village was a culmination of his years of work on media, a collaborative effort between McLuhan and long-time friend and colleague Bruce Powers. It summarizes McLuhan's lifelong exploration and analysis of media, culture and man's relationship with technology.


McLuhan designed the tetrad as a pedagogical tool to examine the effects on society of any technology/medium by dividing its effects into four categories and displaying them simultaneously.

McLuhan and Powers introduce a framework for analyzing media via a tetrad. A tetrad is any set of four things; McLuhan uses the tetrad as a pedagogical tool for examining an artifact or concept (not necessarily a communication medium) through a metaphoric lens, which according to McLuhan translates to "two grounds and two figures in dynamic and analogical relationship to each other". You can see how the idea can stretch one's cognitive processes. The framework began to make more sense to me when reviewing the tetradic glossary at the end of the book, which examines twenty or more ideas and artifacts through the tetrad framework, including periodic tables, a clock, cable television, and the telephone. McLuhan designed four questions to explore a medium under analysis using the tetrad framework:

  1. What does the medium enhance?
  2. What does the medium make obsolete?
  3. What does the medium retrieve that had been obsolesced earlier?
  4. What does the medium flip into when pushed to extremes?

Wouldn't it be challenging, in a media and communications class or even in an education theory class, to have students apply the tetrad structure to current technological tools and applications? How might the iPhone be viewed? Or Twitter? Thought-provoking, to say the least.

The more I read of McLuhan's work, and about McLuhan himself, the more I believe the man was a genius. He predicted not only events, but also how media tools and advancements in technology would affect society as a whole—effects that no one could have imagined or even considered in the '70s and '80s. Yet McLuhan could almost see into the future, see how our society would be shaped and influenced, for good and for bad, by technology.

Worthwhile [short] Clips to Watch on McLuhan’s Views on Technology


‘Technology is not just a happenstance…’ video clip via marshallmcluhanspeaks.com


How and Why Institutions are Engaging with MOOCs…Answers in Report “MOOCs: Expectations and Reality”

  • How do institutions use MOOCs, and to what end?
  • Why do institutions pay thousands of dollars to develop and offer a MOOC on an external platform?
  • How do institutions determine the effectiveness of their MOOC efforts?
  • What are the costs associated with producing and delivering a MOOC?

All good questions; questions that policymakers, administrators and other stakeholders within higher education institutions that are considering MOOCs, or are already engaged with them, want [or should want] answers to. The 200+ page report "MOOCs: Expectations and Reality" by Hollands and Tirthali of Columbia University attempts to answer these questions by surveying 83 faculty members, administrators, researchers and other actors within 62 education institutions. The report delivers on the promise of its title—how and why institutions engage, and provides the reader with even more insights.

The report is meaty, worthy of review for anyone with a vested interest in MOOCs of any type. In this post I provide a brief overview of the report, but focus specifically on one aspect of the 'how'. I highlight the resources required to develop a MOOC—how many people it requires, the job titles, and the [estimated] costs associated with development. This may be useful for readers considering developing a MOOC for a platform such as Coursera, or a cMOOC using a collective course design approach. The report brings into focus just how resource-hungry MOOCs are, and after reading it, readers considering developing or contributing to the development of a MOOC might feel enlightened, encouraged, or perhaps even discouraged; at the very least, they will have a better understanding of MOOCs and their place in higher education institutions.

 Overview

Who sponsored the report?  The Center for Benefit-Cost Studies of Education (CBCSE), a research center at the Teachers College at Columbia University. The mission of the center is “to improve the efficiency with which public and private resources are employed in education“.  Note: the report is open and available for download.

Purpose of the Study: Given the work of the CBCSE and its pursuit of improved cost efficiency in education, the report is an extension of its mission. The purpose, as outlined in the report: "the study serves as an exploration of the goals of institutions creating or adopting MOOCs and how these institutions define effectiveness of their MOOC initiatives".


Figure 1 ‘MOOCs: Expectations and Reality’ (p 22)

Report Snapshot: The report sample includes 83 administrators, faculty members and researchers, all of whom were interviewed, at 62 institutions. The institutions: public and private universities, community colleges, platform providers, research organizations, for-profit education companies and a selection of institutions deemed 'other', including one museum (p 180). Of the 62 institutions in the sample, 29 were offering or using MOOCs in some way at the time of the study; the remainder were either not participating or taking a wait-and-see position.

Why a MOOC? One of the reasons this report is instructive for the education community is the inclusion of data about why institutions offer MOOCs. Many have asked why some institutions (several public higher education institutions among them) have spent thousands of dollars and invested considerable resources in a method of education delivery to the masses that has yet to be evaluated and tested for effectiveness. The chart below summarizes the six reasons identified.


‘MOOCs: Expectations and Reality’ (Hollands & Tirthali, p 8)

The above table is merely a snapshot. Each goal is described in further detail within the report, and a case study featuring an institution accompanies each goal, giving a contextual example of the reasons.

A snapshot of How? MOOCs are resource-intensive efforts, and the report validates this. Development of a MOOC, and facilitation of the course once it's live (accessible to students), requires significant amounts of time and energy from individuals across several departments within the institution. The faculty member (or members) acting as the subject matter expert for the MOOC requires a team of people, each with different areas of expertise, to support him or her in bringing the content to life and creating an environment of learning for hundreds, if not thousands, of course participants.

“Number of faculty members, administrators, and instructional support personnel involved: MOOC production teams seldom included fewer than five professionals and, in at least one instance described to us, over 30 people were involved. Faculty members typically reported spending several hundred hours in the production and delivery of a single MOOC” (p 11)

Example of Human Resources Requirements: Case Study 11

Case study 11 provides an excellent example of the commitment of resources needed to develop a course for a MOOC platform, which in this example is Coursera. The institution in the case is an unnamed Midwestern university (p 144). The school invited faculty with prior media experience to develop a five- to eight-week MOOC. The study is representative of the human resources required for development of a MOOC.

Human resources requirements by job title for course development of a MOOC:

2 x Faculty Members: (Subject Matter Experts)

1 x Project Manager: Leads the project and coordinates all elements of development. Liaises with departments as needed within the institution. Manages the project timetable; keeps the project on time and on budget

4 x Curriculum Design Team: Instructional Designer (works with faculty to present course content and create a learning environment on the course home page) • Instructional Technologist (works with the instructional designer) • Video Production Liaison (works with the faculty member in the production of videos, and liaises with the video production team)

5 x Video Production Team:  Production Manager •  Camera operators/equipment technicians • Audio-technician

In this case study, videos were produced at a high quality, using a full video design team. The final costs were calculated using records from the institution, though the report authors made some estimates due to a lack of detail on some aspects of the human resource inputs.


‘MOOCs: Expectations and Reality’ (p 144). One of the two data tables accompanying case study #11. Table 7 gives the range of hours spent on MOOC design (p 144)

Lecture Videos: Costs and Student Engagement
One of the primary drivers of costs in MOOC development (for platforms such as Coursera, FutureLearn, etc.) is video production. The more complex the video (for instance, the addition of graphics, multiple cameras used for shooting, post-filming editing), the higher the costs. Low-tech efforts, where there might be one camera person, or even the faculty member self-recording on his or her laptop, require far fewer resources. Some institutions seek a higher-quality finished product, which in turn demands a high level of production using a team of video professionals. Accordingly, the costs vary dramatically. ‘MOOCs: Expectations and Reality’ estimates high-quality video production at $4,300 per hour of finished video (p 11).
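
To make the cost arithmetic concrete, below is a minimal sketch that multiplies hours of finished video by a production rate. The $4,300 figure is the report's estimate for high-end production; the low-end rate and the course profile are illustrative assumptions only.

```python
# Minimal sketch of video production budgeting. HIGH_END_RATE comes from the
# report's estimate ($4,300 per finished hour); LOW_END_RATE and the course
# profile below are illustrative assumptions, not figures from the report.

HIGH_END_RATE = 4300  # USD per finished hour, professional studio production
LOW_END_RATE = 300    # USD per finished hour, self-recorded, lightly edited (assumed)

def video_budget(finished_hours: float, rate_per_hour: float) -> float:
    """Estimated video production cost for one course."""
    return finished_hours * rate_per_hour

# Example: an 8-week MOOC with roughly 1.5 hours of finished video per week.
finished_hours = 8 * 1.5  # 12 hours
print(video_budget(finished_hours, HIGH_END_RATE))  # 51600.0
print(video_budget(finished_hours, LOW_END_RATE))   # 3600.0
```

The spread between the two estimates is one reason the next point matters: as discussed below, higher production costs do not necessarily translate into higher engagement or better learning outcomes.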

One may be tempted to think that the higher the video quality, the better the learning outcomes. However, a report published recently by EDUCAUSE, What Makes an Online Instructional Video Compelling?, suggests that students' engagement with videos relies upon several factors, including whether or not the video links to an assignment within the course. Furthermore, the average viewing time of videos is less than five minutes (Hibbert, 2014). What this suggests is that videos presenting content must be carefully and strategically planned during the course development phase, and tied closely to the instructional strategy. Higher production costs do not necessarily mean higher student engagement or better learning outcomes.

Closing Thoughts

The report discussed here, ‘MOOCs: Expectations and Reality’, is an important contribution to the MOOC discussion in higher education. In my opinion, one of the greatest benefits of the report is the spotlight it puts on the resources required to develop a MOOC, in contrast to the reasons why institutions engage with MOOCs. When one examines the reasons closely, it appears that the amount of resources invested is, in some cases, extreme. I agree with the point the authors make in the executive summary,

“[we]…conclude that most institutions are not yet making any rigorous attempt to assess whether MOOCs are more or less effective than other strategies to achieve these goals” (p 11).

I'll add one more point to this, and that's the need for a complete and comprehensive approach to course design (applicable to any course) that involves, from the beginning, a thorough needs analysis that determines the goals of the organization and how the [potential] course fits into them. It's only after this analysis that the course design process can proceed.

References:

Hibbert, M. (2014). What Makes an Online Instructional Video Compelling?. EDUCAUSE Review Online. Retrieved from: http://www.educause.edu/ero/article/what-makes-online-instructional-video-compelling

Hollands, F. M., & Tirthali, D. (2014). MOOCs: expectations and reality. Full report. Center for Benefit-Cost Studies of Education, Teachers College, Columbia University, NY. Retrieved from: http://cbcse.org/wordpress/wp-content/uploads/2014/05/MOOCs_Expectations_and_Reality.pdf

MOOC Design Tips: Maximizing the Value of Video Lectures

“Which kinds of videos lead to the best student learning outcomes in a MOOC?”
How Video Production Affects Student Engagement: An Empirical Study of MOOC Videos (Guo, Kim & Rubin, 2014)

An excellent question that design teams and instructors of MOOCs want answered—which kinds of videos lead to the best student learning outcomes in a MOOC? According to a recent study conducted by researchers for the edX MOOC platform, this was the most pressing question posed by the course design teams working with its partner institutions. Given that most MOOCs offered through higher education platforms such as edX, iVersity, or Coursera use video lectures as the primary content delivery source, it is a critical question that preoccupies many if not most MOOC instructional design teams. Adding to this need-to-know element is the fact that video production is most often the highest cost associated with MOOC production; it can range from a few hundred dollars up to thousands. This post suggests how institutions can use resources effectively in the video production process, with the primary goal of supporting students' learning outcomes.

The report released by edX last week gives design teams some concrete data to examine. I've emphasized below the recommendations and practical application points from the paper for readers who might be part of a design team for a MOOC or online course, and for those with an interest in producing instructional videos. There are limitations to the study, outlined in the paper, though the depth of the analysis does provide data worthy of consideration.

The report, the first of its kind according to the authors Guo, Kim & Rubin, analyzes students' engagement* with lecture videos, based on data extracted from over 6.9 million video-watching sessions across four edX courses. *Student engagement is defined in the study by:

  1. Engagement time: the length of time that a student spends on a video. This is the same metric used by YouTube, though the researchers acknowledge the limitation of assessing engagement from this one-dimensional perspective.
  2. Question/Problem Attempt:  Almost one-third of the videos across the four courses featured an assessment problem directly following the video, usually a multiple-choice question designed to check a student’s understanding of the video’s contents. “We record whether a student attempted the follow-up problem within 30 minutes after watching a video.”
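
For readers who work with course analytics, here is a minimal sketch of how these two measures could be computed from a log of watch sessions and problem attempts. The field names and sample data are assumptions made for illustration; they are not the schema used in the edX study.

```python
# Minimal sketch of the two engagement measures described above, computed from
# a hypothetical log. Field names and sample data are illustrative assumptions.

from datetime import datetime

watch_sessions = [
    {"student": "s1", "video": "week1-intro", "seconds_watched": 240,
     "ended_at": datetime(2014, 7, 1, 10, 0)},
    {"student": "s2", "video": "week1-intro", "seconds_watched": 90,
     "ended_at": datetime(2014, 7, 1, 11, 0)},
]

problem_attempts = [
    {"student": "s1", "video": "week1-intro",
     "attempted_at": datetime(2014, 7, 1, 10, 20)},
]

def engagement_time(video: str) -> float:
    """Metric 1: mean seconds of watch time per session for one video."""
    times = [s["seconds_watched"] for s in watch_sessions if s["video"] == video]
    return sum(times) / len(times)

def attempted_within_30_min(student: str, video: str) -> bool:
    """Metric 2: did the student try the follow-up problem within 30 minutes?"""
    ends = [s["ended_at"] for s in watch_sessions
            if s["student"] == student and s["video"] == video]
    attempts = [a["attempted_at"] for a in problem_attempts
                if a["student"] == student and a["video"] == video]
    return any(0 <= (a - e).total_seconds() <= 30 * 60
               for e in ends for a in attempts)

print(engagement_time("week1-intro"))                # 165.0
print(attempted_within_30_min("s1", "week1-intro"))  # True
print(attempted_within_30_min("s2", "week1-intro"))  # False
```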

Videos Types for MOOCs
Lectures are divided into two primary types for the study, [which mirrors most MOOCs]: 1) lecture videos for content delivery—presented via an instructor/professor (‘talking head’ is the term used in the paper), and 2) a tutorial/demonstration, a step-by-step problem solving walk-through, common in computer science courses, courses featuring mathematical concepts or science courses featuring lab demonstrations.

Video Production Format
For analysis purposes, researchers coded the videos examined in the study using six primary video production formats, which I've summarized below, along with production styles not mentioned in the study.

1) Lecture-Style Video Formats:

  • Instructor(s) with/without Presentation Slides: Features instructor(s) lecturing, with or without PowerPoint presentation slides inserted throughout, with instructor 'voice over' while a slide is displayed
  • Office Setting: close-up shots of the instructor filmed at his or her office, typically instructor speaks directly to camera
  • Classroom Setting: video captured from a live classroom lecture
  • Production Studio Setting: instructor recorded in a studio with no audience, typically speaking to the camera

2) Tutorial/Demonstration Video Formats:

  • Video Screencast: the instructor demonstrates a concept, e.g. writing code in a text editor or at a command-line prompt (in the case of computer science courses), or using a spreadsheet or document
  • Instructor Drawing Freehand on a Digital Tablet, using a software program, which is a style popularized by Khan Academy videos (click here to view an example)

Other Formats not mentioned in the study:

  • Instructor interviewing another expert or guest speaker
  • Instructor delivering a lecture in another setting related to the course (though not always), for example an ecologist giving a lecture at the beach, an art historian in a museum, etc.
  • Panel Discussion of experts on specific course-related topic

Which format to use? The primary factor that determines which format to use is the set of objectives of the MOOC or course, together with the course content. The course design team typically selects the video formats during the course design phase, when the instructional strategy is created: the formats of the videos are chosen, the content for each is selected, and related student activities or assessments are identified.

The second factor determining which format to employ is the amount of resources (dollars) available for video production. This determines right off the bat which tools, programs or hardware will be used for the video production. It is important to note that the amount of resources invested in video production does not scale with how much students learn or with MOOC completion rates. For example, I completed a course on Canvas Network, Statistics in Education for Mere Mortals (my course review here). The course featured video lectures and tutorials, all created by the instructor using low-budget technology. Lectures appeared to be filmed on the instructor's laptop using a webcam (PowerPoint slides were added, so there was some editing). Each module featured a tutorial, a screencast where the instructor demonstrated the application of various formulas to a data set. I found the professor, Lloyd Rieber, encouraging and personable; he also delivered the content concisely in the lecture videos and tutorials. Interestingly, the course completion rate was over 10%, higher than typical MOOC completion rates, which are usually lower than 7%.

Key Findings of Study

  • Shorter videos are more engaging. Student engagement levels drop sharply after 6 minutes
  • Engagement patterns differ between the two video formats; engagement was higher with the lecture-style videos ('talking head'), which researchers suggest is due to a more "intimate and personal feel"
  • Several MOOC instructors interviewed for the study felt more comfortable with the classroom lecture format; however, this format did not translate well online, even with much editing in the production studio
  • For tutorial/demonstration videos, the Khan-style format, where the instructor draws on a tablet and narrates, was found to engage students more effectively than screencasts. A contributing factor—instructors' ability to situate themselves "on the same level" as the student
  • Video producers and edX design teams determined that pre-production planning had the largest impact on the engagement effect of the videos. Researchers used a data set within the study to test this idea

Practical Recommendations for Course Design Teams

  1. Identify the type and format for each video lecture, using the course objectives, module breakdown, and budget as a guide. Plan each lecture for the MOOC format and its potential students. Consider copyright terms for images used in videos and slides; plan ahead by selecting appropriate, copyright-free images during the planning phase
  2. Invest in the pre-production planning phase. Segment course content into chunks, using six minutes per video as a guideline (a small planning sketch follows this list). Identify the purpose of each video lecture, and the key content points to deliver within each. Write a script for each [lecture video format] and have the instructor practice before filming—this reduces filming and editing time

  3. For tutorial/demonstration videos, introduce motion and continuous visual flow, along with extemporaneous speaking, so that students can follow along with the instructor's thought process. Complete a basic outline of the video beforehand, not a full script to be read word-for-word
  4. Provide a more personal feel to videos. Try filming in an informal setting (such as the instructor's office) where he or she can make good eye contact—it often costs less and might be more effective than a professional studio. Coach instructors to use humour, personal stories and enthusiasm where possible
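
As a small planning aid tied to recommendation 2, below is a minimal sketch that flags planned videos exceeding the six-minute guideline and suggests how many shorter segments each could be split into. The module outline is an illustrative assumption, not data from the study.

```python
# Minimal sketch: checking a planned module outline against the six-minute
# engagement guideline from the study. The sample outline is an assumption.

import math

SIX_MINUTES = 6  # engagement drop-off threshold reported in the study

planned_videos = [
    {"title": "Week 1: Course overview", "minutes": 4},
    {"title": "Week 1: Key concepts lecture", "minutes": 19},
    {"title": "Week 2: Worked example (tutorial)", "minutes": 9},
]

for video in planned_videos:
    if video["minutes"] > SIX_MINUTES:
        segments = math.ceil(video["minutes"] / SIX_MINUTES)
        print(f"'{video['title']}' ({video['minutes']} min): "
              f"consider splitting into {segments} shorter videos")
    else:
        print(f"'{video['title']}' ({video['minutes']} min): within guideline")
```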

Closing Thoughts
MOOCs are here to stay, which makes studies like this one valuable for helping educators be more effective through course design. This study brings us closer to answering the question: which kinds of videos lead to the best student learning outcomes in a MOOC? Yet it's only a start; there is still much more to be done in understanding how students learn in massive courses, and how institutions can invest their resources more effectively to improve student learning outcomes.


Nicholas Carr on 'Social Physics'…The Darker Side of Reality Mining

It's the article 'The Limits of Social Engineering' that piqued my interest this week: first because of the image featured in the article, which I found appealing; then because of the reference made to Marshall McLuhan, a scholar and author I admire greatly; and finally because it was by Nicholas Carr, author of the book "The Shallows", which I reviewed this week on my blog. But it's the article's unusual topic that grabbed hold of me by the collar and motivated me to share it with readers—something called 'reality mining'. Reality mining is an advanced branch of data mining and is central to the book "Social Physics: How Good Ideas Spread—The Lessons from a New Science", which Carr reviews and draws from in his article. Carr provides a good overview of not just the book but of the science, and hints at the potential ills of reality mining, or as the book's author calls it, 'social physics' (or 'mislabels' it, as several reviewers of the book on Amazon claim). In reality mining, researchers and scientists create algorithmic models using 'big data' generated by human movements and behaviours tracked by mobile phones, GPS, wearable tech or tracking devices to analyze and predict social and civic behaviour. Reality mining, with the expansion of mobile phone penetration globally in the past year and now wearable, internet-enabled devices, is likely the next big thing in data mining. Already many experts extol the virtues of reality mining and what it can do for institutions, society and the public good. As quoted on the book's website:

John Seely Brown, Former Chief Scientist, Xerox Corporation and director of Xerox Palo Alto Research Center (PARC):

“Read this book and you will look at tomorrow differently. Reality mining is just the first step on an exciting new journey. Social Physics opens up the imagination to what might now be measurable and modifiable. It also hints at what may lie beyond Adam Smith’s invisible hand in helping groups, organizations and societies reach new levels of meaning creation. This is not just social analytics. It also offers pragmatic ways forward.”  socialphysics.media.mit.edu/book

We can already catch a glimpse of reality mining taking shape in businesses and organizations. The WSJ featured an article this week by Deloitte that describes the target market for wearable devices, which is not consumers but organizations, or 'enterprise'. It seems there is unlimited potential for fitting employees with wearable tech devices to gather data to support better decision-making in the workplace.

Reality mining takes Big Data to a new level, and as Carr emphasizes, Big Data can and likely will be used to manipulate our behaviour. It's the idea of manipulation in this context that is disturbing. Several questions come to mind, like this one: who makes the decisions on the actions to take to manipulate a society's behaviour? And based on what values?

Below researchers describe how behaviour can be manipulated, as excerpted from “Social Physics” within Carr’s article:


Author of “Social Physics”, Alex Pentland will be teaching “Big Data and Social Physics” via the edX platform. Start date: May 12, 2014

“They go into a business and give each employee an electronic ID card, called a “sociometric badge,” that hangs from the neck and communicates with the badges worn by colleagues. Incorporating microphones, location sensors, and accelerometers, the badges monitor where people go and whom they talk with, taking note of their tone of voice and even their body language. The devices are able to measure not only the chains of communication and influence within an organization but also “personal energy levels” and traits such as “extraversion and empathy.” In one such study of a bank’s call center, the researchers discovered that productivity could be increased simply by tweaking the coffee-break schedule.”

Closing Thoughts
Like Carr, I too am somewhat wary of reality mining, or 'social physics'. Though in examining the works of Marshall McLuhan, whom Carr refers to in the opening of his article, I find wisdom in McLuhan's words, which so accurately describe what is happening now—within the realm of big data, for instance. The website managed by McLuhan's estate includes snippets of interviews, quotes and links to his works that are worthy of perusing and pondering. I found the quote below applicable and insightful when considered in the context of reality mining.

"In the electric age, when our central nervous system is technologically extended to involve us in the whole of mankind and to incorporate the whole of mankind in us, we necessarily participate, in-depth, in the consequences of our every action. It is no longer possible to adopt the aloof and dissociated role of the literate Westerner."  Understanding Media: The Extensions of Man (p 4)

Worth pondering, is it not?
