How Interactive is Your Online Course? Self-Assess with this Rubric

Online instructors and course designers can enhance existing online courses and create active, engaging courses by considering the five elements included in an adapted version of Roblyer and Ekhaml's "Rubric for Assessing Interactive Qualities of Distance Courses," described (and embedded) below.


Interactivity is a much discussed topic in online learning. It's considered the essential ingredient for quality learning. It's also considered the missing element in online learning, the element whose absence, critics claim, makes face-to-face learning superior. There is no question that interactivity is a necessary component of online, for-credit education. Three of the seven principles presented in Chickering and Gamson's seminal paper "Seven Principles for Good Practice in Undergraduate Education" (1987) stress interaction and active learning: (1) encourage contact between students and faculty, (2) develop reciprocity and cooperation among students, and (3) encourage active learning. Chickering and Gamson's principles are just as relevant to online education as they are to face-to-face instruction. Also worth noting is that several institutions use these same principles as a foundation for their best practices in both traditional and online education today.

Few would dispute that interactivity is necessary for quality online education, yet many educators are unsure how to make an online course interactive. Adding to the challenge, few resources outline strategies and examples for developing a course that stresses active learning.

The Rubric
Fortunately there is an excellent, instructive tool that serves as a starting point: "How Interactive are YOUR Distance Courses? A Rubric for Assessing Interaction in Distance Learning". I like this resource for its clear language, the specificity of the behaviors it describes, and its self-scoring capability. The rubric below is based on the concepts of the original rubric published in Roblyer and Ekhaml's paper. The revised rubric adds a fifth element, 'Evidence of Instructor Engagement', to the existing four, where each element defines interactive qualities of an online course. The updated version further develops each element, an improvement over the original: the elements are now worded to be specific to the interactive qualities brought about by:

  • course design (elements 1 and 2)
  • technology support function (element 3)
  • facilitation of the course (elements 4 and 5).

The three-page rubric embedded below is a PDF in Google Docs (hover your cursor over the right corner to expand the rubric). If you are unable to view the embedded file, click here to go directly to the document on Google Drive.

Are we measuring Interactivity or Interaction?
There is a critical distinction between interactivity and interaction in the context of online education. It's important to clarify: one concept involves technology, the other human behavior. Wagner, in "Interactivity: From Agents to Outcomes" (1997), describes interactivity as involving attributes of a technological application that delivers an interactive experience to learners, e.g. an interactive timeline embedded within a course home site, or a multiple-choice quiz that gives automated feedback. Interactions, on the other hand, usually involve the behaviors of individuals or groups that influence one another (Wagner, 1997). A forum discussion with exchanges between students is one example; an email exchange between student and instructor, or a live video conference chat, are others. As Wagner discusses in her paper, the differences are noteworthy, and they remain relevant today, as the term interactive is often used without clarification when describing online courses in discussions for assessment and accreditation purposes.

Conclusion
Creating and facilitating an online class that is interactive, one that promotes student activity and engagement, is challenging and complex. There are many variables involved, several beyond the control of the instructor and course development team. The rubric presented here provides a good starting point for considering some of the factors that contribute to creating active and meaningful learning experiences for students. If you have or use resources or strategies that are helpful for creating active learning, consider sharing them by leaving a comment so other readers may benefit. Thank you!


How and Why Institutions are Engaging with MOOCs…Answers in Report “MOOCs: Expectations and Reality”

  • How do institutions use MOOCs, and to what end?
  • Why do institutions pay thousands of dollars to develop and offer a MOOC on an external platform?
  • How do institutions determine the effectiveness of their MOOC efforts?
  • What are the costs associated with producing and delivering a MOOC?

All good questions; questions that policymakers, administrators and other stakeholders within higher education institutions that are considering MOOCs, or are already engaged with them, want [or should want] answered. The 200+ page report "MOOCs: Expectations and Reality" by Hollands and Tirthali of Columbia University attempts to answer these questions by interviewing 83 faculty members, administrators, researchers and other actors within 62 education institutions. The report delivers on the promise of its title, explaining how and why institutions engage, and provides the reader with even more insights.

The report is meaty, worthy of review by anyone with a vested interest in MOOCs of any type. In this post I provide a brief overview of the report, but focus specifically on one aspect of the 'how': the resources required to develop a MOOC, including how many people it requires, their job titles, and the [estimated] costs of development. This may be useful for readers considering developing a MOOC for a platform such as Coursera, or a cMOOC using a collective course design approach. The report brings into focus just how resource-hungry MOOCs are. After reading it, readers considering developing or contributing to the development of a MOOC might feel enlightened, encouraged, or perhaps even discouraged; at the very least, they will have a better understanding of MOOCs and their place in higher education institutions.

Overview

Who sponsored the report? The Center for Benefit-Cost Studies of Education (CBCSE), a research center at Teachers College, Columbia University. The mission of the center is "to improve the efficiency with which public and private resources are employed in education". Note: the report is open and available for download.

Purpose of the Study: Given the work of the CBCSE and its pursuit of cost efficiency in education, the report is an extension of the center's mission. As outlined in the report, "the study serves as an exploration of the goals of institutions creating or adopting MOOCs and how these institutions define effectiveness of their MOOC initiatives".

Figure 1 ‘MOOCs: Expectations and Reality’ (p 22)

Report Snapshot: The report sample includes 83 administrators, faculty members and researchers, all of whom were interviewed, at 62 institutions. The institutions include public and private universities, community colleges, platform providers, research organizations, for-profit education companies, and a selection deemed 'other', including one museum (p 180). Of the 62 institutions in the sample, 29 were offering or using MOOCs in some way at the time of the study; the remainder were either not participating or were taking a wait-and-see position.

Why a MOOC? One of the reasons this report is instructive for the education community is its data on why institutions offer MOOCs. Many have asked why some institutions, including several public higher education institutions, have spent thousands of dollars and invested considerable resources in a method of delivering education to the masses that has yet to be evaluated and tested for effectiveness. The table below summarizes the six reasons identified.

‘MOOCs: Expectations and Reality’ (Hollands & Tirthali, p 8)

The above table is merely a snapshot. Each goal is described in further detail within the report, and a case study featuring an institution accompanies each goal, giving a contextual example of the reason.

A snapshot of How? MOOCs are resource-intensive efforts, and the report validates this. Development of a MOOC, and facilitation of the course once it's live (accessible to students), require significant amounts of time and energy from individuals across several departments within the institution. The faculty member (or members) acting as the subject matter expert for the MOOC requires a team, each member with a different area of expertise, to support him or her in bringing the content to life and creating an environment of learning for hundreds, if not thousands, of course participants.

“Number of faculty members, administrators, and instructional support personnel involved: MOOC production teams seldom included fewer than five professionals and, in at least one instance described to us, over 30 people were involved. Faculty members typically reported spending several hundred hours in the production and delivery of a single MOOC” (p 11)

Example of Human Resources Requirements: Case Study 11

Case study 11 provides an excellent example of the commitment of resources needed to develop a course for a MOOC platform, in this case Coursera. The institution in the case is an unnamed Midwestern university (p 144). The school invited faculty with prior media experience to develop a five-to-eight-week MOOC. This study is representative of the human resources required to develop a MOOC.

Human resource requirements, by job title, for development of a MOOC course:

2 x Faculty Members: Subject Matter Experts

1 x Project Manager: Leads the project and coordinates all elements of development. Liaises with departments within the institution as needed. Manages the project timetable; keeps the project on time and on budget

4 x Curriculum Design Team: Instructional Designer (works with faculty to present course content and create a learning environment with it on the course home page) • Instructional Technologist (works with the instructional designer) • Video Production Liaison (works with the faculty member in the production of videos, and liaises with the video production team)

5 x Video Production Team: Production Manager • Camera operators/equipment technicians • Audio technician

In this case study, videos were produced at high quality, using a full video design team. The final costs were calculated using the institution's records, though the report authors made some estimates due to a lack of detail on some aspects of the human resource inputs.

'MOOCs: Expectations and Reality' (p 144). One of the two data tables accompanying case study 11; Table 7 gives the range of hours spent on MOOC design.

Lecture Videos: Costs and Student Engagement
One of the primary drivers of cost in MOOC development (for platforms such as Coursera, FutureLearn, etc.) is video production. The more complex the video (added graphics, multiple cameras used for shooting, post-filming editing), the higher the costs. Low-tech efforts, where there might be one camera person, or even the faculty member self-recording on his or her laptop, require far fewer resources. Some institutions seek a higher-quality finished product, which in turn demands a high level of production using a team of video professionals. Accordingly, costs vary dramatically. 'MOOCs: Expectations and Reality' estimates high-quality video production at $4,300 per hour of finished video (p 11); at that rate, a course with, say, ten hours of finished video would incur roughly $43,000 in video production costs alone.

One may be tempted to think that the higher the video quality, the better the learning outcomes. However, a report published recently by EDUCAUSE, "What Makes an Online Instructional Video Compelling?", suggests that student engagement with videos relies upon several factors, including whether or not the video links to an assignment within the course. Furthermore, the average viewing time of videos is less than five minutes (Hibbert, 2014). This suggests that content videos must be carefully and strategically planned during the course development phase, and tied closely to the instructional strategy. Higher production costs do not necessarily mean higher student engagement or better learning outcomes.

Closing Thoughts

The report discussed here, 'MOOCs: Expectations and Reality', is an important contribution to the MOOC discussion in higher education. In my opinion, one of the greatest benefits of the report is the spotlight it puts on the resources required to develop a MOOC, in contrast to the reasons why institutions engage with MOOCs. When one examines the reasons closely, the amount of resources invested appears, in some cases, extreme. I agree with the point the authors make in the executive summary,

“[we] …conclude that most institutions are not yet making any rigorous attempt to assess whether MOOCs are more or less effective than other strategies to achieve these goals” (p 11).

I'll add one more point: the need for a complete and comprehensive approach to course design (applicable to any course) that begins with a thorough needs analysis to determine the goals of the organization and how the [potential] course fits into them. Only after this analysis can the course design process proceed.

References:

Hibbert, M. (2014). What Makes an Online Instructional Video Compelling? EDUCAUSE Review Online. Retrieved from: http://www.educause.edu/ero/article/what-makes-online-instructional-video-compelling

Hollands, F. M., & Tirthali, D. (2014). MOOCs: Expectations and Reality. Full report. Center for Benefit-Cost Studies of Education, Teachers College, Columbia University, NY. Retrieved from: http://cbcse.org/wordpress/wp-content/uploads/2014/05/MOOCs_Expectations_and_Reality.pdf

The Next-Big-Thing in Online Education…Learning in Real Time

This article examines the potential of synchronous communication in online education by looking at the newest tools and platforms that facilitate real-time group communication, and at the pedagogy associated with implementing synchronous communication tools in asynchronous learning environments.


Communicating in real time from a distance has never been easier. There are numerous new platforms and applications (apps), available free of charge, that are easy to use and facilitate seamless communication between geographically distant people with access to a smartphone or laptop. After reading a WSJ article reviewing several smartphone apps that facilitate real-time communication among small groups, I realize that the time is coming when synchronous tools will bring online education to the next level. Over the last two years there has been a flood of free apps and platforms on the market that break down distance barriers and allow people to communicate from a handheld mobile device, tablet or laptop. One example is group video conferencing. There are now several free web-conferencing tools for groups that also feature document and screen sharing, including Google Hangouts, the newly launched appear.in (video conversations for up to 8 people), and meetings.io (also free). These platforms knock down the once insurmountable barriers to video conferencing in education: student access, and technology that was cumbersome and expensive.

“A key aspect of this is the consideration of approaches to capitalizing on the capacity of video communications to reduce isolation and increase personalization of learning experiences for distance students. Indeed there is now scope for the empowerment of distance learners and an opportunity to offer a much wider choice of strategies intended to enhance and support learning (Smyth & Zanetic, 2007). Indications from the research literature are exciting.” (Andrews, Tynan, Smyth & Vale, 2010)

However, one significant barrier remains when considering synchronous tools for education settings, and that is pedagogy. The same paper describes the barrier crisply: “from a practitioner's point-of-view, the challenge will come from the need to be flexible, adaptive and innovative. In other words, the need is to rapidly develop new understandings of pedagogies to best utilize the person-to-person interactivity of emerging technologies” (Andrews et al., 2010).

A group of students in an online hangout using the platform meetings.io

The Great Potential: Synchronous Tools for Online Education
These apps and platforms hold great potential for online education: seamless real-time chats and video discussions that can facilitate peer-to-peer and educator-to-student exchanges, foster social connections, provide learning support and feedback, and create a space for discussing concepts and ideas in a way that asynchronous communication cannot. The new technology brings with it numerous possibilities. But though the potential is great, so are the challenges of implementation. As with any educational technology, the purpose for using the tool has to make sense and has to fit the curriculum in a pedagogically sound way that supports learning and achievement of the course objectives.

“Although video conferencing has been around for some years, in many cases the use has not been informed by rigorous research leading to sound pedagogical practices. … videoconferencing has frequently copied the typical format of didactic lecture-style delivery rather than exploring approaches…” (Andrews & Klease, 2002)

How to Implement Educational Technology, e.g. Synchronous Tools
Before getting to highlights of the research addressing synchronous tools in online education, I'll emphasize what needs to happen prior to implementing educational technology in a learning environment: essentially, a needs analysis. The first step is asking questions such as, "What educational problem are we trying to solve? What method can we apply that addresses the problem? What tool will best support that method within the learning context?"

To be more specific about implementing the synchronous tools discussed here, the question might be, "How can a synchronous tool be used to improve learning outcomes, or address a learning need that is not being met within asynchronous online classes?"

The answers to these questions guide the learning design process. The next step is where the real work of course design begins: developing the learning strategy to meet the learning objectives, ideally by following a model of learning or instructional design. [I write extensively about instructional design. A good post for readers interested in learning more is "Start Here": Instructional Design Models for Online Courses.]

Learning Challenges Synchronous Tools Can Solve
Synchronous tools are not a given for every online course; whether to use them depends upon a number of factors determined during the course design process. To give readers an idea of the types of situations where synchronous tools may be used, I've included excerpts from Kansas State University's webpage 'synchronous course delivery' from its e-learning faculty modules site. Note that it's not always the instructor who will use synchronous tools, but also learning counselors, tutors, small groups of students and others.

“Online real-time may be used for a number of learning purposes. There may be a small window of time when an online class may access a digital lab; a simulation; … an interactive streamed event.

… to introduce learners starting a cohort-based program. … there may be icebreakers to help people connect online …

… for academic and professional advising and counseling. It may be used for group or expert critiques of student designs and e-portfolios.

… for student group work, collaborations, and study sessions. Learners may interact with each other for problem-solving, planning, co-design, or strategy sessions.

If there is not a need for synchronous learning, then it may well be better left alone.” (e-learning faculty modules, 2012)

No Talking Heads
One of the papers reviewed here, from the International Journal of Educational Technology, provides sound advice based upon the research. One point worth highlighting: synchronous tools should not be used as a one-way medium in which the instructor delivers information in real time. Instead, they should be viewed as a vehicle for the exchange of information, accommodating three- or four-way [or more] conversations that build learning, ideas and learners' motivation. The synchronous communication medium should be reserved for exchanges that support a course objective or other learning-related function that can't be accomplished through asynchronous methods.

“In other words students find the talking head presentation to be undesirable. This finding is not a new one (Commeaux, 1995; Schiller & Mitchell, 1993)…” (Andrews & Klease, 2002)

Research Highlights
Below is a selection of highlights from the papers referenced in this post, outlining the impact of, and considerations for, synchronous methods used in online education.

1) Building Social and Teaching Presence: More than one study suggests that synchronous communication activities support social needs of online students not typically met in the asynchronous format: "Social support is desirable as a way to foster knowledge work and collaborative learning; it provides an environment where communication is encouraged; e.g., anecdotes and personal experiences encourage trust, which fosters a receptive and creative learning environment" (Hrastinski, 2008).

Synchronous activities contribute to the building of social presence, one of the three dimensions of the Community of Inquiry (CoI) framework, a frequently referenced model describing the conditions for an optimal online learning experience (Garrison, Anderson, & Archer, 2000). When all three dimensions (teaching presence, social presence and cognitive presence) are present, the student can experience deep and meaningful learning. Purposefully developed synchronous [and asynchronous] activities can contribute to building social and teaching presence, as supported by the research cited here.

2) Group Size: The purpose of a group activity, as determined by the course design process, will dictate the appropriate group size as well as the best tool or platform to support it.

“It is worth noting here that multi-point videoconferencing is most effective with small groups of students (20 to 25 across 3 or 4 sites) as stated by Mason (1994), cited in Burke, London and Daunt (1997)…” (Andrews & Klease, 2002)

Video and hangout platforms each limit the number of individuals participating at once, as do chat platforms, which again reinforces why the instructional strategy created during the course design process is critical. The meetings.io platform, for instance, allows up to five people per hangout, ideal for a small-group discussion, while Google Hangouts accommodates up to ten, which may suit a meet-and-greet session held at the beginning of a course.

Chat platforms, for example WhatsApp, might be used effectively for group discussions, e.g. one question related to a course topic where students contribute initial thoughts and exchange ideas, followed by an asynchronous forum discussion continuing the conversation.

3) Differences in Time Zones: One often-cited drawback of synchronous tools is that students live in different time zones. In closed, for-credit online courses, however, this is less of an issue than in massive courses that cater to a worldwide audience (and even in those instances, there are ways to accommodate learners in different time zones). In my personal experience with synchronous activities in closed online classes, most students are willing to adjust their schedules to participate, more so when activities have a clear purpose and appear worthy of students' time.

“Students were willing to deal with the problems of time difference in order to take advantage of this opportunity, which, on this occasion, resulted in very early classes. Additionally, they liked the experience of interacting with a wider peer group and of learning from each other’s different knowledge-base and backgrounds.” (Andrews & Klease, 2002)

4) Instructor and Student Familiarity with the Tool: As with any technology used in online education, familiarity with the technology is essential to establish the foundation for a successful learning outcome. The institution is responsible for providing professional development for faculty and instructors, working with course designers/instructors to build in course time for students to practice with the tool, and making available resources that support students (and faculty) with technical issues.


Udacity & San Jose U Halt Online Course Experiment: Lack of Instructional Strategy Undermines Courses

“We want to fail fast, learn from it and move on.”
Mohammad Qayoumi, San Jose State University President (San Jose Mercury News)

In his statement [above], San Jose State University's President, Mohammad Qayoumi, was referring to the dismal results of his university's online course experiment with MOOC provider Udacity. The majority of students enrolled in the three pilot project courses failed; more students flunked the online courses than flunked the same face-to-face courses at San Jose State University [SJSU].

Qayoumi's statement is unsettling. He appears to gloss over the fact that the failure applies to students, students in pursuit of higher education. In his statement the university president sounds more like the CEO of a corporation than the leader of a public institution. Yet students are not consumers of a product who are merely inconvenienced by a faulty design. In fairness to Qayoumi, his statement may have been taken out of context; he may have elaborated on the adverse effects experienced by students. An alternative viewpoint is that Qayoumi appears eager to learn from the mistakes made and move forward.

What is more pressing than Qayoumi's comment, however, are the problems within the three pilot courses, as revealed by Thrun in a frank interview with EdSurge (Corcoran, 2013). The problems were significant and numerous: several students did not have access to computers to log onto the course site and complete their coursework; there were "clerical" errors within the curriculum; communication with students was poor; and expectations and assignment deadlines were unclear. It's likely that the problems were exacerbated by the time crunch, with Udacity and SJSU working independently [likely not collaboratively] to meet tight deadlines, resulting in a disjointed curriculum.

Background
This pilot project between Udacity and SJSU was initiated last year by Governor Jerry Brown, when Brown contacted Sebastian Thrun, co-founder of Udacity, and asked for his help. The California state governor was looking for ways to improve access to impacted classes for college students in the state's public universities. A deal was struck whereby Udacity would support three classes created by professors at SJSU: Developmental Math (entry-level math, Algebra review), College Algebra and Elementary Statistics. In the remedial math class, only 29% of students passed, versus 80% in the face-to-face class; in the Algebra course, 44% passed versus 74%; and in the Statistics course, 51% passed versus 74% in the same in-class course.

What was Missing? A Course Design Strategy
It is not only disappointing but surprising that such basic, fundamental course design errors were made in the pilot courses; errors [and omissions] that could have been avoided had there been (i) instructional design support, (ii) a team approach to course development, and (iii) more time allocated to course development. Granted, there was a host of issues the program faced, not to mention the skill deficiencies of the remedial students. The program's failure is not due to one specific omission or error; however, I suggest that poor course design was a significant factor, and I'll outline why below. By doing so, I hope to provide insight that may be helpful to readers who are currently developing, or planning to develop, their own online courses. Perhaps even institutional leaders considering programs similar to SJSU's may find something of value here.

In my experience with online course design, I've found that for online for-credit courses to succeed, a comprehensive instructional plan and strategy are essential. An instructional strategy is the blueprint for the course development process. In addition to a strategy, a team approach to course design and implementation is required. Many elements of the online classroom environment require expertise unique to this educational delivery method. Professors are the subject matter experts, but they are not necessarily experts in web page layout and design for the course site, media selection and development, use of open educational resources, etc. Nor might they have the online instruction skills required to create a sense of community, presence, and a culture of learning in the online classroom.

Lessons Learned from Udacity & San Jose State University
Thrun provided candid responses to EdSurge when discussing the program, sharing its failings and the mistakes made. Readers will appreciate his honesty and his willingness to be open about the program's challenges. This is a positive move; it allows other institutions and educators to learn from these mistakes. Below I present four fundamental principles that course designers and educators should consider when developing their own online for-credit courses, in light of what was done, or not done, within the SJSU/Udacity pilot. After each, I include Thrun's feedback as outlined in the EdSurge interview.

1. Analyze learners and learning contexts. This is the first phase in the process of designing curriculum, yet likely the most neglected. The step is essential; it shapes the direction of the course, and in some cases reveals that the course should not be developed as planned in the first place. The learner analysis phase involves examining students' skill levels [technical and educational], cultural backgrounds, attitudes and motivations for learning, etc. Another consideration with online learning is how students will access content and instruction, i.e. what platform will be used [the learning context]. Where Udacity/SJSU went wrong: they failed to analyze the learning context (how learners would access the course), and they failed to address the level of skill required for course completion, both technical skill and prerequisite knowledge. As an outsider looking in, it appears that delivering a remedial course in this format to this group of students may not have been the best course of action.

“As the class progressed, Udacity also realized that many of the students simply couldn’t get to a computer regularly enough. For some students, says Thrun, ‘there were none in the home, [and] even in school they couldn’t get the hours needed to make progress … It was actually a big deal.’”

“When students did get to the online programs, even navigating the computer systems could be daunting. One of the questions that tutors were frequently asked was how to do exponential notation on a computer”.

“…many students lacked even elementary-school-level mathematics knowledge…”

2. Establish clear and concise assignment guidelines and expectations. This is critical to student success, yet it is often the most overlooked step in online course design. Students require concise, detailed and thorough instructions for all assignments, activities, forum discussions, exams, etc. Where Udacity/SJSU went wrong: they did not provide enough guidance for students, who were likely confused, disorganized and frustrated by not knowing what assignments were due or what was expected. Time management is always a challenge for online students; without deadlines in a for-credit course, students fall behind and become overwhelmed, more so towards the end of the course.

“Then the unanticipated problems started to crop up. When the courses started, two of the three classes didn’t give students precise deadlines for assignments. ‘We communicated our expectations poorly,’ concedes Thrun.”

3. Establish the channels and avenues of communication upfront. Create instructor presence and a sense of community in which students feel comfortable asking for help. Students need to know at the start of the course what academic support is available and how to access it. Essential to successful learning in the online environment is building a sense of community where students feel connected and know where to go for academic help and technical support. Where Udacity/SJSU went wrong: they did not communicate that support was available, a tremendous oversight given that one of the courses was serving remedial students.

“Initially many students were unaware of the online tutors (who are real people) who were available online to help, 12 hours a day. But over the weeks, it became clear that the tutoring services were crucial”.

4. Allow adequate time for course development. Where Udacity/SJSU went wrong: the deal between Udacity and SJSU was complex and required extensive negotiation, which consumed much of the available time. The time allocated to course development was less than adequate; the courses were not completed before the program launched, but were still being designed while in session.

“The professors creating the curriculum for the program didn’t have much time; they were still writing curriculum when the courses began.”

“We had a whole bunch of clerical mistakes. In most cases we heard about it, and fixed it on the fly. It happens in the classroom as well.”

Conclusion
Despite the disappointing results of the San Jose pilot, there may be value in the experiment if constructive dialogue results. However, I am concerned about the students who failed in the pilot project, especially those in the remedial courses. I am also concerned that the soon-to-be-published follow-up report will focus on the decision-making process of the institution, and on 'who didn't do what' at the leadership level, rather than on the students. Also disturbing is the apparent lack of expertise in online learning pedagogy, course design and student support systems in this program. If this serious shortcoming is not recognized as such, online learning in general might be regarded as ineffective and/or inappropriate for college students. There are numerous successful online and blended programs in higher education institutions throughout the United States and beyond, but the results of this complicated, high-profile pilot project, given the lack of expertise and the course design errors inherent to it, may hinder the dialogue about online education and its role in higher education. I do look forward to reading the results of the forthcoming analysis and report from SJSU, and hope that it prompts constructive dialogue among the decision makers involved.
