Tag Archives: Student engagement

More Essential and Helpful Resources for Online Instructors

This post features a collection of carefully selected resources specific to teaching online, geared to educators seeking skill development in creating meaningful online discussions, communicating effectively with students, and providing constructive feedback.

This is the second article in a series featuring select instructional resources. I'm in the process of building a bank of resources, accessible from this blog, geared to educators seeking skill development in facilitating and designing online courses. Over time I'll be adding to the Resources section with the goal of sharing high-quality, relevant and helpful resources. This post includes resources grouped by topic, with a brief description of each and an icon indicating its type. For the list of previously featured resources and for the icon legend, please refer to the resources tab of this site.

IV. Personalized Instructor Feedback and Interactions with Online Students

The level of instructor involvement [or not] in online learning environments is a controversial topic in the education sector. With automated grading programs and LMS platforms that provide automated yet 'personalized' feedback based on student response scores, log-on patterns, and keystroke patterns, a growing camp of educators is convinced that learning is not compromised in the absence of an instructor, and may even improve with programmed feedback. Intuitively, I disagree. I see the need for personal interaction and support from an experienced and interested educator. In this section I've included a collection of resources that support the premise that interaction and feedback are critical to student success.

pdf1. This literature review explores far more than instructor feedback and interaction in online learning spaces, yet it is worth including here given that it addresses current research on online learning effectiveness in terms of learners' interactions with their instructors and classmates. The specifics can be found on pages five through eighteen: Learning Effectiveness Online: What the Research Tells Us, Swan, K. (2003).

Videos2. Giving feedback and interacting with students is no less important in the online classroom than in the face-to-face one; yet doing so requires instructors to be strategic and purposeful in their communication with students, and it requires a different perspective. This three-minute clip, Interact with Students, featuring the program chair from Penn State World Campus, summarizes how and why faculty involvement with students online differs from, and is just as crucial as, involvement in face-to-face classrooms.

blogicon3. I wrote a blog post, 'Speaking' to Students with Audio Feedback in Online Courses, about providing audio feedback on student assignments in place of written feedback. The idea came from a communications professor I follow on Twitter who had great success with this method; her students loved it. Apparently so do many other students [and instructors], based upon the feedback and reaction from readers. The post explains how to give audio feedback and what tools to use. The comments within the post are also helpful.

Website Link4. This web article provides three solid strategies for communicating with online students, as a class and individually. Though not specific to skill development for educators, there is helpful information here including how to use a rubric for structuring feedback for students: How to Provide Fair and Effective Feedback in Asynchronous Courses, Gruenbaum, E. (2010).

V.  Fostering Asynchronous Student Discussions

pdf1. Asynchronous discussions incorporated into the curriculum of online courses can build student engagement and support higher levels of achievement and learning. However, for forum discussions to be successful and not viewed by students as busy work, they must be thoughtfully planned before the course begins, and facilitated and monitored once the course is underway. This peer-reviewed article provides the foundational knowledge that educators require to construct the conditions, parameters, and student guidelines for successful and meaningful asynchronous discussions: Essential Elements in Designing Online Discussions to Promote Cognitive Presence — A Practical Experience.

Videos2. This six-minute video, Conducting effective online discussions, from the COFA series Learning to Teach Online, provides educators with strategies for managing and facilitating effective online discussions and for engaging students in the process. I can't say enough about this series from COFA: skill development in a concise format, honed to specific topics, that educators can access easily when needed.

Website Link3. There are several essential elements inherent to successful asynchronous discussions, and this web article, 5 Tips for Hosting Online Class Discussions, summarizes the five core elements, including the need to grade student contributions. From my experience, assigning a grade for discussion contributions is necessary to foster participation in for-credit classes; using a rubric that outlines expectations also increases the chances of higher-quality contributions.

As mentioned previously, this is the second post where I’ve shared a set of resources, and I’ve been encouraged by the number of positive responses and excellent suggestions. Thank you! There’s more to come, and in my next post that features resources, I’ll share ones specific to instructional design and pedagogy.

Learning Analytics for Instructors Need to be Loud and Clear

Learning Analytics…less data more insight. Analytics' primary task is not to report the past, but to help find the optimal path to the desired future. (Shum, 2012)

Learning analytics [analyzing students' online behaviour patterns to support learning improvement] is not about data collection, but about helping learners and instructors make connections with the data. This past week I attended a Campus Technology webinar, Grand Canyon University: How we are improving student outcomes using LoudAnalytics on the LoudCloud Ecosystem. Grand Canyon University of Arizona shared results from its learning analytics pilot project using LoudAnalytics from LoudCloud, a company which presents itself as a learning ecosystem, the next generation of learning management systems. In this post I identify which analytic reports are essential and most useful to course instructors, which are not, and why. The findings in this post were gathered from the webinar and from content in week four of the course Current/Future State of Higher Education.

'Meaningful' Data for the Instructor
I wrote a post last week that addressed how student data, gathered from online behaviours on a school's platform, can put the learner in the 'driver's seat', essentially in control of his or her own learning. A dashboard that gives real-time information on a student's 'stats' can be a visual tool to help learners reach their goals, identify problems, and stay motivated. However, what about the course instructor? What analytic tools are available through the LMS platform that can provide meaningful data, data that is consumable – in a usable form that encourages instructors to take action in real-time?

Grand Canyon University Webinar,  Slide #14

To the left is an example of a report from LoudAnalytics that displays data about students' progress in a visual format. Students are represented by circles; the size of each circle represents the hours spent on the course home page (interacting with course content, etc.) and its colour represents a letter grade. I see this as a holistic 'snapshot' of students' progress, but I don't see this report on its own as providing 'actionable' data. Time spent within the LMS does not always translate to grades and engagement level; it is just one metric.

Grand Canyon University Webinar, Slide #47

The report to the right, however, does appear to provide constructive data for the course instructor. Considered alongside the previous report, it gives the instructor something to act on. For example, upon review, the instructor might want to reach out to student #2 (and potentially one or two others) with an email that might read like this:

"Dear [name of student], it appears that you have an assignment outstanding, and have not participated in a forum recently. I am concerned about your progress in the class. There are several resources available for support…"

There are limitations to the scenario I've described here; it is one-dimensional, given that we don't have complete information. But the idea is that the indicators provided in this report are specific about student actions, or non-actions, giving the instructor something to work with.
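The kind of 'actionable' flag described above can be expressed in a few lines of code. Below is a minimal sketch in Python; the record fields (`missing_assignments`, `last_forum_post`), the student names, and the ten-day quiet threshold are all illustrative assumptions, not part of LoudAnalytics or any real LMS.

```python
from datetime import date

# Hypothetical student activity records; field names are illustrative,
# not taken from any real LMS export.
students = [
    {"name": "Student 1", "missing_assignments": 0, "last_forum_post": date(2012, 11, 12)},
    {"name": "Student 2", "missing_assignments": 1, "last_forum_post": date(2012, 10, 28)},
    {"name": "Student 3", "missing_assignments": 0, "last_forum_post": date(2012, 11, 10)},
]

def needs_outreach(student, today, quiet_days=10):
    """Flag a student who has an outstanding assignment or has been
    silent in the discussion forums for more than `quiet_days` days."""
    silent = (today - student["last_forum_post"]).days > quiet_days
    return student["missing_assignments"] > 0 or silent

today = date(2012, 11, 14)
flagged = [s["name"] for s in students if needs_outreach(s, today)]
# flagged -> ["Student 2"]: one outstanding assignment, silent 17 days
```

The point of the sketch is that only specific indicators (a missing assignment, forum silence) trigger outreach; raw minutes-in-platform never enters the rule.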

What Data is NOT Helpful
It is information about student actions, i.e. missing assignments, non-participation in discussion forums, low test grades, that is valuable for instructors; this is what I call 'actionable' data. Other data, such as the number of times a student logged on to the course home page, or the number of minutes spent within the platform, is not meaningful or of much practical use. I suggest that platform providers (e.g. Moodle, LoudCloud) consider generating reports that are focused and specific to users' needs (with users defined within three groups: student, instructor and administrator). However, making too many reports available will detract from the value of the analytics. For example, the report below shows the time in minutes a student spent within the LoudCloud system, which gives a snapshot of student behaviour; but I don't see how this information is useful for the instructor. Perhaps it might be, if considered in conjunction with other reports, but then we get into data overload.

Grand Canyon University Webinar, Slide #48

Furthermore, just because we can measure something doesn't mean it is valuable or even useful. Another example is the program that CourseSmart, the e-textbook provider, is launching to give course instructors reports on student engagement. I wrote about this last week, yet I use it again as an example of how reports created from data can end up being inconsequential.

It [CourseSmart's program] will track students' behavior: how much time they spend reading, how many pages they view, and how many notes and highlights they make. That data will get crunched into an engagement score for each student. The idea is that faculty members can reach out to students showing low engagement (Parry, 2012).

I have a hard time imagining how instructors will use this information. The problem from the get-go is that CourseSmart assumes that student engagement is defined by the number of electronic 'notes' made in the e-book and by how long the student spends 'reading' the textbook. Not only is this logic flawed, but as one of my readers pointed out, it has a 'big brother' feel about it. I agree, and I will be writing about the ethics of learning analytics next week.

Closing Thoughts
Learning analytics can be a powerful tool for instructors, yet only when meaningful data is compiled in a way that is user-friendly, relevant and actionable; in other words, reports must be loud and clear. LoudCloud is onto something here; I very much like their visual presentation. Yet LoudCloud and other LMS providers need to narrow down the number of analytic reports made available, customizing what they offer to users' needs. Make it clear, specific and meaningful.

Next post: Dream or Nightmare: The Ethics of Learning Analytics, Online Learning Insights

Grand Canyon University: How we are improving student outcomes using LoudAnalytics on the LoudCloud Ecosystem. (November 13, 2012) Campus Technology Webinar (now on demand)

LT-C2012 Learning Analytics Symposium, (2012),  Simon Buckingham Shum, Slideshare
Introduction to Learning and Knowledge Analytics Syllabus, (2011), An Open Course
Putting Learners in the Driver’s Seat, Online Learning Insights

How Learning Analytics Can Make Instructors More Effective in an Online Course

This is the first in a three-part series on learner analytics: cutting-edge insight for the course instructor on how to assess student behaviours in an online course, using the LMS data collection tools, in order to support more effective course design and instruction.

Most course instructors strive to create a class where students are engaged with the content and eager to learn and participate. The indicators of student engagement in a face-to-face class are straightforward enough: attendance, participation in class discussions, and visits to the instructor during his or her office hours.

Measuring student engagement in an online course is more complex. However, with learning management systems [LMS] such as Moodle and Blackboard now in use in virtually all educational institutions, there is a treasure trove of data on student behaviour. This data has the potential to tell a story about a student's engagement, and even to predict student success within a course. Each click or 'view' of a web page or resource on the course home page is recorded in the activity database, along with the time spent on each. The LMS platform becomes not only a resource provider and virtual space for students, but a source of information for instructors about student behaviour and actions.

Consider the potential if course instructors could access and interpret the data collected on students' actions in a few simple steps. The good news: this is not only possible, but takes minimal time on the instructor's side, yet reaps big rewards in terms of feedback on what is, and is not, working within a course. Online instructors who can assess patterns of student behaviour and interaction with course content and learning activities can be responsive and adjust their teaching accordingly.

Correlation between Engagement and Student Performance
Before exploring further, it is necessary to identify the purpose of measuring student engagement in terms of data analysis, to frame the discussion. Several studies have determined that a strong relationship exists between students' LMS usage and academic performance. California State University, Chico found that more time spent on learning tasks within the LMS ['dwell time'], along with a high number of visits to the course home page, was associated with higher student grades (Whitmer, Fernandes, & Allen, 2012). Another study, conducted by scholars at Central Queensland University using a sample population of 92,799 undergraduate online students, reported a statistically significant correlation between the number of student views of the course home page and students' final grades (Beer, Clark and Jones, 2010). The more 'views' or visits to the course home page, the higher the final grade.
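The kind of relationship these studies report is a simple correlation between two columns of numbers: home-page views per student and final grade. A minimal Python sketch follows; the six data points are invented for illustration only and are not drawn from the CSU Chico or Central Queensland datasets.

```python
from math import sqrt

# Illustrative data only: views of the course home page and final
# grades for six hypothetical students (not the studies' data).
home_page_views = [12, 30, 45, 8, 60, 25]
final_grades    = [62, 74, 81, 55, 90, 70]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(home_page_views, final_grades)
# r close to 1.0 here because the toy data rises monotonically;
# real course data is far noisier, which is why the studies also
# test for statistical significance.
```

A coefficient near 1 indicates that views and grades rise together; it says nothing about which causes which, a caveat that applies to the studies above as well.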

Academic Analytics
The amount of data stored in educational institutions is gargantuan, and the new term for this data collection is Big Data. According to the McKinsey Global Institute, the education sector ranks among the economy's top ten in terms of the amount of data stored. The question becomes: what do we do with it? At the institutional level, there are numerous opportunities for data analysis; schools can identify many patterns, gaps in student performance being just one example. Arizona State is an example of an institution that uses data analytics extensively, and has done so for several years with sophisticated analysis programs.

However, in this post we are looking at the micro level: how the course instructor can use the information stored within the course to improve instruction and support students. I've outlined below a few practical suggestions to get started with the basics of analytics.

Practical Applications for Course Instructors
Within virtually all learning management platforms there are reporting features that course instructors can access to display student data. Below are questions instructors may have about the students in their course, which the data can answer through various generated reports.

  • Which course resources/tools are being used most frequently? Video clips, posted documents, etc.
  • How often are students logging onto the course?
  • When did the student review the assignment instructions? Submit an assignment?
  • Which discussion boards generate the most traffic – have the most student views? This is different from the number of discussion board postings, as many students may view [and read] the posts but not contribute.
  • When was the last time students logged onto the course? How many times per week are students logging on?
  • What are the patterns of performance in online tests? By question?

Learning to Use the Reports
Learning how to use student data is not complicated once you know where to access the information. I've included a selection of brief videos below [average length: three minutes], all created by course instructors from various institutions, that demonstrate how to access student reports in Moodle and Blackboard. In my next post I'll delve into what student engagement can tell you about your course design, how to adapt instruction to be more effective, and how to troubleshoot student problems, based upon my experience analyzing the online courses at my workplace.

Click here for part two of this series, How Course Instructors Can Improve Student Engagement with Analytics.

Resources: How To Videos

Photo credit: Big Data, metaroll’s photostream, Flickr