How Course Instructors can Improve Student Engagement with Learning Analytics

Online learning is dynamic, active, and at times disorganized, yet with the effective use of tools, instructors can adapt and adjust instruction to create a rich learning experience. This is part two in a three-part series on learning analytics.

Learning analytics is a powerful tool that can help instructors adapt their courses to create an engaging, robust learning environment. Analytics, one of the newest tools for improving student learning, is an application of Big Data, once used exclusively by big-box retailers such as Wal-Mart, Home Depot and by Fortune 500 companies, that has now become mainstream in education. Learning analytics in the education sector collects data whenever students log on to their institution’s learning management system [LMS], such as Blackboard or Moodle. Each student click, known as a ‘view’ in Moodle, is associated with a time stamp, creating a record of the time students spent with each resource. Analytics involves the mining, analysis and reporting of data, which creates [potentially] useful information about student learning. The Society for Learning Analytics and Research (SoLAR), dedicated to exploring the role of analytics in teaching and learning, defines it this way:

Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. (SoLAR)

But notice that I wrote potentially: I believe the data alone does not tell a complete story. In this post I’ll share how course instructors can use analytics as a tool to improve engagement and the quality of learning outcomes by examining activity data alongside student outputs [writing assignments in this case]. I use an example from my workplace to show how we: 1) assessed ‘data’ from LMS reporting tools, 2) identified how students interacted with course content through learning activities, and 3) made course design decisions for this course and others based upon the results.

How to Use LMS to Analyze Course Design
Within an LMS [Moodle in this example], reports are available that show how students engage with each resource by ‘views’ (clicks), for the entire class or even by individual student. A resource, for the purpose of this post, is defined as course content [a web page, outside link, video clip, etc.] or an activity [discussion board, wiki project, etc.]. In this post I use a course from my workplace as the example: an online history course for credit with eight students.

We examined students’ patterns of interaction with content, and their application of it, through the class discussion boards and learning activities. Learning activities consisted of students interacting with content in the form of pre-selected web resources, which provided an interactive learning experience requiring exploration and examination of primary sources. Discussion boards consisted of student responses to instructor questions that addressed textbook reading and/or content from pre-recorded lectures. Below is a screenshot of the activity report for one week of the course [week five], which includes the collective views by students for each resource.

Screenshot of a section of a Moodle report, which displays the number of views each resource received collectively from students for the given week.

In each instance we examined the quality of the students’ application, which required them to write two or three paragraphs describing what they learned [through interaction with content, either a web resource or text/video]. Instructions are below for one part of the content exploration.

Activity that focuses on one topic within the week’s content, the Influenza Epidemic.
The application portion of the learning activity above.

When compared to the other activities and discussion boards within the course, the Influenza Epidemic activity received the highest number of views. Upon closer analysis [with another report], we found that students made more than one visit to this activity and invested more time in it.

Next, we examined the quality and quantity of students’ responses to the various activities. It appeared students were spending time with the content in the activity above, and it was evident from reading their responses how effective the activity was. This is important: the numbers, or data alone, did not tell the complete story. It was the data in conjunction with the quality of students’ responses that allowed us to conclude that interactive activities with quality content, requiring students to engage with primary sources (in this instance) and produce a written product, yield the best results.

Conclusions from Analysis:

  • Students are more likely to engage with content when the activity involves interactive resources. The activity then becomes an effective pedagogical method when students are required to produce or create an output after reflection and analysis.
  • The number of views on discussion boards can be an indicator of the level of interest in a given topic. This differs from the number of postings, as some students may read the postings but not create a post of their own.
  • Identifying which activities not only engage students but also elicit quality analysis, in which students use higher-order thinking skills, is critical to designing and adapting online courses.
  • Reviewing reports in conjunction with student outputs is essential to gain a holistic perspective on student performance.
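The second conclusion above, that views differ from postings, is easy to quantify. As an illustrative sketch [the input sets are a hypothetical shape, built from whatever participation report your LMS provides], this computes the share of discussion-board viewers who never posted:

```python
def lurker_ratio(viewers, posters):
    """Share of discussion-board viewers who never posted.

    `viewers` and `posters` are sets of student IDs taken from an LMS
    participation report (hypothetical input shape); views without
    posts can still signal interest in a topic.
    """
    viewers, posters = set(viewers), set(posters)
    if not viewers:
        return 0.0
    return len(viewers - posters) / len(viewers)
```

In a small class like the eight-student course above, even one or two read-only viewers shift this ratio noticeably, so it is best read alongside the raw counts.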

The example I share here uses only a fraction of the tools and reports available to analyze and examine student behaviours. Investing time in learning how to use the tools most effectively is time well spent for the online course instructor; time that can lead to well-designed courses resulting in the application of critical thinking skills and positive learning outcomes.


Photo Credit: Magnifying Glass, by Rafael Anderson Gonzales Mendoza