Learning Analytics for Instructors Need to be Loud and Clear

Learning analytics… less data, more insight. Analytics' primary task is not to report the past, but to help find the optimal path to the desired future. (Shum, 2012)

Learning analytics, the analysis of students' online behaviour patterns to support learning improvement, is not about collecting data but about helping learners and instructors make connections with that data. This past week I attended a Campus Technology webinar, Grand Canyon University: How We Are Improving Student Outcomes Using LoudAnalytics on the LoudCloud Ecosystem. Grand Canyon University of Arizona shared results from its learning analytics pilot project using LoudAnalytics from LoudCloud, a company that presents itself as a learning ecosystem, the next generation of learning management systems. In this post I identify which analytic reports are essential and most useful to course instructors, which are not, and why. My findings are drawn from the webinar and from content in week four of the course Current/Future State of Higher Education.

'Meaningful' Data for the Instructor
I wrote a post last week about how student data, gathered from online behaviours on a school's platform, can put the learner in the 'driver's seat', essentially in control of his or her own learning. A dashboard that gives real-time information on a student's 'stats' can be a visual tool that helps learners reach their goals, identify problems and stay motivated. But what about the course instructor? What analytic tools are available through the LMS platform that provide meaningful data, data that is consumable, in a usable form that encourages instructors to take action in real time?

Grand Canyon University Webinar,  Slide #14

To the left is an example of a report from LoudAnalytics that displays data about students' progress in a visual format. Students are represented by circles; the size of each circle corresponds to the hours spent on the course home page (interacting with course content, etc.), and its colour to a letter grade. I see this as a holistic 'snapshot' of students' progress, but I don't see this report, on its own, as providing 'actionable' data. Time spent within the LMS does not always translate to grades or engagement; it is just one metric.

Grand Canyon University Webinar, Slide #47

The report to the right, however, does appear to provide constructive data for the course instructor. Considering this report alongside the previous one, the instructor can actually do something with it. For example, upon review the instructor might want to reach out to student #2 (and potentially one or two others) with an email that might read like this:

“Dear [name of student], it appears that you have an assignment outstanding, and have not participated in a forum recently. I am concerned about your progress in the class. There are several resources available for support…”

There are limitations to the scenario I've described here; it is one-dimensional, given that we don't have complete information. But the idea is that the indicators in this report are specific about student actions, or non-actions, and give the instructor something to work with.
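To make the idea concrete, here is a minimal sketch of the kind of rule such a report embodies: flag students who have an overdue assignment and have been quiet in the forums. The field names and thresholds are invented for illustration; no actual LoudAnalytics data model is implied.

```python
from datetime import date

# Hypothetical sketch of an 'actionable' flagging rule.
# Field names, roster data and the 7-day threshold are all invented;
# they do not reflect any actual LoudCloud/LoudAnalytics report.

def needs_outreach(student, today, forum_gap_days=7):
    """Flag a student with an overdue assignment and a quiet forum record."""
    overdue = any(a["due"] < today and not a["submitted"]
                  for a in student["assignments"])
    days_quiet = (today - student["last_forum_post"]).days
    return overdue and days_quiet >= forum_gap_days

roster = [
    {"name": "Student 2",
     "assignments": [{"due": date(2012, 11, 5), "submitted": False}],
     "last_forum_post": date(2012, 10, 20)},
    {"name": "Student 7",
     "assignments": [{"due": date(2012, 11, 5), "submitted": True}],
     "last_forum_post": date(2012, 11, 12)},
]

flagged = [s["name"] for s in roster
           if needs_outreach(s, today=date(2012, 11, 14))]
# flagged == ["Student 2"]
```

The point of the sketch is that each input (missed assignment, days since last forum post) maps directly to a sentence the instructor could write in an outreach email, which is what makes the data actionable.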

What Data is NOT Helpful
It is information about student actions, i.e. missing assignments, non-participation in discussion forums, low test grades, that is valuable for instructors; this is what I call 'actionable' data. Other data, such as the number of times a student logged on to the course home page, or the number of minutes spent within the platform, is not meaningful or of much practical use. I suggest that platform providers (e.g. Moodle, LoudCloud) consider generating reports that are focused and specific to users' needs (users falling into three groups: students, instructors and administrators). However, making too many reports available detracts from the value of the analytics. For example, the report below shows the time in minutes a student spent within the LoudCloud system, which gives a snapshot of student behaviour, but I don't see how this information is useful to the instructor. Perhaps it might be, if considered in conjunction with other reports, but then we get into data overload.

Grand Canyon University Webinar, Slide #48

Furthermore, just because we can measure something doesn't mean it is valuable or even useful. Another example is the program that CourseSmart, the e-textbook provider, is launching to give course instructors reports on student engagement. I wrote about this last week, but I use it again here as an example of how reports can be built from data that ends up being inconsequential.

It [CourseSmart's program] will track students' behavior: how much time they spend reading, how many pages they view, and how many notes and highlights they make. That data will get crunched into an engagement score for each student. The idea is that faculty members can reach out to students showing low engagement (Parry, 2012).

I have a hard time imagining how instructors will use this information. The problem from the get-go is that CourseSmart assumes that student engagement is defined by the number of electronic 'notes' made in the e-book and how long the student spends 'reading' the textbook. Not only is this logic flawed, but as one of my readers pointed out, it has a 'big brother' feel to it. I agree, and I will be writing about the ethics of learning analytics next week.
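A score of the kind Parry describes could be little more than a weighted sum of surface metrics. The sketch below (weights and inputs are entirely invented, not CourseSmart's actual formula) shows why that logic is thin: a student who simply leaves the e-book open can outscore one who reads closely and takes notes.

```python
# Hypothetical sketch of a surface-metric "engagement score".
# Weights and inputs are invented for illustration; the point is that
# the score conflates time-on-page with actual engagement.

def engagement_score(minutes_reading, pages_viewed, annotations):
    return 0.5 * minutes_reading + 0.3 * pages_viewed + 2.0 * annotations

# A student who leaves the e-book open while doing something else...
idle_reader = engagement_score(minutes_reading=180, pages_viewed=12,
                               annotations=0)

# ...outscores one who reads a chapter closely and annotates it.
close_reader = engagement_score(minutes_reading=45, pages_viewed=30,
                                annotations=8)
```

Whatever the real weights are, any score built only from these inputs inherits the same weakness: it measures presence, not learning.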

Closing Thoughts
Learning analytics can be a powerful tool for instructors, but only when meaningful data is compiled in a way that is user-friendly, relevant and actionable; in other words, reports must be loud and clear. LoudCloud is onto something here, and I very much like their visual presentation. Yet LoudCloud and other LMS providers need to narrow down the number of analytic reports they make available, customizing what they offer to users' needs. Make it clear, specific and meaningful.

Next post: Dream or Nightmare: The Ethics of Learning Analytics, Online Learning Insights

Grand Canyon University: How We Are Improving Student Outcomes Using LoudAnalytics on the LoudCloud Ecosystem (November 13, 2012), Campus Technology webinar (now on demand)

Buckingham Shum, S. (2012), LT-C2012 Learning Analytics Symposium, Slideshare
Introduction to Learning and Knowledge Analytics Syllabus, (2011), An Open Course
Putting Learners in the Driver’s Seat, Online Learning Insights