Learning Analytics for Instructors Need to be Loud and Clear

Learning Analytics… less data, more insight. Analytics’ primary task is not to report the past, but to help find the optimal path to the desired future. (Shum, 2012)

Learning analytics, the practice of analyzing students’ online behaviour patterns to support learning improvement, is not about data collection but about helping learners and instructors make connections with the data. This past week I attended a Campus Technology webinar, Grand Canyon University: How we are improving student outcomes using LoudAnalytics on the LoudCloud Ecosystem. Grand Canyon University of Arizona shared results from its learning analytics pilot project using LoudAnalytics from LoudCloud, a company that presents itself as a learning ecosystem, the next generation of learning management systems. In this post I identify which analytic reports are essential and most useful to course instructors, which are not, and why. I gathered these findings from the webinar and from week four of the course Current/Future State of Higher Education.

‘Meaningful’ Data for the Instructor
I wrote a post last week about how student data gathered from online behaviours on a school’s platform can put the learner in the ‘driver’s seat’, essentially in control of his or her own learning. A dashboard that gives real-time information on a student’s ‘stats’ can be a visual tool that helps learners reach their goals, identify problems, and stay motivated. But what about the course instructor? What analytic tools are available through the LMS platform that provide meaningful, consumable data, data in a usable form that encourages instructors to take action in real time?

Grand Canyon University Webinar, Slide #14

To the left is an example of a report from LoudAnalytics that displays data about students’ progress in a visual format. Students are represented by circles; the size of each circle represents the hours spent on the course home page (interacting with course content, etc.), and the colour represents a letter grade. I see this as a holistic ‘snapshot’ of students’ progress, but I don’t see this report on its own as providing ‘actionable’ data. Time spent within the LMS does not always translate to grades or engagement level; it is just one metric.

Grand Canyon University Webinar, Slide #47

The report to the right, however, does appear to provide constructive data for the course instructor. Considered alongside the previous report, it gives the instructor something to act on. For example, upon review the instructor might reach out to student #2 (and potentially one or two others) with an email that might read like this:

“Dear [name of student], it appears that you have an assignment outstanding and have not participated in a forum recently. I am concerned about your progress in the class. There are several resources available for support …”

There are limitations to the scenario I’ve described; it is one-dimensional, given that we don’t have complete information. But the indicators provided in this report are specific about student actions, or non-actions, and give the instructor something to work with.
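The trigger behind such an outreach email could be sketched as a simple rule over exported activity records. The field names, sample data, and the two-week forum threshold below are my own illustrative assumptions, not LoudAnalytics’ actual schema:

```python
from datetime import date

# Hypothetical per-student activity records an LMS report might export.
# The field names here are illustrative, not a real LoudCloud schema.
students = [
    {"name": "Student 1", "missing_assignments": 0, "last_forum_post": date(2012, 11, 10)},
    {"name": "Student 2", "missing_assignments": 1, "last_forum_post": date(2012, 10, 20)},
    {"name": "Student 3", "missing_assignments": 0, "last_forum_post": date(2012, 11, 12)},
]

def flag_for_outreach(records, today, forum_gap_days=14):
    """Return students with an outstanding assignment or no recent forum activity."""
    flagged = []
    for s in records:
        stale_forum = (today - s["last_forum_post"]).days > forum_gap_days
        if s["missing_assignments"] > 0 or stale_forum:
            flagged.append(s["name"])
    return flagged

print(flag_for_outreach(students, today=date(2012, 11, 14)))  # → ['Student 2']
```

The point is not the code itself but that both inputs are specific student actions (or non-actions), so every name the rule returns comes with a concrete reason the instructor can mention in the email.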

What Data is NOT Helpful
Information about student actions, i.e. missing assignments, non-participation in discussion forums, and low test grades, is what is valuable for instructors: what I call ‘actionable’ data. Other data, such as the number of times a student logged on to the course home page or the number of minutes spent within the platform, is not meaningful or of much practical use. I suggest that platform providers (e.g. Moodle, LoudCloud) generate reports that are focused and specific to users’ needs (users falling into three groups: student, instructor, and administrator). Making too many reports available, however, detracts from the value of the analytics. For example, the report below shows the time in minutes a student spent within the LoudCloud system, which gives a snapshot of student behaviour, but I don’t see how this information is useful to the instructor. Perhaps it might be when considered in conjunction with other reports, but then we get into data overload.

Grand Canyon University Webinar, Slide #48

Furthermore, just because we can measure something doesn’t mean it is valuable or even useful. Another example is the program that Course Smart, the e-textbook provider, is launching to give course instructors reports on student engagement. I wrote about this last week, but I use it again here as an example of how reports can be created from data that end up being inconsequential.

It [Course Smart’s program] will track students’ behavior: how much time they spend reading, how many pages they view, and how many notes and highlights they make. That data will get crunched into an engagement score for each student. The idea is that faculty members can reach out to students showing low engagement (Parry, 2012).

I have a hard time imagining how instructors will use this information. The problem from the get-go is that Course Smart assumes that student engagement is defined by the number of electronic ‘notes’ made in the e-book and how long the student spends ‘reading’ the textbook. Not only is this logic flawed, but as one of my readers pointed out, it has a ‘big brother’ feel to it. I agree, and I will be writing about the ethics of learning analytics next week.

Closing Thoughts
Learning analytics can be a powerful tool for instructors, but only when meaningful data is compiled in a way that is user-friendly, relevant, and actionable; in other words, reports must be loud and clear. LoudCloud is onto something here, and I very much like their visual presentation. Yet LoudCloud and other LMS providers need to narrow down the number of analytic reports made available, customizing what they offer to users’ needs. Make it clear, specific, and meaningful.

Next post: Dream or Nightmare: The Ethics of Learning Analytics, Online Learning Insights

Resources:
Grand Canyon University: How we are improving student outcomes using LoudAnalytics on the LoudCloud Ecosystem (November 13, 2012). Campus Technology Webinar (now on demand)

LT-C2012 Learning Analytics Symposium (2012), Simon Buckingham Shum, Slideshare
Introduction to Learning and Knowledge Analytics Syllabus, (2011), An Open Course
Putting Learners in the Driver’s Seat, Online Learning Insights

6 thoughts on “Learning Analytics for Instructors Need to be Loud and Clear”

  1. Paul Capicik

    Some of the metrics mentioned are close but don’t take the next step that could make them of value. Take, for instance, two items mentioned: highlighting and note taking. I agree that those two things by themselves are not necessarily of value. But take that one step further: have the instructor highlight and take notes on what s/he thinks is important. Then have the LMS compare what the student is doing against what the instructor did; now you have something of value (unless the student highlights everything).

    If during a certain course block the LMS alerts the instructor that the student’s highlighting and note keywords do not match what the instructor is focusing on, the instructor can intervene to get the student on track with what is important. In the absence of the automated capability, instructors could look at student e-book engagement either before or after a test (if the student gives permission to view those annotations) and work with those students having difficulty. Or if most of the students are having difficulty with certain subject matter, the instructor might need to modify their teaching to focus on the important points.
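    The comparison Paul describes could be sketched along these lines; the function names, sample keywords, and the 50% overlap threshold are illustrative assumptions, not a feature any LMS currently offers:

```python
# A minimal sketch of matching a student's highlight/note keywords
# against the instructor's. All names and thresholds are hypothetical.
def keyword_overlap(instructor_keywords, student_keywords):
    """Fraction of the instructor's keywords the student also marked."""
    instructor = {k.lower() for k in instructor_keywords}
    student = {k.lower() for k in student_keywords}
    if not instructor:
        return 0.0
    return len(instructor & student) / len(instructor)

def needs_intervention(instructor_keywords, student_keywords, threshold=0.5):
    """Alert the instructor when the student's focus diverges too far."""
    return keyword_overlap(instructor_keywords, student_keywords) < threshold

instructor = ["mitosis", "meiosis", "chromosome", "cell cycle"]
student = ["mitosis", "organelle"]
print(needs_intervention(instructor, student))  # overlap is 0.25, so True
```

    As Paul notes, a student who highlights everything would score a perfect overlap, so a real implementation would also need an upper bound on how much of the text a student marks.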

    It is great to see groups working on this part of the learning equation.

    1. onlinelearninginsights Post author

      Hi Paul,

      You present an interesting idea, which I see as having the potential to be of great value to the instructor. Your example illustrates the pivotal role of the instructor. In isolation the reports and indicators have little meaning and value, but when used as tools by the instructor, then we are getting somewhere, as you pointed out. Your post also highlights another key point: instructors need adequate training and development in how to use these reports effectively, how to discern what is important, and what resources are available to the student and instructor for support. Knowing there is a problem (poor student performance) is the first step; taking action to fix it is the next.

      These are good conversations to have! Thank you Paul for sharing your ideas and generating discussion. I hope that other educators are finding value in considering and pondering these issues in learning analytics.

      Debbie

  2. Pingback: Debbie Morrison: Learning Analytics for Instructors Need to be Loud and Clear | Doc D Says…

  3. Ilan Elson-Schwab

    Yes totally agree. “Focusing on the customer” and the customer/user journey are really key to making useful products in any industry. This problem has to be approached from the direction of “whose problem is being solved”, and not “how much data can I show off”.

    1. onlinelearninginsights Post author

      Ilan, absolutely, and yet companies try to be ‘everything to everyone’ by offering as much data (or product) as possible in an effort to show ‘value’. Yet value is only defined by the customers’ needs, as you suggested – ‘problems’ that need to be solved by each will vary. The challenge for LMS providers I would imagine, is the cost involved in customizing reports for each institution, by user group. Perhaps one way to address this would be for specific training to be offered to each user group on which reports to use and why. Thanks for your comment. Debbie

