Putting Learners in the Driver’s Seat With Learning Analytics

I read something disturbing this week in Inside Higher Ed's article, Measuring Engagement with Books:

“The big buzz in higher ed is analytics,” said senior vice-president of marketing Cindy Clarke, of the e-textbook provider Course Smart. “Based on the issues there are with institutions around improving the return they’re getting on their investment in course materials, we realized we had a valuable data set that we could package up [emphasis added].” (Tilsley, 2012)

Coincidentally, last week's topic in the course I am taking, Current/Future State of Higher Education (#CFHE12), was learning analytics, the same topic this article refers to. It's a promising area of study and a 'hot' topic in higher education right now. Data on students' online behaviours, obtained by measuring clicks, keystrokes, time logged on, and the number of 'hits' [visits] on web pages, is collected and then compiled into 'meaningful' information.

Yet, in my opinion, Course Smart's program is an example of learning analytics gone awry. The 'packaging up' Ms. Clarke mentions refers to the program Course Smart developed from data on students' reading patterns. The program looks at how students interact with their e-textbooks: the number of times a student views a page and for how long, highlights made, and so on. Course Smart compiles this data and sends a Student Engagement Report to professors. Are these metrics a true measure of a student's level of engagement? Student engagement surely covers a far broader scope than time spent reading a textbook. And even if the report did provide meaningful indicators, how would an instructor actually use it to teach more effectively?
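To make the limitation concrete, here is a minimal sketch of what such an 'engagement' metric might look like under the hood. The field names and weights are entirely my own invention, not Course Smart's actual method; the point is that any weighted count of clicks and minutes can assign identical scores to very different readers.

```python
# Hypothetical e-textbook "engagement" metric, for illustration only.
# The fields and weights below are invented; they are NOT Course Smart's
# actual formula. The point: raw interaction counts are a narrow proxy.

def engagement_score(pages_viewed, minutes_reading, highlights):
    """Collapse raw interaction counts into a single number."""
    return pages_viewed * 1.0 + minutes_reading * 0.5 + highlights * 2.0

# Two very different students can produce the same score:
skimmer = engagement_score(pages_viewed=120, minutes_reading=30, highlights=0)
careful_reader = engagement_score(pages_viewed=46, minutes_reading=150, highlights=7)

print(skimmer, careful_reader)  # both come out to 135.0
```

The skimmer who flips through 120 pages in half an hour and the careful reader who spends two and a half hours on 46 pages receive identical scores, which is exactly why a number like this tells an instructor so little about actual engagement.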

Analytics in higher education is considered by some to be a panacea for its woes. Yet it's complex, sensitive, and almost onerous given the abundance of student data institutions collect. In this post I'll give an overview of the three levels of analytics (macro, meso and micro) to provide clarity and context, as explained by a guest presenter in a recent #CFHE12 webinar, Simon Buckingham Shum, Associate Director of the Knowledge Media Institute, Open University, UK. I'll also share how learning analytics is used to help students learn, as discussed by educator Erik Duval, Professor at Katholieke Universiteit Leuven, Belgium, during another #CFHE12 webinar, and finally how educators can use analytics to help students take charge and ownership of their learning; essentially, to put them in the driver's seat.

The Big Picture of Student Data
When we speak of learning analytics, we are at the ground floor of Big [academic] Data. Data analytics in academia has the potential to support decision makers, academic researchers, policy makers, instructors and students. Simon Buckingham Shum described the layers of analytics this way:

What Questions Should Data Answer?
At the macro level, institutions share data with others and compare it to determine what is useful to influence and support decisions on educational funding models, policies, and international comparisons. At the institutional [meso] level, analysis includes examining student progress and related data to make programming decisions, identifying at-risk students by predicting student performance, and making curricular decisions. Buckingham Shum's approach to analysis is holistic; he puts forth questions that educators should be asking when deciding how to use data effectively at the meso and micro levels:

  • What are we measuring and why?
  • What problem are we trying to solve with the data?
  • What level of results should we share with the learner?
  • What are the ethical considerations?
  • How can we create a functioning ecosystem that uses data effectively and responsibly?
[Image: Dashboard of student progress displayed throughout a course, designed by students in a research project with Duval.]

Analytics in Real Time – Helping Students
Duval is not only a professor but also chair of a research unit on human-computer interaction, and his research focuses on how data can provide valuable feedback to the learner. He gathers and analyzes students' online behaviour patterns as they relate to learning, and he shares the results with his students; there is 100% transparency. Information for the learner comes in several forms, one of which is a dashboard that provides a snapshot of how the student is doing at a given point in the class. This view is designed to trigger self-reflection: learners can view their progress and compare their performance with others in the class. When asked how students respond to viewing others' performance, Duval says he spends considerable time at the beginning of the class explaining, discussing and reviewing the purpose of the reports, how to use them, and what they mean for students' overall learning.
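The kind of self-comparison such a dashboard enables can be sketched in a few lines. This is a hypothetical illustration, not Duval's actual implementation; the student names, activity figures, and field names are all invented.

```python
# Hypothetical sketch of a per-student dashboard summary that lets a
# learner compare their own progress with the class. The data and field
# names are invented for illustration; this is not Duval's system.

from statistics import mean

# Hours of logged course activity per student so far (invented data).
activity = {"anna": 14.0, "ben": 9.5, "chloe": 6.0, "dara": 12.5}

def dashboard_row(student):
    """Return one student's hours alongside the class average,
    supporting the kind of self-reflection a dashboard aims to trigger."""
    hours = activity[student]
    avg = mean(activity.values())
    return {"student": student,
            "hours": hours,
            "class_avg": round(avg, 1),
            "above_avg": hours >= avg}

row = dashboard_row("chloe")
print(row)
```

Here a struggling student sees her 6 hours against the class average of 10.5, which is actionable information for her, and the same flag (`above_avg`) could prompt an instructor to reach out with support.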

This approach also works in 'real time', which makes it actionable: students [and instructors] have access to feedback as the course progresses. Students can adjust, make decisions, and take action as needed. Instructors can reach out to struggling students and 'intervene' with resources and support. Duval describes this as putting the learner in the driver's seat, which was also the name of a recent conference on learning analytics held in Belgium.

How can Instructors use Learning Analytics at the Ground Level?
There are questions and concerns about how much data students should have access to; given FERPA guidelines and privacy issues, educators must tread carefully. That being said, we can begin by asking the right questions: What do we want students to achieve, and how can data help them? What do we need to do to educate students about the data and how they can use it?

What can Educators do?

  • Identify what tools are available within your learning management system for data analysis. In previous posts I've included YouTube videos that provide instruction on how to use LMS tools, along with other strategies for analytics. See the resources section for the links.
  • Be part of the discussion with faculty and administrators about learner analytics – ask questions, focus on ‘why’.
  • If presented with analytical tools or reports, determine how they can be used to support learners through instruction or intervention.
  • Become familiar with how analytics can help instructors be more effective in helping students learn.
  • Review programs based upon analytics in use at other institutions: Purdue's Course Signals program, the University of Michigan's academic analytics program, and the Community College of Rhode Island's program with ConnectEDU.

Closing thoughts
Learning analytics has tremendous potential for education, though I am cautiously optimistic about its use in higher education. I am far from an expert, but I see the value in giving students ownership of their learning through the tools analytics provides, such as dashboards like Duval's. We need to involve the student in this conversation; it's not the data that's the solution to the challenges higher education is facing, it's the students. Let's put them in charge, give them the tools to make decisions that make learning meaningful, and put them in the driver's seat.


Resources
Conference, Learners in the Driver's Seat, Belgium
How Instructors Can Use Analytics to Support Student Engagement, Online Learning Insights
SoLAR, Society for Learning Analytics Research
Learning and Knowledge Analytics, Resources
Engage: Test by ACT to predict student success in college
Erik Duval’s Slideshare, Presentations
Understanding Learning Analytics and Student Data, MindShift