Putting Learners in the Driver’s Seat With Learning Analytics

I read something disturbing this week in an Inside Higher Ed article, "Measuring Engagement with Books":

“The big buzz in higher ed is analytics,” said senior vice-president of marketing Cindy Clarke, of the e-textbook provider Course Smart. “Based on the issues there are with institutions around improving the return they’re getting on their investment in course materials, we realized we had a valuable data set that we could package up [emphasis added].” (Tilsley, 2012)

Coincidentally, last week's topic in the course I am taking, Current/Future State of Higher Education (#CFHE12), was learning analytics, the same topic this article refers to. It's a promising area of study and a 'hot' topic in higher education right now. Data on students' online behaviours, obtained by measuring clicks, keystrokes, time logged on, and the number of 'hits' [visits] on web pages, is collected and then compiled into 'meaningful' information.
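To make that concrete, here is a minimal sketch of how raw clickstream events might be compiled into per-student engagement metrics. The event names, fields and numbers are invented for illustration; no vendor's actual schema looks like this:

```python
from collections import defaultdict

# Hypothetical raw clickstream events: (student_id, event_type, seconds)
events = [
    ("s1", "page_view", 0), ("s1", "page_view", 0), ("s1", "session", 1200),
    ("s2", "page_view", 0), ("s2", "session", 300),
]

def compile_metrics(events):
    """Aggregate raw events into simple per-student engagement counts."""
    metrics = defaultdict(lambda: {"page_views": 0, "seconds_online": 0})
    for student, kind, seconds in events:
        if kind == "page_view":
            metrics[student]["page_views"] += 1
        elif kind == "session":
            metrics[student]["seconds_online"] += seconds
    return dict(metrics)

print(compile_metrics(events))
# {'s1': {'page_views': 2, 'seconds_online': 1200}, 's2': {'page_views': 1, 'seconds_online': 300}}
```

Counting is the easy part; whether counts like these say anything about engagement is exactly the question the post raises.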

Yet Course Smart’s [in my opinion] program is an example of learning analytics gone awry. The ‘packaging up’ as mentioned by Ms. Clarke refers to the program Course Smart developed with data on students’ reading patterns. The program looks at how students interact with the e-textbooks, the number of times a student views a page and for how long, highlights made, etc. Course Smart compiles this ‘data’ and sends a Student Engagement Report to professors.  Are these metrics a true measure of a student’s level of engagement?  It seems that student engagement covers a far broader scope than time spent reading a textbook.  Even if the report did provide meaningful indicators, how would an instructor actually use it to teach more effectively?

Analytics in higher education is considered by some to be a panacea for its woes. Yet it's complex, sensitive, and almost onerous given the abundance of student data that institutions collect. In this post I'll give an overview of the three areas of analytics (micro, meso and macro) to provide clarity and context, as explained by a guest presenter in a recent CFHE12 webinar, Simon Buckingham Shum, Associate Director of the Knowledge Media Institute, Open University, UK. I'll also share how learning analytics is used to help students learn, as discussed by educator Erik Duval, Professor at Katholieke Universiteit Leuven, Belgium, during another #CFHE12 webinar, and finally how educators can use analytics to help students take charge and ownership of their learning, essentially putting them in the driver's seat.

The Big Picture of Student Data
When we speak of learning analytics, we are at the ground floor of Big [academic] Data. Data analytics in academia has the potential to support decision makers, academic researchers, policy makers, instructors and students. Simon Shum described analytics as operating at three layers: macro, meso and micro.

What Questions Should Data Answer?
At the macro level, institutions share data with one another, compare results, and determine what is useful to influence and support decisions on educational funding models and policies, and for international comparisons. At the institutional [meso] level, analysis includes examining student progress and related data to make programming decisions, identifying at-risk students by predicting student performance, and making curricular decisions. Shum's approach to analysis is holistic; he puts forth questions that educators should be asking when making decisions about how to use data effectively at the meso and micro levels:

  • What are we measuring and why?
  • What problem are we trying to solve with the data?
  • What level of results should we share with the learner?
  • What are the ethical considerations?
  • How can we create a functioning ecosystem that uses data effectively and responsibly?
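As a deliberately simplistic illustration of the meso-level analysis described above, identifying at-risk students can begin with threshold rules. The field names and cutoffs here are invented for this sketch; real predictive models are far more involved:

```python
def flag_at_risk(students, min_logins=3, min_score=60):
    """Flag students for early intervention using simple threshold
    rules, as a meso-level report might (thresholds are invented)."""
    flagged = []
    for s in students:
        if s["weekly_logins"] < min_logins or s["avg_score"] < min_score:
            flagged.append(s["name"])
    return flagged

# Invented example data
students = [
    {"name": "Ana", "weekly_logins": 5, "avg_score": 78},
    {"name": "Ben", "weekly_logins": 1, "avg_score": 82},
    {"name": "Cal", "weekly_logins": 4, "avg_score": 52},
]
print(flag_at_risk(students))  # ['Ben', 'Cal']
```

Even a rule this crude forces Shum's questions: why these measures, what problem the flag solves, and whether the learner should see it.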
[Image: Dashboard of student progress displayed throughout a course. Designed by students in a research project with Duval.]

Analytics in Real Time – Helping Students
Duval is not only a professor but also chair of a research unit on human-computer interaction, and his research focuses on how data can provide valuable feedback to the learner. He gathers and analyzes students' online behaviour patterns as they relate to their learning, and he shares everything with his students: there is 100% transparency. Information for the learner comes in several forms, one of which is a dashboard that provides a snapshot view of how the student is doing at a given point in the class. This view is designed to trigger self-reflection: learners can view their progress and compare their performance with others in the class. When asked how students respond to viewing others' performance, Duval says he spends considerable time at the beginning of the class explaining, discussing and reviewing the purpose of the reports, how to use them, and what they mean for students' overall learning.
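A minimal sketch of the kind of comparison such a dashboard might surface; the numbers and field names are invented, and Duval's actual tools are far richer:

```python
def dashboard_snapshot(student_score, class_scores):
    """Compare one learner's progress with the class, as a simple
    dashboard view might (all data here is invented)."""
    avg = sum(class_scores) / len(class_scores)
    return {
        "your_progress": student_score,
        "class_average": round(avg, 1),
        "above_average": student_score > avg,
    }

# Invented example: percent of course activities completed
snapshot = dashboard_snapshot(72, [55, 80, 72, 64, 90])
print(snapshot)  # {'your_progress': 72, 'class_average': 72.2, 'above_average': False}
```

The point of a view like this is not the arithmetic but the self-reflection it triggers, which is why Duval spends so much class time on what the numbers mean.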

This approach also works in 'real time', which makes it actionable: students [and instructors] have access to feedback as the course progresses. Students can adjust, make decisions, and take action as needed. Instructors can reach out to struggling students and 'intervene' with resources and support. Duval describes this as putting the learner in the driver's seat, which was also the name of a recent conference on learning analytics held in Belgium.

How can Instructors use Learning Analytics at the Ground Level?
There are questions and concerns about how much data students should have access to; with FERPA guidelines and privacy issues at stake, educators must tread carefully. That being said, we can begin by asking the right questions: What do we want students to achieve, and how can data help them? What do we need to do to educate students about the data and how they can use it?

What can Educators do?

  • Identify what tools are available within your learning management system for data analysis. In previous posts I've included YouTube videos that provide instruction on how to use LMS tools, along with other strategies for analytics. See the resources section for the links.
  • Be part of the discussion with faculty and administrators about learner analytics – ask questions, focus on ‘why’.
  • If presented with analytical tools or reports, determine how they can be used to support learners through instruction or intervention.
  • Become familiar with how analytics can help instructors be more effective in helping students learn.
  • Review programs based upon analytics that are used at other institutions: Purdue's Course Signals Program, the University of Michigan Academics Analysis Program, and the Community College at Rhode Island program, Connectedu.

Closing thoughts
Learning analytics has tremendous potential for education, though I am cautiously optimistic about its use in higher education. I am far from an expert, but I see the value in giving students ownership of their learning through the tools analytics provides, dashboards similar to Duval's, for example. We need to involve the student in this conversation: it's not the data that's the solution to the challenges higher education is facing, it's the students. Let's put them in charge and give them the tools to make the decisions that make learning meaningful; let's put them in the driver's seat.


Resources

Conference, Learners in the Driver's Seat, Belgium
How Instructors Can Use Analytics to Support Student Engagement, Online Learning Insights
SoLAR, Society for Learning and Analytics Research
Learning and Knowledge Analytics, Resources
Engage: Test by ACT to predict student success in college
Erik Duval’s Slideshare, Presentations
Understanding Learning Analytics and Student Data, MindShift

6 thoughts on "Putting Learners in the Driver's Seat With Learning Analytics"

  1. I think you and Erik Duval have it right that analytics are useful for informing learners about their learning performance. Self-regulated learning can be supported by providing such feedback. Data from clicks, keystrokes, and time logged in, to my mind, are relatively unimportant; they can be influenced by so many factors that their interpretation is subject to doubt. The other thing analytics can be useful for, however, is the formative evaluation and revision of the instruction. How well does the instruction for a given objective result in positive learning outcomes? How well do alternate forms of the instruction improve learning performance in areas of instructional weakness? These are just the starting point for analyzing learning outcomes. Moreover, when an organization’s learning management system data is integrated with data from its performance management system, as I intend to observe during the next few months, transfer of learning from training to the workplace should inform the effectiveness of instructional examples, activities, and assessment items relative to on-the-job performance targets. I wonder if measures of performance in the higher education realm could be integrated similarly, such as retention, completion, and subsequent employment.


    1. Hi David,
      Higher education institutions generate volumes of data, yet the challenge is determining what to measure and why. I think your idea of using analytics to evaluate the effectiveness of instruction in a formative model is important; it allows the instructor to adapt to the learners. And I think that summative evaluation can be effective for course design and instructional methods in terms of measuring the impact on student performance, or outcomes achieved, though the challenge here is identifying what data one would use for measuring student performance. Would it be student grades, course completion rates or course satisfaction surveys? Or could it be all three? I am not familiar enough with this area to know how this would work, yet I know one of the challenges in higher ed institutions now is: how do we measure student performance? Is traditional testing a true measure of student learning? Complex! I don’t know the answers to this. I’d be interested to follow your findings on workplace training and transfer. Thanks for commenting, David! Debbie


  2. I am currently enrolled in a course on learning theories, and in it I have been studying the importance of metacognition. Anyone who facilitates learning, whether as an educator, trainer, instructional designer or manager, needs to teach learners how to be more actively involved in their own learning so they can achieve a higher level of thinking. The use of analytic tools, in the right context and with support and guidance, seems as though it would be valuable to the learner as well as the facilitator.

    I found the materials you have presented quite interesting, as I was unaware of the use of analytic tools in an academic environment. I work in the corporate sector, and we use different analytic tools to measure performance metrics. Oftentimes the tools are not used as effectively as they could be for coaching employees on how to improve. The information you presented on what educators can do seems beneficial in the context of an academic environment, but I appreciate that I can also apply some of it in reflecting on how to better utilize the tools in the environment I am working in right now.


    1. Hi Amber: It is interesting how the various sectors approach analytics given the volumes of data that are now being accumulated. One of the problems in any sector, as you mentioned, is that some of the information generated is of little, or questionable, use. IBM, traditionally a company that deals with business entities, is becoming involved in the higher ed area given the similarities in data management. Glad you found the article helpful in some way. I enjoyed reading your viewpoint on academic analytics from the perspective of someone in the business environment. Debbie

