Queen Mary Academy

Learner analytics and progress


  Learning is individualised, and our pedagogic development meets the needs of our diverse learners.

(QMUL Strategy 2030)

‘Every time a student interacts with their university – be that going to the library, logging into their virtual learning environment or submitting assessments online – they leave behind a digital footprint’ (Sclater, Peasgood and Mullan 2016, p.4)


Learning Analytics (LA) refers to a wide set of techniques for gathering, storing and reporting data for administrative, programmatic and pedagogical purposes. The process involves using data about learners' progress, and the contexts in which learning takes place, to improve learning and teaching. This data ranges from high-level measures of retention and student progress, through course tool usage, to very granular, personalised, student-specific data.


Types of analytics

1. Student progress (high-level data, generally found at the institution level; the most traditional and mature data gathering and reporting in higher education). Sample sources: SIS, Registrar and other institutional data stores

2. Student behaviour (mid-level data, generally focused on student achievement, such as course activity and grades). Sample sources: SIS, web server logs, LMS (course tool activity), data collected from tools such as clickers

3. Student learning (fine-grained, application-specific data, recording learning events in real time; combines psychometric techniques, LMS data and adaptive systems such as intelligent agents, and is focused on personalised student learning). Sample sources: games, simulations, intelligent agents

Introducing external linked data, such as demographic or social media sources, can augment each of these approaches.


LA: strengths, weaknesses, opportunities and threats

(Papamitsiou & Economides 2014)

Strengths: large volumes of available educational data, the ability to use powerful, pre-existing algorithms, the availability of multiple visualisations for staff and students, increasingly precise models for the adaptation and personalisation of learning, and growing insight into learning strategies and behaviours

Weaknesses: potential misinterpretation of the data, a lack of coherence in the sheer variety of data sources, a lack of significant results from qualitative research, overly complex systems and information overload

Opportunities: using open linked data to help increase compatibility across systems, improving self-reflection, self-awareness and learning through intelligent systems, and feeding learning analytics results into other systems to support decision making

Threats: ethical and data privacy issues, “over-analysis” and the lack of generalisability of the results, possibilities for misclassification of patterns, and contradictory findings

According to Gibson and Clarke (2018, p.5), LA can make significant contributions as:

  1. a tool for quality assurance and quality improvement
  2. a tool for boosting retention rates
  3. a tool for assessing and acting upon differential outcomes among the student population
  4. an enabler for the development and introduction of adaptive learning

LA can provide students with an opportunity to take control of their own learning, give them a better idea of their current performance in real-time and help them to make informed choices about what to study.


Using LA to develop pedagogical approaches

Learning analytics involves the use of a broad range of data, from student information systems and learning management systems to task-specific learning tools. The reporting from this data is both descriptive and predictive, and leverages a wide variety of tools, from enterprise business intelligence platforms to custom algorithms.  Analysis techniques include: developing metrics (such as predictors and indicators for various factors) to understand the current situation and to measure teaching and learning effectiveness; using different educational technologies to visualise and interpret data and to prompt remedial actions; and refining those metrics and deriving interventions to shape the learning environment (Gibson and Clarke 2018).
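As a hedged illustration of the kind of descriptive metric mentioned above, the sketch below counts VLE log events per student per ISO week. The event records, identifiers and structure are invented for illustration and do not reflect any actual QMplus export format.

```python
from collections import Counter
from datetime import date

# Hypothetical export of VLE log events: (student_id, date, action).
events = [
    ("s1", date(2024, 1, 8), "viewed"),
    ("s1", date(2024, 1, 9), "submitted"),
    ("s2", date(2024, 1, 8), "viewed"),
]

# A simple descriptive indicator: number of logged events
# per student per ISO calendar week.
weekly_activity = Counter(
    (student, day.isocalendar()[1]) for student, day, _ in events
)
```

An indicator like this could then feed a visualisation or a threshold-based alert, as the techniques above describe.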

In terms of the daily work of academics, when developing courses or learning materials, it is important to obtain evidence about how useful particular aspects of the course are to learners. The Learning Analytics - Learning Design (LA-LD) Framework (Gunn et al 2017) is a tool designed to help teachers to consider what data they require from learning analytics at different points in the teaching cycle: it seeks to anchor learning analytics data in real-life teaching practice.

Learning Analytics - Learning Design (LA-LD) Framework (Gunn et al 2017) 



Review & Evaluate: measuring and monitoring student engagement

(Learner analytics online - VLE platform: QMplus)

Measuring and monitoring student engagement and participation can appear much more difficult online; we cannot simply look around a lecture theatre to see who is there, or walk around a seminar room to observe student discussions or group activities.  However, within the asynchronous environment there is a range of tools and analytics to let you monitor which students are accessing, downloading, viewing or completing activities.  This can help you identify students who are engaging or participating less, so that you can follow up with them outside of the virtual learning environment, for example by email.
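One way such a follow-up list might be produced is sketched below. This is a minimal illustration only: the email addresses, dates and 14-day threshold are invented, and in practice the access data would come from the VLE's own reports rather than hand-written code.

```python
from datetime import date, timedelta

# Hypothetical record of each enrolled student's most recent course access.
last_access = {
    "alice@example.ac.uk": date(2024, 2, 1),
    "bob@example.ac.uk": date(2024, 1, 5),
}
today = date(2024, 2, 3)

# Flag anyone with no activity in the last 14 days for an email follow-up.
inactive = sorted(
    student for student, seen in last_access.items()
    if today - seen > timedelta(days=14)
)
```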

Building formative assessment opportunities into the course design also allows you to monitor engagement and check student learning.  This can take many varied forms such as quizzes, submission of draft assignments, presentations (live or pre-recorded) or group work.


Using activity completion in QMplus

QMplus provides a number of features which allow you to guide students through a module and to monitor their progress. At the heart of many of these features is activity completion. This allows you to define completion criteria for particular activities, with the activity automatically being marked as completed when those criteria are met. It is also possible to set up activities where students manually mark the activity as completed.

Once you have set up activity completion, you can use the activity completion report on your course area to get an overview of how your students are engaging with your activities. Activity completion can also be used in conjunction with other QMplus features such as the completion progress block, course completion, certificates, badges and conditional activities.
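The kind of overview the activity completion report gives could be sketched as follows. The data here is hypothetical, and QMplus exposes this information through its own reports rather than requiring any code; the sketch simply shows the underlying idea of a per-student completion rate.

```python
# Hypothetical completion data: activity name -> students who completed it.
completions = {
    "Week 1 quiz": {"alice", "bob"},
    "Week 1 reading": {"alice"},
    "Draft essay": set(),
}
students = {"alice", "bob", "carol"}

# Fraction of the module's tracked activities each student has completed.
completion_rate = {
    s: sum(s in done for done in completions.values()) / len(completions)
    for s in students
}
```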


Using the checklist and the progress bar to keep students on track: The checklist activity and the completion progress block are features available on QMplus to help your students stay on track and to help you keep an overview of how they are progressing.  The completion progress block works in conjunction with the activity completion feature of QMplus; the checklist activity can be used with or without activity completion.

Using QMplus reports: QMplus contains a wealth of data about the actions of both staff and students.  This data is available through the reports menu in a QMplus area.  The standard reports can contain a lot of data; nearly every action a member of staff or a student takes in QMplus is logged, and you are unlikely to have the time or the inclination to go through everything that is available to you.  If you use activity completion, the activity completion report and the completion progress bar can give you a more targeted overview of how students are progressing with the most important tasks in your module.

This user guide shows how to set up activity completion in your QMplus area: How to guide: Using activity completion to track student progress


Using analytics in QMplus Media and Q-Review

Both QMplus Media (Kaltura) and Q-Review (lecture recording) provide information about how your students are using any video content you have made available. You can find out who is watching your videos and how long they are spending on them.  How much of that 50-minute presentation is being watched? If you are running online sessions using Blackboard Collaborate and you are recording them, it is possible to find out how many times a recording has been viewed but not who viewed it or how much of the recording they viewed.

You can look at video usage data from a number of perspectives.  As it is possible to publish a single video in several module areas on QMplus, you might want to see how it has been used in a particular module, or you might want to look at how it has been used across all locations.  This is possible through the analytics provided by Kaltura and by Q-Review.



Gibson, H. and Clarke, A. (2018) Learning Analytics and Enhancement: A Discussion Paper. Enhancement themes. The Open University.

Gunn, C., McDonald, J., Donald, C., Milne, J., & Blumenstein, M. (2017). Building an evidence base for teaching and learning design using learning analytics. Wellington: Ako Aotearoa - The National Centre for Tertiary Teaching Excellence.

Papamitsiou, Z. & Economides, A. A. (2014) Learning Analytics and Educational Data Mining in Practice: A Systematic Literature Review of Empirical Evidence. Journal of Educational Technology & Society 17, 49–64.

Sclater, N., Peasgood, A. and Mullan, J. (2016) Learning analytics in higher education: A review of UK and international practice. Jisc.

Sclater, N. (2017) Strategic approaches to learning analytics in UK higher education.  Jisc.

Sclater, N. and Mullan, J. (2017) Learning analytics and student success – assessing the evidence. Jisc.