Designing a collection of analytics to explore "engagement"

I’m working with a group of fellow teacher educators here at USQ to explore what is happening around student engagement with our online courses. It’s driven by the apparently less-than-stellar responses on the QILT site from our prior students around “engagement”. It’s also driven by some disquiet about the limitations of aggregated and de-contextualised data like that reported on the QILT site, and also that arising from most learning analytics (e.g. as found by Gašević et al. (2015)).

In particular, we want to explore:

what data/analytics might be useful for ourselves and our students; and

what’s happening in our courses, and how that compares to what we thought was going on.

As the person most familiar with technology and learning analytics, I’m tasked with identifying the sequence of “increasingly specific learning analytics” that we’ll use.

What follows is a first draft. I’m keen to hear suggestions and criticisms. Fire away.

Specific questions to be answered

Does the sequence below make sense?

Are there other types of analytics that could be usefully added to the following and would help explore student/staff engagement and/or perhaps increase contextualisation?

What literature exists around each of these analytics, where did they apply the analytics, and what did they find?

Process overview

Each of the ovals in the following diagram is intended to represent a cycle where some analytics are presented. We’ll reflect on what is revealed and generate thoughts and questions. The labels for the ovals are short-hand for a specific type of analytics. These are described in more detail below.

The sequence is meant to capture the increasing contextualisation. The first four cycles would use fairly generic analytics, but analytics that reveal different and perhaps more specific detail. The last two cycles – learning design and course specific – are very specific to each course. The course specific cycle would be aimed at exploring any of the questions we identified for our individual courses as we worked through the other cycles.
It won’t be quite as neat as the above. There will be some iteration and refinement of existing and previous cycles, but the overall trend would be down.

The analytics below could also be compared and analysed in a variety of ways, most of which would respond to details of our context, e.g. comparisons by mode, specialisation, etc.

Click/grade & Time/grade

This cycle replicates some of the patterns from Beer et al (2010) (somewhat shameless, but relevant self-citation) and related work. This is aimed at just getting the toe in the water and getting the process set up. It’s also arguably as far removed from student learning/engagement as you can get. A recent post showed off what one of these will look like.
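At its simplest, the click/grade pattern is just a correlation between activity counts and final grades. A minimal sketch of the idea follows; the student records are invented for illustration, and real input would come from joining LMS activity logs with the grade book.

```python
# Minimal sketch: correlating LMS click counts with final grades.
# The records below are hypothetical; real data would be extracted
# from LMS activity logs and the grade book.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Hypothetical per-student records: (total LMS clicks, final grade %)
records = [(120, 55), (340, 72), (560, 81), (90, 48), (410, 75), (230, 60)]
clicks = [r[0] for r in records]
grades = [r[1] for r in records]
print(f"click/grade correlation: {pearson(clicks, grades):.2f}")
```

The same shape of calculation works for time-on-site/grade, or for any activity measure against any outcome measure.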

This would also include heatmap-type analyses such as the following diagrams.

Networks and paths

These analytics focus on the relationships and connections between people and the paths they follow while studying, moving beyond counts to start understanding connections.

This 2013 post from Martin Hawksey (found via his other post mentioned below) gives an overview of a range of uses and tools (including SNAPP) for social network analysis. It’s the early SNAPP work that identifies some of what these visualisations can help reveal:

isolated students

facilitator-centric network patterns where a tutor or academic is central to the network with little interaction occurring between student participants

The following is one of my first attempts at generating such a graph. It shows the connections between individual student blogs (from EDC3100 2013). The thicker the line between dots (blogs), the more links between them.
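The two SNAPP-style checks above can be sketched without any graph library: both reduce to degree counts over "who interacted with whom" pairs. The reply data here is invented; real input would be author/replied-to pairs extracted from the discussion forums.

```python
# Rough sketch of the SNAPP-style checks: find isolated participants
# and test whether the network is facilitator-centric.
# The reply pairs are hypothetical, not real course data.

from collections import defaultdict

replies = [  # (author, replied_to) pairs -- invented forum data
    ("alice", "tutor"), ("tutor", "alice"),
    ("bob", "tutor"), ("tutor", "bob"),
    ("carol", "tutor"),
]
participants = {"alice", "bob", "carol", "dave", "tutor"}

degree = defaultdict(int)
for author, replied_to in replies:
    degree[author] += 1
    degree[replied_to] += 1

# Isolated students: enrolled but with no recorded interactions.
isolated = sorted(p for p in participants if degree[p] == 0)
print("isolated:", isolated)

# Facilitator-centric pattern: what share of interactions involve the tutor?
tutor_share = sum(1 for a, b in replies if "tutor" in (a, b)) / len(replies)
print(f"share of interactions involving the tutor: {tutor_share:.0%}")
```

A share near 100% with few student-to-student edges is the facilitator-centric pattern the SNAPP work describes; tools like SNAPP draw the same information as a network diagram.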

A Sankey diagram is a method for representing flow in networks; it can be used to understand usage of websites. Martin Hawksey has just written this post (showing how to take LMS discussion data and send it through Google Analytics), which includes the following screenshot of “event flow” (a related idea). It shows (I believe) how a particular user has moved through a discussion forum. It looks like it provides various ways to interact with this information.

We’re hoping we might be able to leverage some of the work Danny Liu is doing.

Sentiment, content, and broader discourse analysis

The previous cycles are focused on using clicks and links to understand what’s going on. This cycle would start to play with natural language processing to analyse what the students and teachers are actually saying.

This is a fairly new area for me. Initially, it might focus on

readability/complexity analysis;
Unpublished work from CQU has identified a negative correlation between the complexity of writing in assignment specifications and course satisfaction.

sentiment analysis
How positive or negative are forum posts etc.? The comments and questions on this blog post about a paper using sentiment analysis on MOOC forums provide one place to start.
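The readability/complexity side is the easier of the two to sketch. The Flesch Reading Ease score (higher means easier to read) only needs sentence, word, and syllable counts. A self-contained approximation follows, using a crude vowel-group syllable counter; a real analysis of assignment specifications would use a proper readability library, but the formula itself is this simple.

```python
# Sketch of the Flesch Reading Ease score with an approximate
# syllable counter. Illustrative only; dedicated readability
# libraries count syllables more accurately.

import re

def count_syllables(word):
    """Approximate syllables as runs of vowels (minimum of 1)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

plain = "The cat sat. The dog ran. We all had fun."
dense = ("Notwithstanding institutional requirements, participants must "
         "demonstrate comprehensive epistemological engagement.")
print(f"plain text score: {flesch_reading_ease(plain):.1f}")
print(f"dense text score: {flesch_reading_ease(dense):.1f}")
```

Run over assignment specifications, scores like this would give a first, rough way to test the CQU finding about writing complexity and course satisfaction.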

Learning design

The plan here is to focus explicitly on the learning designs within the courses and explore what can be revealed using checkpoint and process analytics as outlined by Lockyer et al (2013).

Course specific

Nothing explicitly planned here. The idea is that the explorations and reflections from each of the above cycles will identify a range of additional course-specific questions that will be dealt with as appropriate.