
Every day, universities collect huge amounts of information about their students, ranging from grades and attendance to library loans and virtual learning environment log-ins.

Until now, however, most UK universities have not done much to bring those data together in a coherent way. It has been US and Australian institutions that have led the development of learning analytics: the use of such information to identify individuals who may be struggling and understand which teaching methods work best.

But that looks set to change, as the UK becomes the first country in the world to get a national learning analytics service, hosted by Jisc, the sector’s main technology body.

Seventy institutions have so far expressed an interest in taking part in the project, which is being piloted ahead of a nationwide roll-out later this year.

Under the scheme, a learning records “warehouse” will collate information about students from university systems and from undergraduates’ own records of their learning: how long they spent reading a book or writing an essay, for example.

A processor will analyse the data, powering an alert-and-intervention system that will highlight students who might be struggling, such as those who appear not to be keeping up with reading or whose attendance is dropping off.

There is emerging evidence that the early interventions enabled by this approach can prove effective. At the University of South Australia, 66 per cent of “at risk” students who were contacted by staff under its own learning analytics programme went on to pass, compared with 52 per cent of the same group who were not approached.

Jisc’s processor will also power a series of “dashboards” allowing staff to view visualisations of the data, with the hope that it will promote a better understanding of “what works”. Manchester Metropolitan University believes that its own learning analytics programme has borne fruit in this regard, attributing a 9 per cent increase in overall National Student Survey satisfaction scores to reorganisation of the curriculum following analysis of student requirements.

Benefits such as these might be available to universities that choose to go it alone on learning analytics. However, Phil Richards, Jisc’s chief innovation officer, said that working together could offer universities economies of scale, plus further benefits.

“Universities have told us that they would like to benchmark their data but, if everyone has got their own system, that’s impossible,” he said. “In order to get benchmarking, you need a shared warehouse; you could average together 10 universities to give you a pool, which would be anonymised, but a useful reference point.”

Big Brother?

All this monitoring might sound a bit like Big Brother, but Jisc is alive to such concerns, having worked with organisations such as the National Union of Students to produce a code of practice for the use of data.

But Jisc believes that most students are supportive, having found in a survey that 71 per cent of learners were happy to share their data, if it could help to improve their grades.

Tim Davies, director of information services at Aberystwyth University, agreed that students would support the scheme, with the right safeguards in place. The university has undertaken attendance monitoring using swipe cards for several years and is now participating in the Jisc scheme with a view to improving student outcomes.

“We have learned that you have to be clear why you are collecting the data, what you are using it for, and who has access to it, ie, only the people who need to act upon it,” Mr Davies said. “The student body has been supportive because they can see its role as a tool for enhancement, both personally and for the university in general.”

One innovative feature of the Jisc scheme is the development of an app for mobile devices that will allow students to track their own progress and, if they wish, the progress of their peers.

A screenshot of the app shows a Facebook-style newsfeed displaying how one student might have spent seven hours in the library over three days, while another might have spent six hours in the lab in a single day, and another might be in the top 10 per cent of their class for an assignment.

Dr Richards said the app aimed to build on the success of fitness apps and wearable technology, which have been shown to spur people on while exercising by allowing them to compare their performance with their friends’.

In the Jisc poll, 51 per cent of respondents were against sharing their progress with their peers, but Dr Richards believes that this will change when students understand the technology better. There is evidence that such an approach could be useful: at the University of Maryland, Baltimore County, students who used a tool to compare their virtual learning environment activity with that of their peers were nearly twice as likely to be awarded a grade C or higher.

“It’s a model that is proved to work well for fitness training; let’s see if it works for student learning,” Dr Richards said. “The early signs are that students quite enjoy it.”

With a teaching excellence framework currently being debated in England, it is unsurprising that the learning analytics data are already being looked upon as a potential source of metrics, and Dr Richards said that the project was “hoping to become part of the TEF ecosystem”.

Such metrics could be based around students’ performance or universities’ care for their students. Crucially, however, all the data in the Jisc project will still be owned by the universities and current plans are that they will be available only on an anonymised basis; so the mining of the data for TEF metrics would need institutions’ consent.

The Open University has been the foremost adopter of learning analytics in the UK. However, Bart Rienties, reader in learning analytics at the institution, said he would be sceptical of using the data in the TEF.

“There are obvious benefits if you can start to compare institutions across the board on some simple metrics, but you are really comparing apples with pears,” Dr Rienties said. “You have to wonder about the potential risks, because you have to understand the context that the students are working in.”