Last month, I attended an event held by Civitas Learning, a provider of learning analytics software, featuring two panel sessions on different aspects of implementing learning analytics.
The first session focussed on Learning Analytics and Policy and had representation from Jisc, Buckinghamshire New University and Universities UK. Phil Richards from Jisc opened the panel session by presenting the work Jisc are doing in the sector, which City are involved with. He noted that, unsurprisingly, the main predictor of end-of-year outcomes is coursework assessment, and that Jisc were working on an ‘assessment deadline heatmap’ visualisation that could be used at programme level to help Programme Directors ensure that assessments aren’t all clustered together. He also talked about the development of an open, anonymised data set based on the Jisc Learning Records Warehouse, with data from three universities; Jisc intend to run a competition to see whether any metrics for excellent teaching can be identified from the data. With GDPR looming, he raised the question of the legality of learning analytics and reported that no institution has so far decided to let students opt out.
The panel discussion suggested that learning analytics is approaching a tipping point, and that institutions without an analytics system in place risk failing to use their data to ensure student success. The recent suicide cases at the two universities in Bristol were cited as examples where the data had contained clear flags of a potential issue, but the students had not been known to the universities or to community wellbeing services.
How data is used and re-used was a hot topic of discussion, with concerns that some students might be disadvantaged by the use of learning analytics. It was recommended that the governance of learning analytics should build trust and establish norms and standards around the use of the data. On giving students access to their own data, there was a suggestion that some students might feel empowered, or encouraged to ‘play the system’ in order to improve their success rating, while less confident students might be discouraged. Could we harm their chances of success by giving them the data? Will students ‘play it safe’ with their module choices, limiting their ability to take risks or fail?
Another aspect of the discussion considered who else might want access to learning analytics data. It was suggested that graduate recruiters might ask students to upload their data to prove what type of learner/person they are. We therefore need to consider our responsibilities around keeping and sharing the data, and whether students have a right to erasure.
In the second panel session, senior managers from the University of East London (UEL), the University of West London (UWL) and Northumbria University discussed the rollout of Learning Analytics (in this case Civitas) within their institutions. It was interesting to hear about the different approaches: UWL took a ‘big bang’ approach, implementing it for all students, whilst Northumbria were more cautious, running a proof-of-concept pilot with one department. Cultural change around the use of learning analytics was mentioned as a challenge, but it was suggested that this could be mitigated by ensuring that staff understand the benefits of the data and how to use it appropriately.

The Civitas solution provides predictive modelling, producing a prediction of each student’s likelihood of success. It was suggested that academics don’t necessarily need to know how the model works in order to use it; it is much more important that they understand which indicators (e.g. student attendance, VLE usage) may influence student outcomes. UWL noted that attendance was a ‘powerful predictor’ of student success. In many respects, the role of predictive learning analytics is similar to that of a plagiarism detection system like Turnitin: the system doesn’t tell you that someone has plagiarised, but it flags potential areas of concern and enables you to have a discussion with the student. The same is true of learning analytics. Personal tutoring was seen as key to the success of learning analytics, but for some tutors this required a culture change in how to have conversations with students about the data.
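To make the ‘flag, not verdict’ point concrete, here is a minimal sketch of the kind of predictive model such a platform might run. To be clear, this is an illustrative assumption on my part, not the Civitas model: both the choice of logistic regression and the two features (attendance and VLE usage, the indicators mentioned above) are hypothetical.

```python
# A minimal, purely illustrative sketch of a predictive learning analytics
# model. This is NOT the Civitas model: logistic regression and the two
# features (attendance rate, weekly VLE logins) are assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: one row per student.
# Columns: attendance rate (0-1), average weekly VLE logins.
X = np.array([
    [0.95, 12],
    [0.90, 9],
    [0.80, 7],
    [0.60, 3],
    [0.40, 2],
    [0.30, 1],
])
# Hypothetical outcomes: 1 = passed the year, 0 = did not.
y = np.array([1, 1, 1, 0, 0, 0])

model = LogisticRegression().fit(X, y)

# For a new student the model yields a probability of success -- a flag
# that prompts a conversation with a personal tutor, not a verdict.
new_student = np.array([[0.55, 4]])
print(f"Predicted probability of success: {model.predict_proba(new_student)[0, 1]:.2f}")
```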
Overall, it was a useful event, and some of the key themes from the panel sessions will feed into our work on the City Learning Analytics Project (LeAP), which is currently moving into its second year.