Learning Analytics

I recently participated in an ELESIG symposium on Learner Analytics. An extended introduction to learning analytics and its uses for both educators and learners has already been covered by my colleague in his previous post.

This symposium enabled delegates to adopt a critical stance towards the ways in which analytics are managed and owned. Prof Chris Jones, Liverpool John Moores University, kicked things off with a keynote on ‘Learner and student experience in an age of austerity: how is the agenda set?’. This gave an excellent view of the big picture – specifically of how an age of austerity shapes the student learning experience. He got us thinking about the new sources of analytics used in Higher Education and how the data is managed and owned – just a few of the cautions and considerations. I was left with the notion that, in managing learning analytics, it would be useful to consider:

  • What is going to happen in the bigger picture?
  • At what level will you undertake data analytics?
  • Who is now in charge, and whose data is this?
  • What happens to aggregate data held in services such as Google Apps?

He summarises his talk in the video below. If you are interested in viewing his keynote, please go to this link.

The keynote was closely followed by two practical examples:

Dr Cath Ellis, University of Huddersfield, presented on ‘Assessment Analytics: should we do it and, if so, what might it look like?’. She focused on the risks and benefits of assessment analytics, interrogating assessment and e-learning technologies to harvest data at a more meaningful level. With a range of tools available, she reminded us of the need to use them with care.

In this session, she outlined what assessment analytics could look like and how we might put it into practice. She suggested that institutions currently tend to focus assessment analytics on a small section of students and on topics such as retention and attrition, when it should be focused on student learning. I liked the way Huddersfield had been using rubric results, with common errors or common strengths embedded into the student paper via the GradeMark tool. This could be a useful approach for schools that use GradeMark. The big focus for Cath was getting back to the topic of student learning within ethical parameters. Dr Cath Ellis will no longer be at Huddersfield but is returning to her homeland (Oz) for a new job!

She summarises her talk in the video above. If you are interested in viewing her presentation, please go to this link.

Lastly, Prof Luke Dawson, University of Liverpool, introduced us to a tool designed at the University of Liverpool – LIFTUPP (Longitudinal Integrated Fully Transferable Undergraduate Postgraduate Portfolio) – which usefully identifies longitudinal patterns in students’ competencies for learning. Feedback and action plans enable students to see where changes can be made. Both students and staff report the tool as beneficial for their learning and teaching. (Link to 5-min interview below.)

Learning analytics can be an important part of the work we do in enhancing the learner’s experience, yet it is still a topic that needs further analysis. One of our ELESIG buddies from UCL concluded by providing us with an overview of a briefing pack on learning analytics that they had prepared for managers, which I have to say is very helpful. Please see her Prezi below.

