How to deal with subjective assessment criteria
The problem:
This is a challenge many of us face when marking assessments: we are required to determine whether a student’s work is “effective”, whether their delivery demonstrates the skills of a “good advocate”, and ultimately whether the student is competent.
On the professional programmes at The City Law School, students undergo training to prepare them for the practice of law, having addressed the theory at undergraduate level. This training includes mastering skills such as advocacy, interviewing and advising, writing, and drafting. The teaching teams on the professional programmes are not vast, and we each have different skillsets and practice specialisms. However, because some assessments, such as advocacy, are performed live, and given the short turnaround times for marking, a very much “all hands on deck” approach is adopted: we all get involved in marking skills assessments. After all, we were all qualified lawyers at some point, and we are all experienced lecturers, accustomed to marking against assessment criteria and providing feedback.
Let’s take the example of advocacy. To assess the skill, students receive documents about a week in advance and are then allocated 20-minute timeslots, spread over the course of a week, in which to present their case to the assessor, who plays the role of the judge. The advocacy module leader painstakingly prepares guides for tutors, elaborating on each of the assessment criteria and providing examples of how each can be demonstrated, and as assessors we dutifully read and digest this information, which is set out in a ten-page document. This information is then used to give oral feedback to all students on their formative assessments, and written feedback only to those students who did not meet the required competency status in their summative assessments.
After my first day of assessing the live skill of advocacy, I realised that it is not so easy to refer to that tutor guide to check whether a student has demonstrated the criteria whilst the student is presenting. I also discovered that there was a very real danger of not taking note of what the student was saying, or of mistaking one student’s words for another’s. This meant I had to play back the student’s recorded presentation in order to mark their performance accurately, which was both time-consuming and laborious.
The solution:
I recognised there was a real need to simplify the marking process: assessors needed access to the key headline points students had to make to demonstrate each of the assessment criteria, so that they could tick or cross against these points in real time. More information and detail on each point would remain available outside of the assessment, but a ticksheet would limit the number of recordings that needed to be played back.
The practical solution:
So, I created a ticksheet (example below).
The outcome:
The assessors all agreed that the ticksheet saved time and was an efficient way of marking students’ performance, particularly as it could be used contemporaneously during live assessments. Markers were able to rely on the ticksheets when providing oral feedback to students immediately after their formative submissions, and written feedback to those students who did not meet the competency status in their summative assessments.
The criteria were clearly set out, with examples of how they could be demonstrated, which helped those assessors who were unfamiliar with the subject area.
The ability to annotate the ticksheet was a plus.
The ticksheet is now a key document in the markers’ assessment bundle.
Learnings:
Following the successful use of the ticksheet in the advocacy assessment, I trialled it in the other skills assessments of interviewing and advising and legal research. It has had the same benefits as it did for advocacy: simplifying and speeding up the marking process whilst making it easier for non-subject experts to assess the subject.
Recommendations:
If a process is cumbersome for you, it is likely to be cumbersome for others. Identify the challenges, trial a solution, and then adapt it to work for you.
After rolling out the ticksheet, I considered how I could also simplify the feedback process, which was similarly time-consuming; students often complained about insufficient or inconsistent feedback.
I looked at the ticksheet entries, the examples of how each criterion could be satisfactorily demonstrated, and considered how these could be carried through into the student feedback for each criterion. The assessor could then delete or insert these examples in the student’s feedback, making the message to students consistent across assessors and constructive, identifying how their performance could be improved.