On Wednesday 9 November 2011, seven undergraduate students came together for an innovative two-hour afternoon workshop facilitated by Dr Miguel Mera, Dr Ian Pace, and myself (Dr Christopher Wiley) to discuss various issues related to the delivery of feedback on the Centre’s BMus programme. While activities of this nature have been undertaken within individual academic modules in the past, this was the first time that a dedicated workshop had been run with the aim of establishing dialogue between staff and students, and working together to maximize good practices.
The students who attended the workshop – Martina Baltkalne, Conaugh Clark, Tim Doyle, and Ruth Ginger from the second year, and Harriet Baker, Alexandra George, and James Perkins from the third year – were firstly given anonymized examples of actual feedback reports written in previous years, which they discussed in terms of the good practices they exemplified and their potential for improvement. They were then asked to spend a few minutes designing their ‘ideal’ feedback proforma on a blank sheet of paper, before being presented with six different specimen feedback form templates, talking through the strengths and weaknesses of each, and their preferences among them, in turn.
Discussion of samples of actual feedback reports
The students were very positive about the Centre’s current feedback practices, and understanding of why it takes time to turn around marking. In the course of discussing the examples of feedback, the following points emerged:
- Succinct feedback was welcomed. It is possible to write too much feedback, leading to situations in which students cannot see the proverbial wood for the trees.
- Clarity was an essential component of effective feedback. Feedback should not simply note that something was unclear, so much as specify exactly what was unclear and how the student could have made it clearer.
- Feedback should not merely repeat what the student did in the assessment. A student will already know what they wrote, hence feedback needs to go further than this.
- Opinion was divided on the use of bullet points in feedback. While some students felt that bulleted lists facilitate readability, others observed that they can come across as overly direct or prescriptive.
- Preference as to the mode that feedback would take was considered to be largely a matter of personal taste. The students agreed that different learners might like their feedback to be set out in different ways.
- The most suitable mode for feedback was felt to be the formal written report. The students were not even concerned by the use of the third person, as they recognized that feedback should be formal and will be read by third parties (e.g. moderators and external examiners).
The final point was perhaps the most unanticipated, given present trends in pedagogical discourse towards promotion of innovative forms of feedback (audio/podcast, video, dialogic, etc.) as against the more traditional formal written report. Similarly, the use of the third person, which is often frowned upon as being impersonal and not addressed to the student, did not seem to concern them – they did not even notice it until prompted.
These findings do, however, need to be read in conjunction with their observation that a student’s preferred mode of feedback can be quite a personal matter. For instance, perhaps Music students are more accustomed to receiving formal reports written in the third person from the music examinations that they will inevitably have undertaken during their studies prior to the BMus degree.
Discussion of different feedback form templates
The specimen feedback form templates that provided the focus of much of the discussion in the second half of the workshop reflected a variety of different formats for written feedback. The samples comprised versions of our own feedback practices from recent years together with forms in use at other institutions across the UK.
One proforma utilized a series of fifteen tickboxes, which the students found quite prescriptive; they were also concerned that such an approach could be completed quite quickly by an examiner and hence might not inspire confidence that due care and attention had been paid to the marking process. Another included a second page of self-assessment to be completed by the student by way of reflection on the feedback received, which our students felt to be a stage too far.
A third approach represented by the set of proformas divided feedback into eight different categories including research, critical engagement, accuracy, structure, and presentation. The students felt this to be too segmented an approach, and hence of only limited help in enabling them to make connections between the categories. For example, poorly constructed bibliographic citations might need to be addressed under both ‘presentation’ and ‘accuracy’.
The students also favoured one feature unique to another proforma: the identification on the report form of a date and time at which the marker would be available to meet with the students to discuss their feedback. They felt that this, and one or two other additions, would be useful enhancements to their preferred proforma. By the end of the session, then, the workshop had developed a feedback form in collaboration with, and endorsed by, the students themselves. This form has now been implemented across the BMus programme.
Final Thoughts
The workshop proved to be a positive experience for both staff and students, not just in terms of developing a standardized report proforma for assessments but also in terms of discussing issues of feedback and assessment. Given the obvious benefit of running such workshops, a second session for students of the BMus programme, specifically focussed on the assessment and feedback of music performance, has already entered the planning stages. On behalf of all the facilitators, I should like to take this opportunity to thank the seven students whose presence has enhanced teaching and learning for the 140-strong student community they represented.
In light of the success of this workshop, other programme teams across the University who are considering running a similar event for their own students are strongly encouraged to do so. If you would be interested in holding such an activity, or would like further information, please feel free to contact me, Dr Christopher Wiley (c.m.wiley@city.ac.uk), in my dual role as Director of the BMus Programme in the Centre for Music Studies and Learning Development Associate for Assessment and Feedback in the University’s Learning Development Centre.
Hi Chris,
I read with interest – a great example of working with students to enhance feedback and maintain a dialogue.
I thought it might be worth noting here that I facilitated a separate assessment task with some of my first-year undergraduate students last week. We looked at the published assessment criteria for written submissions, then I handed out three sample student essays (all written by me, but aimed at imitating different levels of academic achievement) for a 500-word assessment. We discussed the strengths and weaknesses of each essay in turn, then discussed how we might rank them, and finally talked about what mark we might give them according to the assessment criteria. What struck me most, other than the level of consensus among the students as to which essay was best and which was worst, was their reluctance to give higher marks (70+): this led me to bring up the notion of the ‘glass ceiling’ and why it is important to use the full range of marks. Had we had more time for the session, I would have asked the students to fill out the proforma discussed in my original post for one of the essays, to encourage them to think about the purpose of feedback and why it is beneficial.
Meant to finish my comment, Chris, and say that I like the fact that you always take note of the students and their concerns and then use that to adapt your programme.
Great stuff.