Class tests: moving from in-class to online

In 2015, after reviewing the assessment requirements for undergraduate students on my second-year module in English phonetics and phonology, I decided to change my practice. Up until then there had been two elements: 1) a broad phonetic transcription from dictation; and 2) a written exam in the exams period, in which students chose two essay-type questions from a list of six options. The transcription assessment took place in the last class of the Autumn term, when the module is taught. The exam took place in the Summer term exams period, which meant students had to wait up to five months after the module had ended to be assessed on it.

I should mention here that the university I currently work at – the University of Reading – has three terms: two eleven-week terms, each with a reading week in the middle, and one eight-week term, which is mostly for examinations. While some other universities adopt this system, many instead have a semester system, in which students are examined at the end of each semester in a dedicated exams period.

It is also worth mentioning that this module – and the equivalent grammar module – are 10-credit (5 ECTS) “skills” modules. All other modules on the programme are 20 credits (10 ECTS).

Marks on my module and on the grammar module were always lower on average than those on most of the other second-year modules, which tended to be assessed by coursework. The English Grammar module was likewise taught in the Autumn and assessed by an exam in the Summer term. The grammar lecturer and I decided to move the assessment to the last week of the Autumn term; in my case, that meant both the dictation and the class test would take place in the last week of term.

Another thing we decided to do was move to more questionnaire-based assessments. I also wanted to ensure I was assessing the learning of a range of skills. The exam paper had allowed students to answer just two questions and so avoid certain topics; in my opinion, it was not enabling students to demonstrate their learning across a wide enough range of skills. I changed the paper to have the following elements:

  1. Ten multiple choice questions on segments.
  2. Ten true/false statements drawn from any aspect of the module materials.
  3. One question on intonation in which students had to suggest a pattern on a phrase in context. E.g., in the exchange “A: I saw him with a brown coat on. B: It was a blue coat”, students would be asked to annotate B’s response with appropriate intonation notation.
  4. One diagram question, in which students were asked to, e.g., complete and label a partially-drawn sagittal section for a particular sound, draw a phases-of-articulation diagram for a plosive consonant, and so on.
  5. One question on vowels, which contained multiple sub-questions. Students were asked to complete a blank vowel chart diagram by labelling it appropriately and placing three vowels from the RP/Southern Standard British English vowel system on the chart. They then had to describe each vowel in terms of front/back, close/open and rounded/unrounded, and provide a sample word which was not one we had used as an example in the module materials.
  6. One question on any other aspect we had covered in the module.  This could be asking them to define a term (e.g., assimilation) and give examples in transcription.

Pleasingly, the external examiner was very keen on this type of assessment in comparison with the previous iteration. He agreed with me that it enabled students to demonstrate knowledge and understanding across the topics taught on the module, and he praised the design of the questionnaire.

In the Autumn term of 2020-21, it became evident that we would have to move assessment online, owing to the pandemic. This presented something of a challenge. While our virtual learning environment, Blackboard, has the functionality to set up online tests, I had only used the multiple-choice function in the past, for formative quizzes and the like. I had seen the true/false and short-answer question functions, but was unsure how I was going to be able to “do” diagrams.

Then I discovered the hot-spot option.

A hot-spot question allows you to upload an image and select an area within it that represents the correct answer. Students view the image and click on the location they deem to be the right one. As long as they click within the area I have designated as the correct answer, they gain points.
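For readers curious about the mechanics, the logic of a hot-spot question reduces to a point-in-region test. Below is a minimal Python sketch, under the assumption of a rectangular region defined in image pixel coordinates; the names and coordinates are my own illustration, not Blackboard's actual implementation.

```python
# A minimal sketch of hot-spot scoring, assuming a rectangular answer
# region in image pixel coordinates. Illustrative only, not Blackboard's
# real internals.

from dataclasses import dataclass

@dataclass
class HotSpot:
    """Rectangular region of the image that counts as the correct answer."""
    left: int
    top: int
    right: int
    bottom: int

    def contains(self, x: int, y: int) -> bool:
        # True if the click (x, y) falls inside the rectangle.
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def score_click(region: HotSpot, x: int, y: int, points: int = 1) -> int:
    """Award points only if the student's click lands inside the region."""
    return points if region.contains(x, y) else 0

# Hypothetical example: the correct sagittal section occupies the
# top-left quadrant of an 800x600 composite of four diagrams.
correct = HotSpot(left=0, top=0, right=400, bottom=300)
print(score_click(correct, 120, 150))  # 1 -- click inside the region
print(score_click(correct, 600, 450))  # 0 -- click on a distractor
```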

This proved to be a life-saver.  I used it in the following ways:

  • Instead of having students complete or produce a diagram (4), I gave them a range of similar images and asked them to select the right one by clicking on the relevant image. For example, I might give them the description “voiceless alveolar fricative” and ask them to select the correct sagittal section from a group of four, the others showing /t/, /z/ and /n/. I followed up with short-answer “theory” questions which checked their understanding of various relevant facts – e.g., position of the velum, voicing, stricture, etc.
[Image: four slightly different sagittal section diagrams, used for the hot-spot diagram question]
  • For the vowels question (5), I presented students with three blank vowel charts and asked them to click on the position of each of three vowels. Follow-up short-answer questions asked them to identify labels for the chart (e.g., close-mid; back), give a term to describe each vowel, and provide example words.

Students were given a window of 23 hours within which to complete the questionnaire part of the class test. To minimise collaboration between students, the test itself was timed: students had one hour to complete it once they had started (students entitled to extra time in exams had this added to the hour). I also used Blackboard’s functions for randomising the question and answer order in some sections (multiple choice; true/false) and for drawing from question pools in others, so the students did not all have the same question set.
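The effect of these two settings is easy to picture in code. The Python sketch below illustrates both strategies – shuffling a fixed question set and drawing each attempt's questions from a larger banked pool – with invented placeholder questions; Blackboard handles all of this internally, so everything named here is purely illustrative.

```python
# A minimal sketch of the two randomisation strategies described above.
# The pools are invented placeholders, not the real question bank.

import random

def randomise_order(questions):
    """Same questions for everyone, but in a different order per attempt."""
    shuffled = list(questions)
    random.shuffle(shuffled)
    return shuffled

def draw_from_pool(pool, k):
    """Each attempt gets k questions sampled from a larger banked pool."""
    return random.sample(pool, k)

# Hypothetical pools: 20 banked multiple-choice questions and
# 30 banked true/false statements.
mcq_pool = [f"MCQ {i}" for i in range(1, 21)]
tf_pool = [f"T/F statement {i}" for i in range(1, 31)]

# One student's paper: 10 MCQs plus 10 true/false items, order shuffled,
# so two students are unlikely to see the same set in the same order.
paper = randomise_order(draw_from_pool(mcq_pool, 10) + draw_from_pool(tf_pool, 10))
print(paper)
```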

I will admit that marks were higher on this take-home online version of the test than they had been for the paper version, which was sat under examination conditions in class. This is understandable, given that students doing online tests “at home” have access to a range of resources. However, there was still a reasonable spread of marks, none of the short answers students gave were identical (!), and one student did fail. I am satisfied that the test still enables students to demonstrate their knowledge and understanding, and I am rather pleased with the new skills I have learned this year in using online tests.
