Constructing effective online assessment

I have worked alongside my academic colleagues Isabelle Marcoul and Svenja Erich of the Centre for Language Studies at City University London for the last two years to help develop effective online assessment. This project has now been written up for the recently published Learning at City Journal Vol 2 (2). You can download a full copy of our article here for free.

I’m providing a summary of the article here, focusing on how the technology was used and how we measured the effectiveness of a multiple-choice Moodle quiz.

Background

City University London runs a programme of language modules, some for course credit and some extra-curricular. The languages taught are French, Spanish, Mandarin, Arabic and German. Before they can join a class, students need to be assessed and assigned to the language course appropriate to their level of linguistic competence, ranging from beginner to advanced. In 2011 more than 1,000 students took a diagnostic test.

Prior to 2011, the language tests were handed out in printed format and marked by language lecturers. The administrative burden was heavy: marking deadlines were very tight, and considerable work was needed to assign students to the correct course, communicate the results to them, and so on. It was concluded that an online system would automate much of this, give students immediate feedback about which level and class was appropriate for them, and speed up the administrative process.
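In Moodle this kind of immediate feedback can be configured through the quiz's overall feedback and its grade boundaries; conceptually it is just a mapping from score to course level. Here is a minimal Python sketch of that mapping (the real boundaries are not given in the article, so these thresholds are purely illustrative):

    # Illustrative only: the actual grade boundaries used by the
    # Centre for Language Studies are not published in the article.
    def assign_level(score: int) -> str:
        """Map a diagnostic score out of 100 to a course level."""
        if score < 25:
            return "Beginner"
        elif score < 50:
            return "Elementary"
        elif score < 75:
            return "Intermediate"
        return "Advanced"

    print(assign_level(62))  # -> Intermediate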

Practicalities

Each year the university runs a Language Fair during Freshers' Week. Traditionally this was when students took the written test and completed a questionnaire (to gather basic information, e.g. degree course). In September 2011 the assessment was delivered as a multiple-choice quiz on Moodle, and the questionnaire also moved online as a Google Form. This meant that:

  • a computer room was needed for the Language Fair
  • an audio/visual component was deemed too difficult to manage, as a large number of headphones would have been required, so listening was not part of the test

Design of the test

The languages team wanted to assess different types of language ability while being restricted to a multiple-choice online system. Each language had a test comprising 100 questions. Please see the article for a full description of the question types chosen and what was assessed.

As a learning technologist I was very interested in how the languages department wrote their multiple-choice questions to assess different types of language ability. For example, students were asked to read a generic text in the source language and answer comprehension questions to see how much they had understood. Some questions required students to understand not only the words but also the cultural context and concepts in order to get the answer right.

For example:

What would you like as a main course?

  • A sorbet with strawberries
  • Six oysters
  • Steak and kidney pie with chips

To answer this question, students needed to understand both the question and the choices, and then draw on their own knowledge to pick the correct answer.
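For those curious about the mechanics, Moodle's quiz tool can import questions written in GIFT, its plain-text question format. As a rough sketch, a question like the one above could be generated for import like this (the helper function is my own illustration, and the answer marking follows the menu logic of the example):

    def to_gift(title: str, stem: str, correct: str, distractors: list[str]) -> str:
        """Render one multiple-choice question in Moodle's GIFT import format."""
        lines = [f"::{title}:: {stem} {{"]
        lines.append(f"    ={correct}")  # '=' marks the correct answer
        lines += [f"    ~{d}" for d in distractors]  # '~' marks a distractor
        lines.append("}")
        return "\n".join(lines)

    print(to_gift(
        "MainCourse",
        "What would you like as a main course?",
        "Steak and kidney pie with chips",
        ["A sorbet with strawberries", "Six oysters"],
    ))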

In the article, Isabelle writes about how we construct language and how higher-order thinking skills can be assessed with online methods, so please do read the article if you are interested in this.

Use of Moodle and googleforms

City University London uses Moodle as its virtual learning environment, and this was seen as the perfect platform for the language testing. I met with the lecturers who would be preparing the questions for the test and explained how the Moodle quiz tool worked, to help them understand the types of question that would and would not be appropriate.

Once the questions had been written, we held a two-hour hands-on session where staff were trained in the Moodle quiz tool and then used it to add their questions with my support. I would recommend this approach: it meant I could troubleshoot any problems immediately, and the staff involved have been using the Moodle quiz successfully ever since.

We also needed to collect some personal data from the students, e.g. name and degree course. We used a Google Form for this, as they are very easy to set up and the data can be exported in Excel format, which the administrator requested.
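Once exported, the questionnaire data can also be combined with the Moodle quiz grades programmatically. A hedged pandas sketch, with hypothetical file names and column headings:

    import pandas as pd

    # File names and column headings below are hypothetical examples.
    students = pd.read_excel("language_fair_questionnaire.xlsx")  # Google Form export
    grades = pd.read_csv("moodle_quiz_grades.csv")                # Moodle gradebook export

    # Join questionnaire answers to quiz scores on the student's email address.
    merged = students.merge(grades, on="Email address", how="left")
    merged.to_excel("placement_overview.xlsx", index=False)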

Effectiveness of the language diagnostic multiple choice test

Effectiveness of the test was measured by the number of students who stayed in the group/level they were placed into during testing, i.e. the language level of the course matched the level at which the student tested. We were very pleased to see that the test proved very accurate in determining level for French, German and Spanish (only small numbers of students took Mandarin and Arabic, so the data for those languages was not conclusive).

This shows that an online test can effectively measure language ability in the majority of cases, with very little movement of students between levels.
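To make that measure concrete: take each student's placed level and the level they actually remained in, and compute the proportion that match. A small sketch with made-up records:

    # Made-up records: (level the test placed the student in, level they stayed in).
    placements = [
        ("Intermediate", "Intermediate"),
        ("Beginner", "Beginner"),
        ("Advanced", "Intermediate"),  # moved down a level: counts against accuracy
        ("Beginner", "Beginner"),
    ]

    stayed = sum(1 for placed, final in placements if placed == final)
    print(f"Placement accuracy: {stayed / len(placements):.0%}")  # -> 75%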

You can download a copy of the full article here.
