Charles Sturt University

Writing Exam Questions & Multiple Choice Questions

Examinations can play an integral role in the assessment of a student's ability to demonstrate their knowledge and depth of understanding on any given topic during their course of study. Hence, it is essential that we design examination questions that are fair and valid, yet challenging.

Like other assessment tasks, examinations can be used to gauge both student learning and the efficacy of the learning process. So, we must ensure that the cognitive skills examined during the examination are constructively aligned to the skills required by the subject learning outcomes. The primary aim with examination questions is to provide students with a platform to demonstrate their knowledge and their ability to apply the knowledge in a context relevant to the subject learning outcomes.

Key points to be considered while preparing questions:

What type of response is required?

A fixed response is an objective response format in which students select the correct answer, or supply a word or short phrase to answer a question or complete a statement. Examples include multiple choice, true-false, matching, and fill-in-the-blank questions.

A constructed response is a subjective narrative that challenges students to create an original answer. Examples include short answer, long answer, essay, and performance test items.

Have you chosen the best test format for evaluating cognitive ability to meet the learning outcomes?

The verbs in the learning outcome can provide direction towards the choice of the question type. Some verbs such as identify, list, and select clearly indicate that students need to select the response. If the question is written in such a way that a student has to reason in order to select a correct response, student actions such as analyse or compare could be included in the selection of answers. Generally, verbs such as analyse, apply, interpret, compare, infer and predict indicate that a student should construct a response.

For instance, if the subject learning outcome expects the student to be able to synthesise information, then multiple choice questions (MCQs) would not serve as an appropriate assessment tool. Instead, set a long answer question that clearly emphasises the student's ability to synthesise information. However, using MCQs as an assessment does not always mean that the questions evaluate only lower order cognitive skills: MCQs can be constructed to assess higher order cognitive skills such as analyse, compare, and judge.

To gain a greater understanding of potential student actions for different question types, you could review Bloom's or SOLO Taxonomy.

Other key factors to consider

  • Time to set question papers and mark (how much time you have).
  • Amount of content to cover in a set time (how much time the student has).
  • Number of students taking the test.
  • Diversity of students taking the test.
  • The kind of technology used for test delivery.

Pros and Cons of short/long answer or essay questions:

Some Advantages:

  • Reveal a student’s ability to reason, create, analyse, synthesise, and evaluate.
  • Give students opportunities to demonstrate higher level skills and knowledge.
  • Can assess a student’s writing ability.
  • Less time consuming to prepare than objective item types such as multiple choice questions.

Some Limitations:

  • Can limit the range of content assessed.
  • Favour students with good writing skills and neat handwriting.
  • Subjective and potentially difficult to moderate.
  • Time consuming to grade.

Guidelines for constructing effective short/long or essay questions:

Focus essay questions

  • Effective essay questions should provide students with a focus (types of thinking and content) to use in their response.
  • Avoid indeterminate, vague questions that are open to numerous and/or subjective interpretations.
  • Select verbs that match the intended learning outcome and direct students in their thinking.
  • If you use ‘discuss’ or ‘explain’, give specific instructions as to what points should be discussed/explained.
  • Delimit the scope of the task to avoid students going off on an unrelated tangent.
  • Review the question and improve using the following questions:
    • Does the question align with the learning outcome?
    • What other verbs could be used for more clarity?
    • Is the scope specific and clear enough?
    • Is there enough direction to guide the student to the expected response?

Pros and Cons of multiple choice questions:

Some Advantages:

  • Highly objective and reliable for testing large cohorts.
  • Can assess student learning of a wider range of discipline knowledge.
  • Reduces marking time.
  • Easy to implement as online test.

Some Limitations:

  • Questions frequently test only lower order thinking skills.
  • Feedback to students is not common on these assessments.
  • Students may acquire false knowledge through lures in the distractors.
  • Writing good questions (and answers) is difficult and takes time.

Guidelines for constructing multiple choice questions:

Anatomy of MCQ:

Multiple choice question items consist of a stem and alternatives.

The stem presents the problem. The list of options, consisting of one correct or most appropriate response (the answer) and several incorrect responses (the distractors), is called the alternatives (Steven et al., 1991).
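
The stem/alternatives anatomy can be sketched as a simple data structure. This is purely illustrative (the class and field names are hypothetical, not part of Blackboard or any question-bank format):

```python
from dataclasses import dataclass

@dataclass
class MultipleChoiceItem:
    """Illustrative model of an MCQ item: a stem plus alternatives."""
    stem: str              # the problem statement
    answer: str            # the single correct or best option
    distractors: list      # plausible incorrect options

    @property
    def alternatives(self):
        # the full option list shown to students: answer + distractors
        return [self.answer] + self.distractors

# Example item with one answer and two distractors
item = MultipleChoiceItem(
    stem="Which taxonomy classifies learning outcomes by cognitive level?",
    answer="Bloom's Taxonomy",
    distractors=["Linnaean taxonomy", "Dewey Decimal Classification"],
)
```

Keeping the answer and distractors as separate fields mirrors the guidance below: rules for the answer and rules for the distractors can be checked independently.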

General rules for THE STEM

  • The stem should be a complete statement that can be answered without looking at the options.
  • Include in the stem any words that might otherwise be repeated in each alternative.
  • Avoid negatives, or use them sparingly, in the stem. In general the stem should be stated in a positive form.
  • The stem should be clear and specific, with a clear and consistent layout.
  • In testing for definitions, use the term in the stem rather than as an option.

General rules for THE ALTERNATIVES

  • Alternative options are mutually exclusive.
  • The length of the alternative options is about equal (preferably short).
  • Avoid using absolutes such as always, never, and all.
  • Avoid vague, frequency terms such as rarely, and usually.
  • Avoid the use of 'All of the above', 'Both a. and d. above,' and 'None of the above’.
  • Present alternatives in a logical order (chronological, numerical, etc.) but ensure the correct answer is not always a middle value.
  • Grammar should be consistent in stem and alternatives.

General rules for THE ANSWER

  • Only one correct answer is included.
  • The position of the correct answer varies.
  • Avoid convergence problems where the correct answer includes the most elements in common with the other options.

General rules for THE DISTRACTORS

  • All distractors are plausible.
  • Common student misunderstandings have been incorporated in the distractors.

Effective questions

Item Analysis in Blackboard can help determine the effectiveness of the test questions, enabling us to see which questions might need to be revised. It provides:

  1. Difficulty Factor is the proportion of students who answer the question correctly. A well-designed test includes questions at varied levels of difficulty, giving students with both weak and strong understanding an opportunity to demonstrate their knowledge.
  2. Discrimination Index compares the performance of the top 25% of students (by overall score) with the bottom 25% on each question, indicating how well the question distinguishes levels of understanding rather than language difficulty or random chance. Applied to an individual question, it shows how students with higher and lower overall scores performed on that question. In general we would expect the top 25% of students to perform well on all questions.
  3. Pattern of Response needs to be checked. If too many students select the same wrong distractor, there may be a misleading word in the question or the distractor. If no student selects a particular distractor, the distractor is too obviously different from the rest and should be rewritten to make it an equally attractive choice.
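
As a rough sketch of how the first two statistics are computed from raw responses (a minimal illustration using a 0/1 correctness list per item, not Blackboard's own implementation):

```python
def difficulty_factor(item_scores):
    """Proportion of students answering the item correctly (0.0 to 1.0)."""
    return sum(item_scores) / len(item_scores)

def discrimination_index(item_scores, total_scores):
    """Compare the top 25% of students (by total test score) with the
    bottom 25% on one item: D = p_top - p_bottom, ranging -1.0 to 1.0."""
    n = len(total_scores)
    k = max(1, n // 4)  # size of each 25% group
    order = sorted(range(n), key=lambda i: total_scores[i])
    bottom, top = order[:k], order[-k:]
    p_top = sum(item_scores[i] for i in top) / k
    p_bottom = sum(item_scores[i] for i in bottom) / k
    return p_top - p_bottom

# 8 students, one item: 1 = correct, 0 = incorrect
responses = [1, 1, 1, 0, 1, 0, 0, 0]
totals = [95, 88, 82, 75, 60, 55, 40, 30]  # overall test scores
d = difficulty_factor(responses)              # 0.5: half answered correctly
di = discrimination_index(responses, totals)  # 1.0: top group all correct, bottom all wrong
```

Here the item discriminates perfectly: the two highest-scoring students got it right and the two lowest-scoring got it wrong, so D = 1.0.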

When should you consider reviewing the question?

  • When the Discrimination Index in the Interact2 (Blackboard) Item Analysis is < 0.1, or
  • Difficulty Factor is either > 80% or < 30%, or
  • Pattern of Response indicates revision.
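
The first two numeric thresholds above can be expressed as a simple check (a sketch only; the function name is hypothetical, and the Pattern of Response check remains a manual judgement):

```python
def needs_review(difficulty, discrimination):
    """Flag a question for review using the thresholds above.
    difficulty: proportion correct (0.0 to 1.0);
    discrimination: index from -1.0 to 1.0."""
    if discrimination < 0.1:          # poor discrimination
        return True
    if difficulty > 0.80:             # too easy: > 80% answered correctly
        return True
    if difficulty < 0.30:             # too hard: < 30% answered correctly
        return True
    return False

ok = needs_review(0.55, 0.35)        # False: within acceptable ranges
too_easy = needs_review(0.95, 0.40)  # True: > 80% correct
```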

For more information, refer to the Interact2 (Blackboard) Item Analysis documentation.
