Tips for Writing Good Multiple-Choice Questions

  • Base each item on an educational or instructional objective of the course, not trivial information.
  • Try to write items in which there is one and only one correct or clearly best answer.
  • The phrase that introduces the item (stem) should clearly state the problem.
  • Test only a single idea in each item.
  • Be sure wrong answer choices (distractors) are at least plausible.
  • Incorporate common errors of students in distractors.
  • The position of the correct answer should vary randomly from item to item (a short scripted sketch of this follows the list).
  • Include from three to five options for each item.
  • Avoid overlapping alternatives.
  • The length of the response options should be about the same within each item (preferably short).
  • There should be no grammatical clues to the correct answer.
  • Format the items vertically, not horizontally (i.e., list the choices vertically).
  • The response options should be indented and in column form.
  • Word the stem positively; avoid negative phrasing such as “not” or “except.” If a negative cannot be avoided, always highlight it with underlining or capitalization: Which of the following is NOT an example …
  • Avoid excessive use of negatives and/or double negatives.
  • Avoid the excessive use of “All of the above” and “None of the above” in the response alternatives. With “All of the above,” students need only partial information to answer: knowing that just two of the options are correct (in an item with four or more options) is enough to select it, and eliminating a single option as implausible is enough to rule it out. With “None of the above” as the correct answer, the item shows whether students can detect incorrect answers, but not whether they know the correct one.

From Writing Good Multiple Choice Questions by Dawn M. Zimmaro, Ph.D.
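
Randomizing the position of the key, as the list above recommends, is easy to script rather than to manage by hand. Here is a minimal sketch in Python; the item text and the helper name are hypothetical illustrations, not part of any tool named in this post:

```python
import random

def shuffle_item(correct, distractors, rng=random):
    """Return shuffled options plus the index of the correct answer."""
    options = [correct] + list(distractors)
    rng.shuffle(options)
    return options, options.index(correct)

# Example: a three-option item (one key, two plausible distractors).
options, key = shuffle_item("Mitochondria", ["Ribosomes", "Lysosomes"])
for label, text in zip("ABC", options):
    print(f"{label}. {text}")
print("Key:", "ABC"[key])
```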

How Many Options Should a Multiple-choice Question Have? Maybe Just 3

Exactly how many options should a multiple-choice question have? The answer has varied over the years, but one meta-analysis suggests fewer than many of us currently use. As recently as 2002, researchers suggested using “as many plausible distractors as feasible,” but that may mean just 3, according to Michael C. Rodriguez in “Three Options Are Optimal for Multiple-Choice Items: A Meta-Analysis of 80 Years of Research.”

Rodriguez writes, “I would support this advice by contributing the concern that in most cases, only three are feasible. Based on this synthesis, MC items should consist of three options, one correct option and two plausible distractors. Using more options does little to improve item and test score statistics and typically results in implausible distractors. The role of distractor deletion method makes the argument stronger. Beyond the evidence, practical arguments continue to be persuasive.

  1. Less time is needed to prepare two plausible distractors than three or four distractors.
  2. More 3-option items can be administered per unit of time than 4- or 5-option items, potentially improving content coverage.
  3. The inclusion of additional high quality items per unit of time should improve test score reliability, providing additional validity-related evidence regarding the consistency of scores and score meaningfulness and usability.
  4. More options result in exposing additional aspects of the domain to students, possibly increasing the provision of context clues to other questions (particularly if the additional distractors are plausible).”

We may not feel comfortable moving from 4 or 5 options to 3, but the message is clear: there is no reason to spend valuable faculty time and energy developing implausible distractors, and using more than 5 options does NOT improve a question.
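
Rodriguez’s third point rests on a standard psychometric relationship: if 3-option items take less time to answer, the same testing time holds more items, and the Spearman-Brown formula predicts how reliability rises with test length. A rough illustration with hypothetical numbers (a test with reliability 0.80, lengthened by 25% by fitting 50 three-option items into the time once used for 40 four-option items):

```python
def spearman_brown(reliability, k):
    """Predicted reliability when test length is multiplied by factor k."""
    return k * reliability / (1 + (k - 1) * reliability)

# Hypothetical: 50 three-option items in place of 40 four-option items
# gives a lengthening factor of k = 50 / 40 = 1.25.
print(round(spearman_brown(0.80, 1.25), 3))  # -> 0.833
```

The specific counts are assumptions for illustration; the point is only that more items per sitting can raise reliability, which is Rodriguez’s argument.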

How to Increase the Value of Tests

  • Incorporating frequent quizzes into a class’s structure may promote student learning. These quizzes can consist of short-answer or multiple-choice questions and can be administered online or face-to-face. … Providing students the opportunity for retrieval practice—and, ideally, providing feedback for the responses—will increase learning of targeted as well as related material. (A minimal sketch of such a quiz appears after this list.)

  • Providing “summary points” during a class encourages students to recall and articulate key elements of the class. Setting aside the last few minutes of a class to ask students to recall, articulate, and organize their memory of the content of the day’s class may provide significant benefits to their later memory of these topics. Whether this exercise is called a minute paper or the PUREMEM (pure memory, or practicing unassisted retrieval to enhance memory for essential material) approach, it may benefit student learning.

  • … Pretesting students’ knowledge of a subject may prime them for learning. By pretesting students before a unit or even a day of instruction, an instructor may help alert students both to the types of questions that they need to be able to answer and the key concepts and facts they need to be alert to during study and instruction.

  • Finally, instructors may be able to aid their students’ metacognitive abilities by sharing a synopsis of these observations. … Adding the potential benefits of pretesting may further empower students to take control of their own learning, such as by using example exams as primers for their learning rather than simply as pre-exam checks on their knowledge.
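
The first bullet’s advice, retrieval practice with immediate feedback, does not require any particular platform. Below is a deliberately minimal command-line sketch; the questions, the exact-match scoring, and the feedback wording are all hypothetical stand-ins for whatever a course would actually use:

```python
# Minimal retrieval-practice quiz with immediate feedback (illustrative only).
QUESTIONS = [
    ("Which organelle produces most of a cell's ATP?", "mitochondria"),
    ("What suffix denotes inflammation (e.g., dermatitis)?", "-itis"),
]

def run_quiz(questions):
    score = 0
    for prompt, answer in questions:
        response = input(prompt + " ").strip().lower()
        if response == answer:
            score += 1
            print("Correct.")
        else:
            # Feedback on each response is what turns a quiz into practice.
            print(f"Not quite; the expected answer was: {answer}")
    print(f"Score: {score}/{len(questions)}")

if __name__ == "__main__":
    run_quiz(QUESTIONS)
```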

ITEM ANALYSIS: Evaluating Multiple Choice Questions

CVM faculty receive information about the quality of their tests and quizzes in several ways.

  • They may look at student performance data on particular tasks, activities, quizzes, or tests in Carmen.
  • They may be notified of item analysis generated when they administer Scantron tests.
  • They may review a “Test and Question Report” from ExamSoft, a secure-testing application available to all faculty and currently used across first-year core courses.

The latter two are designed specifically to help evaluate exam reliability, consistency, and quality.

These formal and informal processes allow us to create strong assignments and assessments, refine components of those assessments over time, and align the way we assess students with the learning outcomes identified.
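
The statistics behind those reports are standard. Two of the most common are item difficulty (the proportion of students answering correctly) and item discrimination (how strongly success on an item tracks overall performance, often computed as a point-biserial correlation against the rest of the test). Here is a minimal sketch of both, assuming a plain 0/1 scored response matrix rather than any particular ExamSoft or Scantron export format (requires Python 3.10+ for statistics.correlation):

```python
import statistics

def item_analysis(responses):
    """responses: one list of 0/1 item scores per student."""
    n_items = len(responses[0])
    totals = [sum(student) for student in responses]
    report = []
    for i in range(n_items):
        item = [student[i] for student in responses]
        difficulty = statistics.mean(item)  # proportion correct
        # Discrimination: correlate item score with rest-of-test score,
        # excluding the item itself so it does not inflate the correlation.
        rest = [total - score for total, score in zip(totals, item)]
        discrimination = statistics.correlation(item, rest)
        report.append((i + 1, round(difficulty, 2), round(discrimination, 2)))
    return report

# Hypothetical data: four students, three items.
data = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 1, 1]]
for number, difficulty, discrimination in item_analysis(data):
    print(f"Item {number}: difficulty={difficulty}, discrimination={discrimination}")
```

Very low difficulty or near-zero (or negative) discrimination flags an item worth revising, which is exactly the refinement loop described above.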
