Assessing the Quality of Multiple Choice Questions

I am often asked how one knows whether a multiple choice question is “good” or not. Subject matter expertise always drives the final decision, but there are guidelines and statistics that can provide very helpful support in writing and refining multiple choice questions.

Nikole Hicks, PhD, RN, CNE, developed the Fairness of Items Tool (FIT) to guide the writing and assessment of multiple choice questions. Read more about her development of this tool. I have her permission to share it with OSU College of Nursing faculty, so email me (Joni) to obtain a copy.

If you are somewhat familiar with statistics and need a quick guide to quiz item analysis, refer to this PDF from Anne Schoening, PhD, RN, CNE.

A more detailed explanation of item analysis of quiz questions is presented in this article by McGahee and Ball (2009). You can access the full text of the article through the OSU Health Sciences Library.
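If you like to see the statistics in action, below is a minimal Python sketch (my own illustration, not taken from either resource above) of two common item-analysis statistics: the difficulty index (the proportion of students who answered an item correctly) and a discrimination index (here, the point-biserial correlation between each item score and each student's total score). The quiz scores in the sketch are made up.

```python
# Made-up quiz data: each row is a student, each column an item
# scored 1 (correct) or 0 (incorrect).
scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]

def mean(xs):
    return sum(xs) / len(xs)

def pearson(xs, ys):
    """Pearson correlation; with a binary item score this is the point-biserial."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

totals = [sum(row) for row in scores]  # each student's total score

for item in range(len(scores[0])):
    item_scores = [row[item] for row in scores]
    difficulty = mean(item_scores)                 # proportion answering correctly
    discrimination = pearson(item_scores, totals)  # point-biserial with total score
    print(f"Item {item + 1}: difficulty = {difficulty:.2f}, "
          f"discrimination = {discrimination:.2f}")
```

Roughly speaking, very high or very low difficulty values and near-zero or negative discrimination values flag items worth a closer look; as noted above, subject matter expertise still makes the final call.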

As always, please consult with your instructional design experts on the CON-IT team if you need additional assistance with question writing and evaluation.

Choose the Best Answer:  (a) Multiple Choice Quizzes or (b) Specification Grading

If you’ve ever felt the tension between multiple-choice tests and more complex assessments of learning, you are not alone.  Read this EdSurge article on specification grading and its potential advantages over multiple choice exams for student assessment.

The grading tools in CarmenCanvas might make specification grading an efficient approach to engaging your students and finding out more about the course content they have learned and can apply to problems.  The article also makes a case for the right time and way to use multiple choice tests.  If you would like to explore CarmenCanvas tools (rubrics, SpeedGrader, etc.) for specification grading, please contact con-it@osu.edu for more information.

Creating a Self-Grading Quiz on H5P

In a previous entry, you learned how to create a set of flashcards on H5P. Flashcards are an excellent study tool, but some students may simply memorize the cards themselves rather than actually learning the underlying concepts. For that reason, an excellent tool to reinforce the material on the flashcards is the self-grading quiz. Below is an example quiz based on this flashcard set:

To create an interactive quiz of your own, go to the H5P content creation screen and select “single choice set” from the drop-down menu.

The first dialogue box will set the title for the entire quiz.

In these dialogue boxes, you will fill out the question and up to four possible answers. The form defaults to two possible answers; click the grey “add answer” button to create additional blank answer boxes. Note that the first dialogue box is for the answer you want the quiz to grade as correct, and that the quiz will randomize the order of all of the possible answers. This will be important later.

Question 5 is an important example because I chose to include an “all of the above” style answer. However, even though this answer is the last one on the form, it will not necessarily appear as the last answer within that answer set (e.g., it could appear as answer 1, 2, 3, all of the above, OR 1, all of the above, 3, 4, etc.). For this reason, you should choose wording similar to “all answers are acceptable” and avoid answers that refer to other answers by their position within the answer set.
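To see why position-based wording is risky, here is a small, purely hypothetical Python sketch of the idea: the first answer entered is the one graded as correct, but the display order is shuffled, so that option can land anywhere in the list. The question and answers below are placeholders, and this is not H5P's actual data format.

```python
import random

# Placeholder question: the first answer entered is the one graded as correct,
# but the answers are shuffled before display (mimicking the behavior described above).
question = {
    "prompt": "Sample question text?",
    "answers": [
        "All answers are acceptable",  # entered first, so graded as correct
        "Answer B",
        "Answer C",
        "Answer D",
    ],
}

correct = question["answers"][0]
display_order = random.sample(question["answers"], k=len(question["answers"]))

for position, answer in enumerate(display_order, start=1):
    marker = "  <-- graded as correct" if answer == correct else ""
    print(f"{position}. {answer}{marker}")
```

Run it a few times and “All answers are acceptable” will show up in different positions, which is exactly why wording like “all of the above” can mislead students.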

This image shows the grade ranges you can choose. This section starts relatively blank. To create the grade ranges for this quiz, I clicked the blue “add range” button until there was one grade range per question, then clicked the white “distribute evenly” button. However, if you choose to, you can manually adjust the grade ranges. The text boxes next to each grade range hold the messages that will appear when a student receives a score in that range.
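For anyone curious about the arithmetic, here is a small Python sketch of one way evenly distributed grade ranges could be computed for a five-question quiz. It illustrates the idea behind the “distribute evenly” button; it is not H5P's actual code, and the exact boundary rounding H5P uses may differ.

```python
def even_ranges(num_ranges, total=100):
    """Split 0-total% into num_ranges roughly equal, non-overlapping ranges."""
    step = total / num_ranges
    ranges, lower = [], 0
    for i in range(num_ranges):
        upper = round(step * (i + 1))
        ranges.append((lower, upper))
        lower = upper + 1
    return ranges

# One range per question on a five-question quiz:
for low, high in even_ranges(5):
    print(f"{low}% - {high}%")
# Prints: 0% - 20%, 21% - 40%, 41% - 60%, 61% - 80%, 81% - 100%
```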

In these final steps you can further customize the behavior of the quiz and the messages and prompts that appear on it. For this tutorial, these settings have been left at their defaults; however, I encourage you to experiment with them and contact CON IT for any additional assistance you may need. As with other content on H5P, you can edit whether the download, embed, and copyright buttons will appear. Once you are happy with your quiz, click the pink save button. If you followed these directions, you should end up with a quiz identical to the one at the beginning of this blog post.

Once you have completed your quiz, it can be embedded into Carmen or into your u.osu blog for use as a study tool. For help configuring your quiz, or assistance in implementing an H5P quiz in your classes or study groups, please contact CON IT.

Fairness of Items (FIT) tool for multiple choice exams

Pearls of wisdom from the STTI 43rd Biennial Convention

My series of posts on the recent STTI Biennial continues with a summary of Nikole Hicks’ presentation titled,

“Are Your Multiple-Choice Tests ‘FIT’? Using the Fairness of Items Tool (FIT) as a Component of the Test Development Process”

In short, Nikole did an exhaustive review of the literature to learn about best practices in test item writing with a focus on nursing education.  She distilled the guidelines into 38 criteria to determine whether a single multiple choice test question is fair and unbiased.  She rigorously tested her FIT tool with nursing faculty and found it to be valid and reliable.  Read the full description of her study background, methodology, results and conclusions on the STTI conference web site.

Her list of 38 criteria can be used to evaluate a single multiple-choice test question, or they can be used to guide test question writing.  They are divided into four categories:

  • evaluate the stem
  • evaluate the options
  • linguistic/structural bias
  • cultural bias

The criteria include recommendations regarding how many distracters to include, words and phrases to avoid, and page formatting, among many other things.  She recommends that nurse educators use the FIT tool to write original questions and revise publisher test bank questions to improve student success and better prepare students for licensure exams.

I found Nikole’s presentation to be very interesting, and the tool has the potential to be very useful in ensuring fairness of multiple choice exams.  Many thanks to Nikole for doing the work of combing through the extensive body of literature to condense item-writing best practices into a practical set of guidelines we can really use.   I have permission from Nikole to share the tool with OSU College of Nursing faculty, and I plan to offer a Flash Friday session on the tool in the spring semester.

Students can review their Carmen quiz submissions!

Students!  Are you looking for the spot where you can view your Carmen quiz results?

Your instructor must first set up a Submission View in Carmen for you to view your quiz.  Depending on the settings your instructor chooses for this Submission View, you may be able to see the questions you got right and/or wrong, the answers you submitted, and the correct answers.

To view your quiz submission, go to your Carmen course and to the quiz you want to view.  Click on the small drop-down arrow to the right of the quiz title, and choose Submissions.