Multiple-choice tests and quizzes are an effective tool for: (a) assessing a student's mastery of facts and concepts; (b) helping students learn and retain facts and concepts. While some educators might see this as a trick question, the correct answer appears to be: (c) all of the above, suggests new research from Washington University in St. Louis.
"Although people often think about multiple-choice tests as tools for assessment, they can also be used to facilitate learning," said Andrew Butler, a cognitive psychologist in Arts & Sciences who studies the brain processes behind learning and recall. "The act of retrieving information strengthens memory for that information, leading to better long-term retention, and changes the representation of the information, creating deeper understanding."
Butler's study, published in the September issue of Journal of Applied Research in Memory and Cognition, offers straightforward tips for constructing multiple-choice questions that are effective at both assessing current knowledge and strengthening ongoing learning.
Among the key findings: educators should never include trick questions or offer "all of the above" or "none of the above" among the answer options.
Research on the format of multiple-choice questions is important, Butler noted, because the tests are widely used throughout the world, especially in the United States where they originated as part of early efforts to measure intelligence.
Fueled in the beginning by the need for an efficient way to measure characteristics of World War I soldiers and booming student enrollments, multiple-choice tests now influence important life decisions in areas such as college placement, workplace hiring, career advancement and even online dating.
As an associate professor in the Departments of Education and of Psychological & Brain Sciences, both in Arts & Sciences, Butler conducts research that explores the malleability of memory -- the cognitive processes and mechanisms that cause memories to change or remain stable over time.
Taking any form of test has the potential to alter our understanding of a topic, he said, because the process of recalling information requires important details to be freshly reconstructed from related memories.
While multiple-choice testing, especially repeated testing, has the potential to strengthen our recall, a poorly formatted test question can have the opposite effect, Butler said. Such an ill-formed question can muddy our recollection of the correct answer and reinforce memories for inaccurate "distractor" answers, he added.
Butler's research review confirms that proper question formatting and presentation are critical to creating effective multiple-choice tests. It also suggests that many widely used multiple-choice tests still include numerous questions that fail to comply with research-based best practices.
"Fortunately, the best practices for creating multiple-choice tests that effectively assess understanding are much the same as those for supporting student learning," Butler said.
Butler's study explains the cognitive science behind five research-based recommendations for crafting more effective multiple-choice questions:
Finally, because multiple-choice questions expose students to plausibly presented false information, it is important for students to review answers after grading is completed. Feedback enables test-takers to correct errors and avoid internalizing incorrect information. It also strengthens learning around correct answers that were low-confidence guesses at test time.
"One takeaway from these recommendations is that the most effective multiple-choice items get students to think in ways that are productive for learning and enable valid measurement of whether they have acquired the desired skills and knowledge," Butler said. "To maximize both effectiveness and efficiency, it is also best to keep the process of answering multiple-choice items simple -- added complexity often has a negative effect on both learning and assessment."