Short answer items (also called supplied-response or constructed-response items) are an effective measure of a student's ability to accurately recall specific target information. Short answer items require students either to complete a statement (fill-in-the-blank or completion items) or to answer a direct question with a single word or brief phrase. The nature of supplied-response items lends itself well to assessing lower-level learning objectives such as knowledge or comprehension of terms and definitions. Unlike traditional objective measures (true-false, matching, multiple-choice, etc.), which assess recognition of correct information, short answer items require students to generate a response independently. While this type of recall assessment is more cognitively demanding, the open-ended nature of the responses makes scoring considerably more subjective. Because of this subjectivity and the added difficulty of judging the accuracy of responses, instructors should carefully weigh the utility of short answer items in relation to their instructional objectives.
Questions must be clearly worded so that students understand what information is being requested. To facilitate understanding, phrase each item so that the required answer is brief and specific, and provide clear cues about the form of the expected response.
To ensure that a short answer question is an effective measure of student knowledge, require students to fill in important terms or phrases. For example, when assessing understanding of definitions, have students supply the term.
When utilizing short answer questions that require a numerical response, specify the degree of precision that is expected and the relevant units of measurement.
To prevent confusion and make scoring more precise, phrase each question so that only one answer, or a limited range of answers, is possible. If multiple answers will correctly complete the item, ensure that there is a pre-established scoring rubric to deal with variations in response.
Leave information to be filled in at or near the end of the question. This type of arrangement allows for ease of reading and enhances the efficiency of scoring.
Utilize clear, explicit instructions that specify the format of the target answer (one word, multiple words, etc.) as well as the amount of acceptable variation (spelling, synonyms, etc.).
To prevent confusion and ensure requested information is clear, limit the number of blanks within each short answer question. In addition, ensure that blanks are the same physical length to prevent context clues to the correct answer.
Limit the influence of extraneous clues to the correct answer by utilizing correct, neutral grammar. Avoid providing grammatical clues to the correct answer (plurals, "a" versus "an," specific modifiers, etc.) and make certain that all correct responses can fit grammatically in the blank.
To reduce the emphasis on rote memorization of trivial information, do not use direct quotes from the text or lecture. Rather, phrase short answer items using unique or novel wording.
To ensure an accurate measure of target information, use direct questions rather than fill-in-the-blank or incomplete statements. This type of wording reduces confusion or ambiguity concerning the requested information and directs students toward the relevant material.
One of the benefits of short answer items is that they often encourage more intensive study of information due to the increased cognitive demands of recall over recognition. To promote this type of invested studying, award more credit for short answer items than for lower level recognition items (true-false, matching, etc.).
While short answer items often target knowledge- or comprehension-level understanding, effectively developed completion items can also be used to assess the application, synthesis, analysis, and evaluation levels. One means of measuring this type of higher-order understanding is to combine several short answer statements within a single paragraph. When implementing the paragraph format, be sure that the desired knowledge is clearly specified.
Questions concerning the Park University CETL Quick Tips website should be directed to firstname.lastname@example.org.