P638: Using eye-tracking studies to evaluate student responses to multiple-choice items
When multiple-choice items are used to assess students' chemistry content knowledge, students may select a particular response option for a variety of reasons. While the intention of an assessment item is that students' responses serve as a valid measure of their content knowledge, several test-taking issues can reduce the validity of this inference. Some validity issues may arise from the format of the item alone. Prior studies have investigated option order effects and how the order of the response options can alter item difficulty. Other studies have probed how the use of particulate-level images, data tables, and graphs or figures can influence student responses. These prior studies have focused only on how student performance changes as an item's structure is altered. This study will use a combination of eye tracking and student interviews to obtain further evidence of the effects of various item features. A variety of research questions and sample items will be presented and discussed. The goal of the studies is to obtain novel data about how students complete multiple-choice items and how various item characteristics might affect the inferences derived from item results.