P938: Understanding student responses on surveys

Author: Maria Schroeder, U.S. Naval Academy, USA

Co-Authors: Shirley Lin, Debra Dillner, and Judith Ann Hartman, U.S. Naval Academy, USA; Diane M. Bunce, Catholic University, USA

Date: 8/6/14

Time: 5:15 PM – 6:30 PM

Room: LIB

Related Symposium: S33

Written surveys are often a convenient and efficient way of gathering data on student perceptions or choices. Although surveys seem straightforward and objective, interpretation of what is meant by survey questions can differ between the survey writer and the survey taker. Even when interpretation is consistent between these two parties, much richer information is often supplied by the survey taker in a one-on-one interview. During the Fall 2012 semester, a subset of freshmen enrolled in the first semester of general chemistry at the U.S. Naval Academy were interviewed regarding their answers on a written survey asking about the resources they used to study for both instructor-written and multiple-choice common exams. Student responses were audio-recorded and analyzed using NVivo software to search for insights into what resources students chose and why they chose them. These qualitative data were analyzed according to whether the chosen resource led to deeper understanding of the topic or provided a quick answer to a question. Differences in study methods chosen for instructor-written exams vs. multiple-choice common exams were also investigated. This poster will provide an analysis of the study methods chosen, and the reasons for those choices, by a subset of students enrolled in a large general chemistry course where success is expected and study time is limited.