P864: Analysis of students’ self-efficacy, interest, and effort beliefs in general chemistry

Author: Brent Ferrell, University of Northern Colorado, USA

Co-Author: Jack Barbera, University of Northern Colorado, USA

Date: 8/6/14

Time: 2:45 PM – 3:05 PM

Room: MAK B1138

Related Symposium: S59

Research in academic motivation has highlighted a number of salient constructs that are predictive of positive learning strategies and academic success. Most of this research has centered on college-level social science courses or on secondary school student populations. The purpose of this study was to adapt existing measures of personal interest, effort beliefs, and self-efficacy for a college chemistry context and combine them into a single instrument. The instrument was administered at the beginning and end of the fall semester to 294 students enrolled in a first-semester general chemistry course. Confirmatory factor analysis (CFA) was conducted on each subscale at each time point to assess how well the proposed model fit the data. The parameter estimates and fit indices from the CFAs, together with qualitative data from student interviews, were used to guide modifications to the original subscale items. These modifications reduced the total number of items on the instrument from 24 to 19. The shortened subscales showed adequate to good model fit, with all fit indices within acceptable ranges. Furthermore, as evidence of concurrent validity, chemistry majors reported higher self-efficacy and interest than other majors. Cronbach's alpha estimates for the individual subscales ranged from 0.76 to 0.91. With continued use and further validation, this instrument could be a useful tool for assessing general chemistry students' motivation and the motivational impacts of various teaching practices.
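The scale-reliability statistic reported above is straightforward to reproduce. As a minimal sketch, the following Python code computes Cronbach's alpha for a Likert-type subscale; the simulated data and the function shown are illustrative assumptions, not the study's actual instrument or analysis.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # per-item sample variance
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed scale scores
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Illustrative data only: 294 simulated respondents on a 5-item Likert subscale,
# built from a shared signal plus item noise so the items intercorrelate.
rng = np.random.default_rng(0)
signal = rng.integers(1, 6, size=(294, 1))   # shared "trait" component, 1-5
noise = rng.integers(-1, 2, size=(294, 5))   # per-item perturbation, -1..1
responses = np.clip(signal + noise, 1, 5).astype(float)

print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```

The CFAs and fit indices mentioned in the abstract would typically be obtained from dedicated structural equation modeling software (e.g., the R package lavaan or the Python package semopy) rather than hand-rolled code; the sketch above covers only the reliability estimate.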

P318: Using evidence based on the response process to support the validity of inferences from educational measures

Author: Jack Barbera, University of Northern Colorado, USA

Co-Authors: Brent Ferrell and Paul Schwartz, University of Northern Colorado, USA; David Wren, Wake Forest University, USA

Date: 8/4/14

Time: 4:00 PM – 4:20 PM

Room: LOH 164

Related Symposium: S25

Instructors and education researchers often rely on assessment instruments when evaluating students. These instruments are typically multiple-choice content assessments (e.g., concept inventories) or Likert-type self-report surveys of affective dimensions (e.g., motivation). Results from these types of assessments are often used to make inferences about the state of students or the impact of teaching practices. However, how do we know what student responses on these measures really mean? Data collected from any assessment instrument should be supported with evidence based on the response process. Response process validity focuses on evidence that supports the intended meaning of student responses. This talk will focus on the importance of establishing the response process validity of assessment data, using examples from a variety of projects within our research group. Data will be presented from interviews conducted with students as they completed items from the Chemistry Concepts Inventory and as they evaluated items designed to assess aspects of student motivation. On a larger scale, Rasch analysis will be used to evaluate the response process validity of over 1000 student responses to the Thermochemistry Concept Inventory.
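The Rasch approach mentioned above models the probability of a correct response as a logistic function of person ability minus item difficulty. As a rough sketch of that model's structure (not the authors' actual procedure), the following Python code implements the dichotomous Rasch model with a crude joint gradient-based estimator on simulated data; all names and values are illustrative assumptions.

```python
import numpy as np

def rasch_prob(theta: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Dichotomous Rasch model: P(correct) = logistic(theta_p - b_i)."""
    return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

def fit_rasch(responses: np.ndarray, n_iter: int = 500, lr: float = 0.5):
    """Crude joint estimation of abilities (theta) and difficulties (b)
    by gradient ascent on the Rasch log-likelihood; illustrative only."""
    n_persons, n_items = responses.shape
    theta = np.zeros(n_persons)
    b = np.zeros(n_items)
    for _ in range(n_iter):
        resid = responses - rasch_prob(theta, b)  # observed minus expected
        theta += lr * resid.mean(axis=1)          # ability gradient (scaled)
        b -= lr * resid.mean(axis=0)              # difficulty gradient (scaled)
        theta -= theta.mean()                     # anchor scale: mean ability = 0
    return theta, b

# Illustrative data only: 1000 simulated students on a 10-item inventory.
rng = np.random.default_rng(1)
true_theta = rng.normal(size=1000)
true_b = np.linspace(-2.0, 2.0, 10)
data = (rng.random((1000, 10)) < rasch_prob(true_theta, true_b)).astype(float)

theta_hat, b_hat = fit_rasch(data)
print("estimated item difficulties:", np.round(b_hat, 2))
```

In practice, response process validity work at this scale would use established Rasch software (e.g., Winsteps or the R package eRm), which also provides the fit diagnostics such an analysis relies on; the sketch above shows only the underlying model.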