Abstract
The increasing and diversifying student enrollments in introductory physics courses make reliable, valid, and usable instruments for measuring student skills and gains ever more important. In introductory physics, beyond teaching facts about mechanics, we also seek to teach our students to “think like a physicist,” that is, to develop expertise in and intuition for physical problem solving. How and when these expert, intuitive problem-solving skills emerge during a STEM education, and which teaching methods foster them most effectively, remain uncertain. An easily administered survey that measures students’ “physics-thinking” skills in a pretest and post-test format is therefore desirable for evaluating different pedagogical approaches. Prior investigators codified these skills as “epistemic games” (e.g., order-of-magnitude estimation, evaluating extreme cases) and developed and validated the math epistemic games survey (MEGS) to measure students’ ability to employ these techniques. The original survey instrument is reliable and valid but is hampered by its length and by students’ ability to recall questions between administrations. We employed factor analysis to split the MEGS into two mutually exclusive subtests and found each to be as reliable and valid as the full-length MEGS as originally formulated. The “split MEGS” is well suited for use as a pretest and post-test instrument to measure gains in problem-solving expertise in introductory physics courses.
- Received 17 June 2022
- Accepted 5 September 2023
DOI: https://doi.org/10.1103/PhysRevPhysEducRes.19.020152
Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI.