The Ideal versus the Real Deal in Assessment of Physics Lab Report Writing
DOI: https://doi.org/10.14738/aivp.112.14406
Keywords: Science writing assessment, Physics lab reports, Analytic rubrics, Writing assessment reliability.
Abstract
Effective writing is important for communicating science ideas, and for writing-to-learn in science. This paper investigates lab reports from a large-enrollment college physics course that integrates scientific reasoning and science writing. While analytic rubrics have been shown to define expectations more clearly for students and to improve the reliability of assessment, there has been little investigation of how well analytic rubrics serve students and instructors in large-enrollment science classes. Unsurprisingly, we found that grades assigned by teaching assistants (TAs) do not correlate with reliable post-hoc assessments from trained raters. More importantly, we identified lost learning opportunities for students, and misinformation for instructors about students’ progress. We believe our methodology for achieving post-hoc reliability is straightforward enough to be used in classrooms. A key element is the development of finer-grained rubrics for grading that are aligned with the rubrics provided to students to define expectations, but which reduce the subjectivity of judgements and grading time. We conclude that the use of dual rubrics, one to elicit independent reasoning from students and one to clarify grading criteria, could improve the reliability and accountability of lab report assessment, which could in turn elevate the role of lab reports in the instruction of scientific inquiry.
Copyright (c) 2023 Rebecca J. Passonneau, Kathleen Koenig, Zhaohui Li, Josephine Soddano
This work is licensed under a Creative Commons Attribution 4.0 International License.