Examining the Effectiveness of Different Assessments and Forecasts for Accurate Judgments of Learning in Engineering Classes
Document Type
Conference Proceeding
Publication Date
2022
Department
Department of Cognitive and Learning Sciences
Abstract
Research and anecdotal evidence suggest that students are generally poor at predicting, or forecasting, how they will perform on high-stakes final exams. We hypothesize that better judgments of learning may allow students to choose more efficient study habits and practices that improve their learning and test scores. To inform interventions that might support better forecasts, we examined student data from several university engineering courses. We examined how well three types of assessment predict final exam performance: performance on ungraded exercises, student forecasts made prior to exams, and performance on graded material prior to exams. Results showed that ungraded exercises provided poor forecasts of exam performance, while student predictions and prior graded work provided only marginal forecasts. The best predictor was performance on previous high-stakes exams (i.e., midterms), whether or not those exams covered the same material as the later exam. These results suggest that interventions based on prior exam results may help students generate more accurate forecasts of their final exam scores.
Publication Title
International Conference on Computer Supported Education, CSEDU - Proceedings
ISBN
9789897585623
Recommended Citation
Cischke, C., & Mueller, S. T. (2022). Examining the Effectiveness of Different Assessments and Forecasts for Accurate Judgments of Learning in Engineering Classes. International Conference on Computer Supported Education, CSEDU - Proceedings, 1, 497-503.
https://doi.org/10.5220/0011124500003182
Retrieved from: https://digitalcommons.mtu.edu/michigantech-p/16516
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Publisher's Statement
© 2022. Publisher’s version of record: https://doi.org/10.5220/0011124500003182