Beyond first-year composition, the typical undergraduate mechanical engineering curriculum offers few opportunities to develop writing skills unless faculty make a concerted effort to incorporate writing into their courses. One underutilized path for BSME students to strengthen those skills is the required sequence of laboratory courses, in which students write several lab reports evaluated by graduate teaching assistants (GTAs), many of whom speak English as a second language. Historically, engineering GTAs have not been trained in formative assessment: evaluating student writing in a way that helps students improve their technical communication skills. Formative assessment can be a key part of the learning process: a student produces a product, an evaluator provides feedback on it, and the student learns from that feedback, “forming” new knowledge (Yorke, 2003, pp. 478-479). Such assessment can be informal (feedback on drafts, immediate responses to student questions or presentations in class) or formal (graded work such as lab reports that GTAs evaluate and return with feedback the students are expected to incorporate into future assignments).
This paper details a comprehensive research study of a GTA training program implemented in a large mechanical engineering department at a small Midwestern public “high research activity” university. Situated within the field of Writing Across the Curriculum/Writing in the Disciplines, the training program was developed to meet the unique needs of the department’s GTAs and to address perceived deficiencies in undergraduate technical writing by teaching best practices in writing evaluation. Two distinct methods were used to assess the efficacy of the program: 1) qualitative methods, including interviews and an open-ended survey, to gauge GTA needs and performance and to assess the value of instructional tools developed to assist GTAs and students, and 2) a summative assessment of undergraduate student writing completed before and after the training program was implemented.
In follow-up interviews after completing the program, most GTAs said the training helped them provide higher-quality feedback and improved their own writing because it made them more aware of issues such as audience and logical flow. The survey showed that undergraduate students found that the set of lab report guidelines developed as part of the program, which applied to all three courses, and the corresponding detailed rubric helped them better understand report requirements and expectations. The survey also showed some remaining inconsistency in grading from GTA to GTA, particularly in one course, but that many GTAs were providing detailed feedback that helped students learn. The summative assessment showed improvement in four of the five categories in the university’s written communication learning goal rubric: Organization and Conventions, Content Development, Sources and Evidence, and Control of Syntax and Mechanics. No improvement was shown in Context and Purpose for Writing. Feedback from GTAs and students played an important role in a curriculum redesign that occurred in parallel with the implementation of the training program.
This research is relevant to engineering programs seeking to improve the communication skills of their undergraduate students. Using limited staff and faculty resources, the training program extended the knowledge and skills of the department’s GTAs and reached all of its undergraduate students through existing required courses.
2016 ASEE Annual Conference & Exposition
Extending WID to train mechanical engineering GTAs to evaluate student writing.
Retrieved from: http://digitalcommons.mtu.edu/mechanical-fp/61