Date of Award
2016
Document Type
Open Access Master's Thesis
Degree Name
Master of Science in Applied Cognitive Science and Human Factors (MS)
Administrative Home Department
Department of Cognitive and Learning Sciences
Advisor 1
Myounghoon Jeon
Committee Member 1
Shane T. Mueller
Committee Member 2
Andreas Riener
Abstract
Drivers today interact with multiple systems while driving. Multimodal in-vehicle technologies (e.g., personal navigation devices) are intended to facilitate multitasking while driving. Multimodality can reduce the cognitive effort of information processing, but not always. The present study investigates how and when auditory cues can improve driver responses to a visual target. We manipulated three dimensions (spatial, semantic, and temporal) of verbal and nonverbal cues that interacted with visual spatial instructions. Multimodal displays were compared with unimodal (visual-only) displays to determine whether they facilitate or degrade a vehicle control task. Twenty-six drivers participated in an Auditory-Spatial Stroop experiment using the lane change test (LCT). Preceding auditory cues improved response time over the visual-only condition. When cues conflicted, spatial congruency had a stronger impact than semantic congruency. The effect on accuracy was minimal, but there was a trend toward a speed-accuracy trade-off. Results are discussed with respect to theoretical issues and future work.
Recommended Citation
Sun, Yuanjing, "MULTISENSORY CUE CONGRUENCY IN LANE CHANGE TEST", Open Access Master's Thesis, Michigan Technological University, 2016.