Files

Download Presentation Slides (565 KB)

Publication Date

2-21-2017

Description

Electric vehicles and automated vehicles are becoming more pervasive in our everyday lives. Ideally, fully automated vehicles that drivers can completely trust would be the best solution. However, due to technical limitations and human factors issues, fully automated vehicles are still under test, and no concrete evidence has yet shown that their capabilities are superior to human cognition and operation. In the Mind Music Machine Lab, we are actively conducting research on connected and automated vehicles, mainly using driving simulators. This talk focuses specifically on multimodal interactions between a driver and a vehicle, as well as between the driver and nearby drivers. In this autonomous driving context, we facilitate collaborative driving by estimating the driver's cognitive and affective states using multiple sensors (e.g., computer vision, physiological devices) and by communicating via auditory and gestural channels. Future work includes refining our designs for diverse populations, including drivers with difficulties/disabilities, passengers, pedestrians, etc.

Disciplines

Psychology | Teacher Education and Professional Development

Comments

Special Mobility TechTalks are sponsored by the Advanced Power Systems Research Center (APSRC), the Keweenaw Research Center (KRC), and the Innovation and Industry Engagement Office.

Multimodal interaction in connected automated vehicles
