LiDAR and RGB camera based feature extraction and evaluation for adverse weather driving


Department of Applied Computing


Modern automobiles are commonly equipped with a variety of perception modalities, i.e., sensors, to implement various levels of Advanced Driver Assistance Systems (ADAS). The main sensing modalities include camera, light detection and ranging (LiDAR), radar, as well as inertial measurement and global positioning devices. In addition, ultrasonic and acoustic sensors complete the possible perception modalities. ADAS features and autonomous driving systems were initially developed assuming ideal perception and weather conditions; however, in reality, the quality of a sensor modality may degrade significantly due to adherent dirt, dust, fog, wet weather, or changing illumination conditions on the road. It is therefore important to quantitatively study the degradation of sensor modalities in non-ideal conditions such as fog, rain, snow, bright sunlight, and dust, and to evaluate its effects on higher-level functionalities such as lane keeping; object detection, recognition, and tracking; and ego-motion estimation and localization. This paper quantitatively examines the variability of several LiDAR- and RGB-camera-based features across weather conditions, including clear sky, overcast, fog, rain, and snow. Specifically, we derive LiDAR point-cloud-based features such as the mean and standard deviation of return distances and the point cloud size, and we examine image color space features as well as blurriness and structural detail. Furthermore, we perform feature selection to explore the importance of various combinations of features for the prediction of road weather conditions.
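The per-scan and per-image features named above can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: the function names, the use of Euclidean range from the sensor origin, and the Laplacian-variance proxy for blurriness are all assumptions for the sake of a minimal example.

```python
import numpy as np

def lidar_features(points):
    """Hypothetical sketch: distance statistics and size of an (N, 3) point cloud."""
    d = np.linalg.norm(points, axis=1)  # Euclidean range of each LiDAR return
    return {"mean_dist": d.mean(), "std_dist": d.std(), "cloud_size": len(points)}

def image_features(rgb):
    """Hypothetical sketch: color-space means and a blurriness proxy for an RGB image."""
    gray = rgb.mean(axis=2)
    # Variance of a 4-neighbor Laplacian as a simple blurriness measure:
    # low variance suggests few sharp edges, i.e., a blurrier image.
    lap = (np.roll(gray, 1, 0) + np.roll(gray, -1, 0)
           + np.roll(gray, 1, 1) + np.roll(gray, -1, 1) - 4.0 * gray)
    return {"mean_r": rgb[..., 0].mean(),
            "mean_g": rgb[..., 1].mean(),
            "mean_b": rgb[..., 2].mean(),
            "blur_laplacian_var": lap.var()}
```

Feature vectors of this kind, computed per frame, could then be fed to a standard feature-selection routine to rank their usefulness for weather classification.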

Publication Title

Proceedings of SPIE, Autonomous Systems: Sensors, Processing, and Security for Ground, Air, Sea, and Space Vehicles and Infrastructure 2023