Document Type
Article
Publication Date
2-12-2025
Department
Department of Mechanical and Aerospace Engineering
Abstract
Perception systems for autonomous vehicles (AVs) require various types of sensors, including light detection and ranging (LiDAR) and cameras, to ensure their robustness across driving scenarios and weather conditions. The data from these sensors are fused to generate maps of the surrounding environment and to provide information for the detection and tracking of objects. Given the wide range of sensor models and design choices, evaluation methods are necessary to compare existing and future sensor systems through quantifiable measurements. This paper presents an evaluation method to compare colored point clouds, a common fused data type, across two LiDAR-camera fusion systems and a stereo camera setup. The evaluation approach uses a test artifact measured by each system's colored point cloud and quantifies the spread, area coverage, and color difference of the colored points within the computed space. The test results showed that the evaluation approach was able to rank the sensor fusion systems based on its metrics and complemented the experimental observations. The proposed evaluation methodology is therefore suitable for comparing the colored point clouds generated by sensor fusion systems.
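The abstract names three metrics (spread, area coverage, and color difference) but not their formulas. The sketch below is an illustrative Python example, not the authors' implementation: the specific definitions (mean distance to centroid for spread, occupied-grid-cell ratio for area coverage, mean Euclidean RGB distance to a reference color for color difference) and all function names are assumptions chosen to show how such metrics could be computed on a colored point cloud.

```python
# Illustrative sketch only: the metric formulas below are assumptions,
# not the definitions used in the paper.
import numpy as np

def spread(points):
    """Assumed spread metric: mean distance of points from their centroid."""
    centroid = points.mean(axis=0)
    return float(np.linalg.norm(points - centroid, axis=1).mean())

def area_coverage(points, bounds_min, bounds_max, cell_size=0.01):
    """Assumed coverage metric: fraction of grid cells on the artifact face
    (x-y plane) containing at least one point."""
    grid_shape = np.ceil((bounds_max - bounds_min) / cell_size).astype(int)
    idx = np.floor((points[:, :2] - bounds_min) / cell_size).astype(int)
    idx = np.clip(idx, 0, grid_shape - 1)
    occupied = np.zeros(grid_shape, dtype=bool)
    occupied[idx[:, 0], idx[:, 1]] = True
    return float(occupied.mean())

def color_difference(colors, reference_rgb):
    """Assumed color metric: mean Euclidean RGB distance to the artifact's
    reference color (values in [0, 255])."""
    return float(np.linalg.norm(colors - reference_rgb, axis=1).mean())

# Example with a synthetic colored point cloud (N x 3 positions, N x 3 RGB colors).
rng = np.random.default_rng(0)
pts = rng.normal(scale=0.02, size=(5000, 3))
cols = np.clip(rng.normal(loc=[200, 30, 30], scale=5, size=(5000, 3)), 0, 255)
print(spread(pts),
      area_coverage(pts, np.array([-0.1, -0.1]), np.array([0.1, 0.1])),
      color_difference(cols, np.array([200, 30, 30])))
```

Under these assumed definitions, lower spread and color difference and higher area coverage would indicate a denser, more color-accurate reconstruction of the test artifact; the paper should be consulted for the actual metric formulations.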
Publication Title
Sensors (Basel, Switzerland)
Recommended Citation
Schaefer, C., Kootbally, Z., & Nguyen, V. (2025). Quality Evaluation for Colored Point Clouds Produced by Autonomous Vehicle Sensor Fusion Systems. Sensors (Basel, Switzerland), 25(4). http://doi.org/10.3390/s25041111
Retrieved from: https://digitalcommons.mtu.edu/michigantech-p2/1470
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.
Version
Publisher's PDF
Publisher's Statement
Copyright: © 2025 by the authors. Licensee MDPI, Basel, Switzerland. Publisher’s version of record: https://doi.org/10.3390/s25041111