Extrinsic Radar Calibration with Overlapping FoV and Hitch Ball Position Estimation

In many perception algorithms, sensor fusion requires detections from multiple sensors to be transformed into a conveniently chosen common coordinate system for joint processing, which in turn requires the position and orientation of each sensor to be known. This work considers two automotive radars installed in the taillight fixtures of a truck, with their orientation defined with respect to the straight line connecting them. We extrinsically calibrate the radar geometry by estimating the rotation (mount-angle) and translation parameters needed to transform the detections from the radars into a coordinate system whose origin is at the truck's hitch ball. This coordinate system is a convenient choice for algorithms that use radars installed in similar locations to track or sense the rotation of an attached trailer about the hitch ball. The calibration is performed by rotating a trailer or a platform, on which corner reflectors (CRs) are placed, about the hitch ball toward each of the two radars. The algorithm is based on two principles: 1) the use of common detections found in the overlapping field of view (FoV) of the radars to estimate the rotation parameters and 2) a search for the center of trailer/platform rotation to estimate the translation parameters, which define the hitch ball position relative to the radars. It is shown that using more radar detections tends to increase the calibration accuracy. The data collected in an experiment yield estimation errors of about 0.20° and 3 cm for the rotation and translation parameters, respectively.
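The two principles named in the abstract can be illustrated with a minimal sketch. This is not the paper's exact algorithm: it assumes a generic least-squares (Kabsch/Procrustes) fit of common detections to recover the relative rotation between the radars, and a Kasa circle fit to locate the center of a CR's rotation arc, i.e., the hitch ball position. All function names and estimator choices here are illustrative assumptions.

```python
import numpy as np


def estimate_rotation(p_a, p_b):
    """Estimate the relative yaw between two radars from common targets
    detected in their overlapping FoV (2-D Procrustes/Kabsch fit).

    p_a, p_b: (N, 2) arrays of the same N targets, expressed in each
    radar's own coordinate frame. Returns the yaw angle in radians.
    """
    a = p_a - p_a.mean(axis=0)
    b = p_b - p_b.mean(axis=0)
    # Least-squares rotation aligning a onto b via the cross-covariance.
    h = a.T @ b
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))  # guard against reflections
    r = vt.T @ np.diag([1.0, d]) @ u.T
    return np.arctan2(r[1, 0], r[0, 0])


def fit_rotation_center(points):
    """Kasa circle fit: find the center of the circular arc traced by one
    corner reflector as the trailer/platform rotates about the hitch ball.

    points: (N, 2) detections of the CR across rotation steps.
    Returns the (2,) estimated center of rotation.
    """
    x, y = points[:, 0], points[:, 1]
    # Linearized circle equation: 2*cx*x + 2*cy*y + c = x^2 + y^2
    A = np.column_stack([2.0 * x, 2.0 * y, np.ones_like(x)])
    b = x**2 + y**2
    cx, cy, _ = np.linalg.lstsq(A, b, rcond=None)[0]
    return np.array([cx, cy])
```

In this sketch, the rotation step needs only the common detections in the overlapping FoV, while the circle fit needs detections of the same CR across several rotation positions; the translation parameters then follow from the fitted center expressed in each radar's frame.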

Publication Title

IEEE Transactions on Instrumentation and Measurement