TY - GEN
T1 - Autocalibration of lidar and optical cameras via edge alignment
AU - Castorena, Juan
AU - Kamilov, Ulugbek S.
AU - Boufounos, Petros T.
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/5/18
Y1 - 2016/5/18
N2 - We present a new method for joint automatic extrinsic calibration and sensor fusion for a multimodal sensor system comprising a LIDAR and an optical camera. Our approach exploits the natural alignment of depth and intensity edges when the calibration parameters are correct. Thus, in contrast to a number of existing approaches, we do not require the presence or identification of known alignment targets. On the other hand, the characteristics of each sensor modality, such as sampling pattern and information measured, are significantly different, making direct edge alignment difficult. To overcome this difficulty, we jointly fuse the data and estimate the calibration parameters. In particular, the joint processing evaluates and optimizes both the quality of edge alignment and the performance of the fusion algorithm using a common cost function on the output. We demonstrate accurate calibration in practical configurations in which depth measurements are sparse and contain no reflectivity information. Experiments on synthetic and real data obtained with a three-dimensional LIDAR sensor demonstrate the effectiveness of our approach.
KW - depth superresolution
KW - intersensor registration
KW - Multimodal calibration
KW - sensor fusion
KW - total variation
UR - https://www.scopus.com/pages/publications/84973333408
U2 - 10.1109/ICASSP.2016.7472200
DO - 10.1109/ICASSP.2016.7472200
M3 - Conference contribution
AN - SCOPUS:84973333408
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 2862
EP - 2866
BT - 2016 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 41st IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016
Y2 - 20 March 2016 through 25 March 2016
ER -