Abstract
Perception is one of the key abilities of autonomous mobile robotic systems and often relies on the fusion of heterogeneous sensors. Although this heterogeneity presents a challenge for sensor calibration, it is also the main source of reliability and robustness in autonomous systems. In this paper, we propose a method for multisensor calibration based on moving-object trajectories estimated with Gaussian processes (GPs), which yields both temporal and extrinsic parameters. The appealing properties of the proposed temporal calibration method are: coordinate-frame invariance, thus avoiding the need for prior extrinsic calibration; theoretically grounded batch state estimation and interpolation using GPs; computational efficiency with O(n) complexity; reliance on data already available on autonomous robot platforms; and an end result that enables 3D point-to-point extrinsic multisensor calibration. The proposed method is validated in both simulations and real-world experiments. For the real-world experiments, we evaluated the method on two multisensor systems: an externally triggered stereo camera, for which temporal ground truth is readily available, and a heterogeneous combination of a camera and a motion capture system. The results show that the estimated time delays are accurate to within a fraction of the fastest sensor's sampling period.
URL
https://arxiv.org/abs/1904.04187