Paper Title

A Cooperative Perception System Robust to Localization Errors

Authors

Zhiying Song, Fuxi Wen, Hailiang Zhang, Jun Li

Abstract

Cooperative perception is challenging for safety-critical autonomous driving applications. Errors in the shared position and pose cause an inaccurate relative transform estimation and disrupt the robust mapping of the ego vehicle. We propose a distributed object-level cooperative perception system called OptiMatch, in which the detected 3D bounding boxes and local state information are shared between the connected vehicles. To correct the noisy relative transform, the local measurements (bounding boxes) of both connected vehicles are utilized, and an optimal transport theory-based algorithm is developed to filter out those objects jointly detected by the vehicles along with their correspondence, constructing an associated co-visible set. A correction transform is estimated from the matched object pairs and further applied to the noisy relative transform, followed by global fusion and dynamic mapping. Experimental results show that robust performance is achieved for different levels of location and heading errors, and the proposed framework outperforms the state-of-the-art benchmark fusion schemes, including early, late, and intermediate fusion, on average precision by a large margin when location and/or heading errors occur.
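The core idea of the abstract — associate boxes that both vehicles detect via optimal transport, then fit a correction transform from the matched pairs — can be sketched as follows. This is a minimal illustration, assuming Sinkhorn-style entropic OT for the association step and a Kabsch least-squares fit for the rigid correction on 2D box centers; the function names, parameters, and toy data are illustrative assumptions, not the paper's actual OptiMatch implementation.

```python
import numpy as np

def sinkhorn(cost, reg=1.0, n_iters=200):
    """Entropy-regularized OT plan between two uniform marginals (Sinkhorn iterations)."""
    K = np.exp(-cost / reg)
    a = np.full(cost.shape[0], 1.0 / cost.shape[0])
    b = np.full(cost.shape[1], 1.0 / cost.shape[1])
    v = np.ones(cost.shape[1])
    for _ in range(n_iters):
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]

def kabsch_2d(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Toy scene: 2D centers of boxes detected by the ego vehicle.
ego = np.array([[0., 0.], [20., 0.], [0., 20.], [-20., 0.], [0., -20.], [15., 15.]])

# The same objects seen through a noisy relative transform (5 deg heading
# error plus a location offset), standing in for the connected vehicle's boxes.
theta = np.deg2rad(5.0)
Rn = np.array([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]])
remote = ego @ Rn.T + np.array([1.5, -0.8])

# 1) Build a pairwise distance cost and solve the OT matching.
cost = np.linalg.norm(ego[:, None, :] - remote[None, :, :], axis=-1)
P = sinkhorn(cost)
matches = [(i, int(P[i].argmax())) for i in range(len(ego))]  # co-visible pairs

# 2) Estimate the correction transform from the matched pairs and apply it.
src = remote[[j for _, j in matches]]
dst = ego[[i for i, _ in matches]]
R, t = kabsch_2d(src, dst)
corrected = src @ R.T + t   # remote boxes realigned into the ego frame
```

In a real pipeline the co-visible set would be thresholded from the transport plan (not every box is seen by both vehicles), and the corrected transform would then drive the global fusion and dynamic mapping stages.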
