Paper Title

MSL-RAPTOR: A 6DoF Relative Pose Tracker for Onboard Robotic Perception

Paper Authors

Benjamin Ramtoula, Adam Caccavale, Giovanni Beltrame, Mac Schwager

Paper Abstract

Determining the relative position and orientation of objects in an environment is a fundamental building block for a wide range of robotics applications. To accomplish this task efficiently in practical settings, a method must be fast, use common sensors, and generalize easily to new objects and environments. We present MSL-RAPTOR, a two-stage algorithm for tracking a rigid body with a monocular camera. The image is first processed by an efficient neural network-based front-end to detect new objects and track 2D bounding boxes between frames. The class label and bounding box is passed to the back-end that updates the object's pose using an unscented Kalman filter (UKF). The measurement posterior is fed back to the 2D tracker to improve robustness. The object's class is identified so a class-specific UKF can be used if custom dynamics and constraints are known. Adapting to track the pose of new classes only requires providing a trained 2D object detector or labeled 2D bounding box data, as well as the approximate size of the objects. The performance of MSL-RAPTOR is first verified on the NOCS-REAL275 dataset, achieving results comparable to RGB-D approaches despite not using depth measurements. When tracking a flying drone from onboard another drone, it outperforms the fastest comparable method in speed by a factor of 3, while giving lower translation and rotation median errors by 66% and 23% respectively.
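The abstract describes the back-end as an unscented Kalman filter that updates a 6DoF pose from the 2D bounding box reported by the front-end tracker, given only the approximate size of the object. As a rough illustration of that idea, the minimal sketch below (all names, e.g. bbox_measurement_model, are hypothetical, and it assumes a pinhole camera and a simple axis-aligned box; the paper's actual measurement model may differ) shows how a candidate pose and a known approximate object size map to a predicted 2D bounding box, the quantity a UKF would compare against the tracked box.

```python
import numpy as np

def project_points(points_cam, fx, fy, cx, cy):
    """Pinhole projection of 3D points (camera frame, z > 0) to pixel coordinates."""
    u = fx * points_cam[:, 0] / points_cam[:, 2] + cx
    v = fy * points_cam[:, 1] / points_cam[:, 2] + cy
    return np.stack([u, v], axis=1)

def bbox_measurement_model(t_cam_obj, R_cam_obj, half_extents, fx, fy, cx, cy):
    """Predict the 2D bounding box induced by a hypothesized 6DoF pose.

    t_cam_obj:    (3,) object position in the camera frame
    R_cam_obj:    (3, 3) object orientation in the camera frame
    half_extents: (3,) approximate half-dimensions of the object's 3D box
    Returns [u_min, v_min, u_max, v_max] in pixels.
    """
    # Eight corners of the object's 3D bounding box, expressed in the object frame.
    signs = np.array([[sx, sy, sz]
                      for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)])
    corners_obj = signs * half_extents
    # Transform the corners into the camera frame and project them.
    corners_cam = corners_obj @ R_cam_obj.T + t_cam_obj
    px = project_points(corners_cam, fx, fy, cx, cy)
    # The axis-aligned box enclosing the projected corners is the predicted measurement.
    return np.array([px[:, 0].min(), px[:, 1].min(),
                     px[:, 0].max(), px[:, 1].max()])

# Example: a roughly 0.6 m x 0.6 m x 0.2 m drone hypothesized 4 m in front of the camera.
predicted_box = bbox_measurement_model(
    t_cam_obj=np.array([0.0, 0.0, 4.0]),
    R_cam_obj=np.eye(3),
    half_extents=np.array([0.3, 0.3, 0.1]),
    fx=600.0, fy=600.0, cx=320.0, cy=240.0,
)
```

In a UKF update, a function like this would play the role of the measurement model h(x): each sigma point (a candidate pose) is pushed through it, and the resulting predicted boxes are weighed against the bounding box produced by the 2D tracker to correct the pose estimate.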
