Paper title
Object-Guided Day-Night Visual Localization in Urban Scenes
Paper authors
Paper abstract
We introduce Object-Guided Localization (OGuL) based on a novel method of local-feature matching. Direct matching of local features is sensitive to significant changes in illumination. In contrast, object detection often survives severe changes in lighting conditions. The proposed method first detects semantic objects and establishes correspondences of those objects between images. Object correspondences provide local coarse alignment of the images in the form of a planar homography. These homographies are consequently used to guide the matching of local features. Experiments on standard urban localization datasets (Aachen, Extended-CMU-Season, RobotCar-Season) show that OGuL significantly improves localization results with as simple local features as SIFT, and its performance competes with the state-of-the-art CNN-based methods trained for day-to-night localization.
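The guidance step described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function names (`project`, `guided_match`), the fixed search radius, and the plain nearest-neighbour descriptor test are assumptions made for illustration. The idea shown is only the core mechanism: a homography `H` obtained from an object correspondence projects each keypoint from one image into the other, and descriptor matching is restricted to candidates near the projected location.

```python
import numpy as np

def project(H, pts):
    """Apply a 3x3 planar homography to an Nx2 array of points."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    proj = pts_h @ H.T
    return proj[:, :2] / proj[:, 2:3]

def guided_match(kps_a, desc_a, kps_b, desc_b, H, radius=10.0):
    """Match features from image A to image B, considering only
    candidates in B that lie within `radius` pixels of the location
    predicted by H (the coarse alignment from an object match)."""
    matches = []
    proj_a = project(H, kps_a)
    for i, (p, d) in enumerate(zip(proj_a, desc_a)):
        dists = np.linalg.norm(kps_b - p, axis=1)
        cand = np.where(dists < radius)[0]
        if len(cand) == 0:
            continue
        # Nearest neighbour in descriptor space among spatial candidates.
        j = cand[np.argmin(np.linalg.norm(desc_b[cand] - d, axis=1))]
        matches.append((i, int(j)))
    return matches
```

Restricting the candidate set this way is what makes simple descriptors such as SIFT usable across day-night changes: the descriptor only has to disambiguate a handful of spatially plausible candidates instead of the whole image.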