Paper Title
ObjectFolder 2.0: A Multisensory Object Dataset for Sim2Real Transfer
Paper Authors
Paper Abstract
Objects play a crucial role in our everyday activities. Though multisensory object-centric learning has shown great potential lately, the modeling of objects in prior work is rather unrealistic. ObjectFolder 1.0 is a recent dataset that introduces 100 virtualized objects with visual, acoustic, and tactile sensory data. However, the dataset is small in scale and the multisensory data is of limited quality, hampering generalization to real-world scenarios. We present ObjectFolder 2.0, a large-scale, multisensory dataset of common household objects in the form of implicit neural representations that significantly enhances ObjectFolder 1.0 in three aspects. First, our dataset is 10 times larger in the number of objects and orders of magnitude faster in rendering time. Second, we significantly improve the multisensory rendering quality for all three modalities. Third, we show that models learned from virtual objects in our dataset successfully transfer to their real-world counterparts in three challenging tasks: object scale estimation, contact localization, and shape reconstruction. ObjectFolder 2.0 offers a new path and testbed for multisensory learning in computer vision and robotics. The dataset is available at https://github.com/rhgao/ObjectFolder.