Paper Title

NeRF-Gaze: A Head-Eye Redirection Parametric Model for Gaze Estimation

Authors

Pengwei Yin, Jiawu Dai, Jingjing Wang, Di Xie, Shiliang Pu

Abstract

Gaze estimation is the fundamental basis for many visual tasks. Yet, the high cost of acquiring gaze datasets with 3D annotations hinders the optimization and application of gaze estimation models. In this work, we propose a novel head-eye redirection parametric model based on Neural Radiance Fields (NeRF), which allows dense gaze data generation with view consistency and accurate gaze direction. Moreover, our head-eye redirection parametric model can decouple the face and eyes for separate neural rendering, so face attributes, identity, illumination, and eye gaze direction can each be controlled independently. Thus, diverse 3D-aware gaze datasets can be obtained by manipulating the latent codes corresponding to different face attributes in an unsupervised manner. Extensive experiments on several benchmarks demonstrate the effectiveness of our method in domain generalization and domain adaptation for gaze estimation tasks.
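To make the decoupling idea concrete, below is a minimal, hypothetical PyTorch-style sketch of a conditional NeRF whose MLP takes separate latent codes for identity, illumination, and gaze direction; re-querying the field with a new gaze vector while holding the other codes fixed is the kind of manipulation the abstract describes. The class name, code dimensions, and the split of inputs between the density and color heads are illustrative assumptions, not the authors' released architecture.

import torch
import torch.nn as nn

class ConditionalNeRF(nn.Module):
    # Predicts volume density and color for a positionally encoded 3D point,
    # conditioned on separate latent codes so each attribute can be edited alone.
    def __init__(self, pos_dim=63, dir_dim=27, id_dim=64, illum_dim=16, hidden=256):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(pos_dim + id_dim + 3, hidden), nn.ReLU(),  # +3 for the gaze direction vector
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.sigma_head = nn.Linear(hidden, 1)  # view-independent density
        self.color_head = nn.Sequential(        # color also sees view direction and illumination code
            nn.Linear(hidden + dir_dim + illum_dim, hidden // 2), nn.ReLU(),
            nn.Linear(hidden // 2, 3), nn.Sigmoid(),
        )

    def forward(self, x_enc, d_enc, z_id, z_illum, gaze_dir):
        h = self.backbone(torch.cat([x_enc, z_id, gaze_dir], dim=-1))
        sigma = self.sigma_head(h)
        rgb = self.color_head(torch.cat([h, d_enc, z_illum], dim=-1))
        return rgb, sigma

# Gaze redirection in this sketch: render twice with the same identity and
# illumination codes but a different gaze vector.
model = ConditionalNeRF()
x_enc, d_enc = torch.randn(1024, 63), torch.randn(1024, 27)
z_id, z_illum = torch.randn(1024, 64), torch.randn(1024, 16)
rgb_a, sigma_a = model(x_enc, d_enc, z_id, z_illum, torch.tensor([[0.0, 0.0, -1.0]]).expand(1024, 3))
rgb_b, sigma_b = model(x_enc, d_enc, z_id, z_illum, torch.tensor([[0.2, 0.0, -0.98]]).expand(1024, 3))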
