Paper Title

NeILF: Neural Incident Light Field for Physically-based Material Estimation

Authors

Yao Yao, Jingyang Zhang, Jingbo Liu, Yihang Qu, Tian Fang, David McKinnon, Yanghai Tsin, Long Quan

Abstract

We present a differentiable rendering framework for material and lighting estimation from multi-view images and a reconstructed geometry. In the framework, we represent scene lightings as the Neural Incident Light Field (NeILF) and material properties as the surface BRDF modelled by multi-layer perceptrons. Compared with recent approaches that approximate scene lightings as the 2D environment map, NeILF is a fully 5D light field that is capable of modelling illuminations of any static scenes. In addition, occlusions and indirect lights can be handled naturally by the NeILF representation without requiring multiple bounces of ray tracing, making it possible to estimate material properties even for scenes with complex lightings and geometries. We also propose a smoothness regularization and a Lambertian assumption to reduce the material-lighting ambiguity during the optimization. Our method strictly follows the physically-based rendering equation, and jointly optimizes material and lighting through the differentiable rendering process. We have intensively evaluated the proposed method on our in-house synthetic dataset, the DTU MVS dataset, and real-world BlendedMVS scenes. Our method is able to outperform previous methods by a significant margin in terms of novel view rendering quality, setting a new state-of-the-art for image-based material and lighting estimation.
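
To connect the abstract to the underlying formulation: the physically-based rendering equation it refers to integrates incident radiance over the upper hemisphere at each surface point. In a standard form (the paper's exact notation and parameterization may differ):

    L_o(\mathbf{x}, \omega_o) = \int_{\Omega} f(\mathbf{x}, \omega_o, \omega_i)\, L_i(\mathbf{x}, \omega_i)\, (\omega_i \cdot \mathbf{n})\, \mathrm{d}\omega_i

Here L_i(x, ω_i) is the 5D incident light field (3D position plus 2D direction) represented by the NeILF network, f is the surface BRDF represented by a second MLP, and n is the surface normal taken from the reconstructed geometry.

The sketch below is a minimal, hypothetical PyTorch illustration of the kind of joint optimization the abstract describes: both MLPs are queried inside a Monte Carlo estimate of the integral above, and an image reconstruction loss is backpropagated into the light-field and material networks together. The network sizes, the uniform hemisphere sampling, and the Lambertian-only BRDF are simplifying assumptions made here for brevity; they are not the authors' implementation, which uses a richer BRDF parameterization and its own sampling scheme.

import math
import torch
import torch.nn as nn

class MLP(nn.Module):
    """A plain fully-connected network used for both the light field and the BRDF."""
    def __init__(self, in_dim, out_dim, hidden=128, layers=4):
        super().__init__()
        mods, dim = [], in_dim
        for _ in range(layers):
            mods += [nn.Linear(dim, hidden), nn.ReLU()]
            dim = hidden
        mods.append(nn.Linear(dim, out_dim))
        self.net = nn.Sequential(*mods)

    def forward(self, x):
        return self.net(x)

# NeILF-style light field: maps a 5D query (3D point + 3D unit direction) to incident radiance.
neilf = MLP(in_dim=6, out_dim=3)
# Material network: maps a surface point to a material parameter (here only an albedo).
brdf_net = MLP(in_dim=3, out_dim=3)

def render(points, normals, n_samples=128):
    """Monte Carlo estimate of the rendering equation at each surface point."""
    B = points.shape[0]
    # Sample directions uniformly on the sphere, then flip them into the
    # upper hemisphere around each surface normal.
    dirs = torch.randn(B, n_samples, 3)
    dirs = dirs / dirs.norm(dim=-1, keepdim=True)
    cos = (dirs * normals[:, None, :]).sum(-1, keepdim=True)
    dirs = torch.where(cos < 0, -dirs, dirs)
    cos = cos.abs()

    # Query incident radiance L_i(x, w_i) from the neural incident light field.
    query = torch.cat([points[:, None, :].expand(-1, n_samples, -1), dirs], dim=-1)
    incident = torch.relu(neilf(query))

    # Lambertian BRDF f = albedo / pi (mirroring the abstract's Lambertian assumption).
    albedo = torch.sigmoid(brdf_net(points))
    f = albedo[:, None, :] / math.pi

    # Uniform-hemisphere estimator: E ≈ (2*pi / N) * sum_i f * L_i * cos.
    return (f * incident * cos).mean(dim=1) * (2.0 * math.pi)

# One optimization step against observed pixel colors (placeholder tensors stand in
# for surface points, normals, and colors gathered from the multi-view images).
points = torch.rand(1024, 3)
normals = nn.functional.normalize(torch.rand(1024, 3) - 0.5, dim=-1)
target_rgb = torch.rand(1024, 3)

opt = torch.optim.Adam(list(neilf.parameters()) + list(brdf_net.parameters()), lr=1e-3)
opt.zero_grad()
loss = nn.functional.mse_loss(render(points, normals), target_rgb)
loss.backward()
opt.step()

In a full system, the smoothness regularization and Lambertian term mentioned in the abstract would be added to this reconstruction loss to reduce the material-lighting ambiguity.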
