- NeRF methods learn the density field assuming light transport along straight paths
- When light paths intersect refractive objects, they may curve (dashed line), depending on the angle of incidence
- We propose to bend the light rays by predicting position and direction offsets for sample points along the rays
Abstract
Neural Radiance Fields (NeRF) have demonstrated exceptional capabilities in creating photorealistic novel views using volume rendering on a radiance field. However, the intrinsic assumption of straight light rays within NeRF becomes a limitation when dealing with transparent or translucent objects that exhibit refraction, and therefore have curved light paths. This hampers the ability of these approaches to accurately model the appearance of refractive objects, resulting in suboptimal novel view synthesis and geometry estimates. To address this issue, we propose an innovative solution using deformable networks to learn a tailored deformation field for refractive objects. Our approach predicts position and direction offsets, allowing NeRF to model the curved light paths caused by refraction and therefore the complex and highly view-dependent appearances of refractive objects.
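
To make the idea concrete, here is a minimal PyTorch sketch of a deformation network that predicts per-sample position and direction offsets. The module name `RayDeformationNet`, the layer widths, and the positional-encoding setup are assumptions chosen for illustration, not the exact architecture from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def positional_encoding(x, num_freqs=6):
    """NeRF-style sinusoidal encoding of a (..., 3) tensor."""
    freqs = 2.0 ** torch.arange(num_freqs, dtype=x.dtype, device=x.device) * torch.pi
    scaled = x[..., None, :] * freqs[:, None]                  # (..., num_freqs, 3)
    enc = torch.cat([scaled.sin(), scaled.cos()], dim=-1)      # (..., num_freqs, 6)
    return torch.cat([x, enc.flatten(-2)], dim=-1)             # (..., 3 + 6 * num_freqs)


class RayDeformationNet(nn.Module):
    """Hypothetical deformation MLP: maps a sample position and its incoming
    ray direction to a position offset and a direction offset, bending the ray."""

    def __init__(self, num_freqs=6, hidden=128):
        super().__init__()
        in_dim = 2 * (3 + 6 * num_freqs)                       # encoded position + direction
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 6),                              # 3 for delta-position, 3 for delta-direction
        )

    def forward(self, points, dirs):
        # points, dirs: (num_rays, num_samples, 3)
        feat = torch.cat([positional_encoding(points), positional_encoding(dirs)], dim=-1)
        delta_pos, delta_dir = self.mlp(feat).split(3, dim=-1)
        bent_points = points + delta_pos                       # curved sample positions
        bent_dirs = F.normalize(dirs + delta_dir, dim=-1)      # re-normalized directions
        return bent_points, bent_dirs
```

In such a sketch, the bent sample positions and directions would then be fed to the usual NeRF density and color queries before volume rendering.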
We also introduce a regularization strategy that encourages piece-wise linear light paths, since most physical systems can be approximated with a piece-wise constant index of refraction. By seamlessly integrating our deformation networks into the NeRF framework, our method achieves significant improvements in rendering refractive objects from novel views.
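
Continuing the sketch above, one simple way to encourage piece-wise linear paths is to penalize changes in the bent direction between consecutive samples with a sparsity-inducing norm. The function name `piecewise_linear_loss`, the weight `lambda_reg`, and this particular penalty are assumptions for illustration, not the paper's exact formulation.

```python
def piecewise_linear_loss(bent_dirs):
    """Hypothetical regularizer encouraging piece-wise linear light paths.

    Penalizing the change in bent direction between consecutive samples keeps
    the direction piece-wise constant, so the ray bends only at a few
    interfaces, consistent with a piece-wise constant index of refraction.

    bent_dirs: (num_rays, num_samples, 3) deformed directions along each ray.
    """
    ddir = bent_dirs[:, 1:, :] - bent_dirs[:, :-1, :]   # adjacent direction differences
    return ddir.norm(dim=-1).mean()                     # mean of per-step change magnitudes


# Example total loss (lambda_reg is a hypothetical weight, not from the paper):
# loss = photometric_loss + lambda_reg * piecewise_linear_loss(bent_dirs)
```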
Deng W, Campbell D, Sun C, Kanitkar S, Shaffer M, Gould S. Ray Deformation Networks for Novel View Synthesis of Refractive Objects. In WACV, 2024.