Neural Inverse Rendering for Glinty Materials

Neural inverse rendering excels at disentangling geometry, material, and lighting from input images, enabling scene relighting. However, prevailing pipelines built on the Disney Bidirectional Reflectance Distribution Function (BRDF) struggle with glinty materials because they rely on a smooth Normal Distribution Function (NDF). This project introduces a novel glint reconstruction method within the neural inverse rendering framework. The method features a differentiable glint sampler that enables backpropagation, together with a streamlined noise-generation process for glint sampling that preserves the noise distribution while improving efficiency. It also introduces fine-grained control over glint appearance in occluded areas and a straightforward yet effective parameter approximation method. Experimental results support this as, to the best of my knowledge, the first reconstruction of glinty materials within the neural inverse rendering paradigm.

Method Overview

This project introduces a novel glint reconstruction method that focuses on explicitly learning the glinty appearance. The geometry is treated as constant. The albedo and normals are taken directly from the 2D textures, while only the roughness values from the ORM (occlusion-roughness-metallic) map are used. The noise used to sample glints is stored in textures. Notably, the parameters governing glint generation are estimated by a counting process, referred to as the parameters estimator, and kept constant for the subsequent training. Training then refines the material and lighting parameters by backpropagating the image-space loss.
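To make the division between frozen and learnable quantities concrete, below is a minimal, self-contained PyTorch sketch of this setup. All names (estimate_glint_params, glint_ndf, shade) and the toy placeholder implementations are illustrative assumptions, not the project's actual code; the real pipeline is in the thesis and repository linked below.

```python
import torch
import torch.nn.functional as F

H, W = 64, 64                                                   # toy image resolution
albedo_map = torch.rand(H, W, 3)                                # fixed 2D albedo texture
normal_map = F.normalize(torch.rand(H, W, 3) * 2 - 1, dim=-1)   # fixed normal texture
noise_tex  = torch.rand(H, W)                                   # precomputed glint-sampling noise
reference  = torch.rand(H, W, 3)                                # stand-in for the reference image

# Learnable parameters: roughness (from the ORM map) and environment lighting.
roughness = torch.nn.Parameter(torch.full((H, W), 0.5))
env_light = torch.nn.Parameter(torch.ones(3))

def estimate_glint_params(ref):
    # Stand-in for the counting-based parameters estimator; its output is
    # computed once up front and kept constant during training.
    return {"density": ref.mean().detach()}

def glint_ndf(roughness, noise, params):
    # Placeholder differentiable glint NDF: the stored noise modulated by
    # roughness, so gradients can reach the learnable roughness texture.
    return params["density"] * noise / (roughness ** 2 + 1e-4)

def shade(D, albedo, light):
    # Placeholder shading combining the specular term with albedo and lighting
    # (occlusion handling is sketched separately after the next paragraph).
    return albedo * light + D.unsqueeze(-1)

glint_params = estimate_glint_params(reference)            # frozen, no gradient
optimizer = torch.optim.Adam([roughness, env_light], lr=1e-2)

for step in range(200):
    optimizer.zero_grad()
    D = glint_ndf(roughness, noise_tex, glint_params)      # specular value D
    rendered = shade(D, albedo_map, env_light)
    loss = F.mse_loss(rendered, reference)                 # image-space loss
    loss.backward()                                        # through the sampler
    optimizer.step()
```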

Crucial to the glint reconstruction method is the differentiable glints sampler. The sampler gathers information from the mesh, material textures, noise maps, and parameters estimator, and feeds it into the glint sampling function. This function outputs the specular value D, which is then processed during the shading phase to model occlusion. The resulting image is compared against the reference to obtain the loss used to update the material and lighting parameters.
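The occlusion-aware processing of D could look roughly like the sketch below. The per-pixel visibility term and the way it attenuates the glint response are assumptions made for illustration; the thesis gives the exact formulation used for fine-grained control over glints in occluded areas.

```python
import torch

def shade_with_occlusion(D, visibility, base_color):
    # Scale the glint response by a per-pixel visibility factor in [0, 1],
    # so glints in occluded regions can be dimmed or suppressed.
    return base_color + visibility * D.unsqueeze(-1)

H, W = 64, 64
D = torch.rand(H, W)             # specular value from the glint sampler
visibility = torch.rand(H, W)    # e.g. from the occlusion channel of the ORM map
base_color = torch.rand(H, W, 3)
image = shade_with_occlusion(D, visibility, base_color)
```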

Results Showcase

Some results of this project are shown below.

Reference and rendered image pairs: Bob with the Clarens environment map, Duck with the Golf environment map, and Spot with the Road environment map.

The results show that the method reconstructs glinty materials robustly across different geometries and environment maps.

Thesis

To read the full thesis, please check out the link below.

GitHub

Below is the link to the project GitHub repository. You can find the full source code and instructions on how to run it.


Acknowledgements

I want to express my deepest gratitude to those who have helped me with this project. I want to thank my supervisor, Dongqing Wang, for providing faithful guidance along the way. I also want to thank Michael Ebenstein for giving me crucial feedback to improve this project. Lastly, I want to thank myself for holding on through stressful times and making the success of this project possible.
