"Deep Reflectance Scanning: Recovering Spatially-varying Material Appearance from a Flash-lit Video Sequence"
Wenjie Ye, Yue Dong, Pieter Peers, and Baining Guo

Computer Graphics Forum, Volume 40, Issue 6, pages 409-427, September 2021
Abstract
In this paper we present a novel method for recovering high-resolution spatially-varying isotropic surface reflectance of a planar exemplar from a flash-lit close-up video sequence captured with a regular hand-held mobile phone. We do not require careful calibration of the camera and lighting parameters; instead, we compute a per-pixel flow map using a deep neural network to align the input video frames. For each video frame, we also extract the reflectance parameters, warp the neural reflectance features directly using the per-pixel flow, and subsequently pool the warped features. Our method facilitates convenient hand-held acquisition of spatially-varying surface reflectance with commodity hardware by non-expert users. Furthermore, it enables aggregation of reflectance features from surface points visible in only a subset of the captured video frames, enabling the creation of high-resolution reflectance maps that exceed the native camera resolution. We demonstrate and validate our method on a variety of synthetic and real-world spatially-varying materials.
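The abstract describes warping per-frame neural reflectance features into a common reference frame via a per-pixel flow map, then pooling the warped features over all frames in which each pixel is visible. The following is a minimal NumPy sketch of that warp-and-pool aggregation step only, under assumed conventions (nearest-neighbor warping, visibility-weighted average pooling, and a flow map storing (dy, dx) offsets); the function names and layout are illustrative, not the paper's implementation.

```python
import numpy as np

def warp_features(features, flow):
    """Warp a per-frame feature map (H, W, C) into the reference frame
    using a per-pixel flow map (H, W, 2) of (dy, dx) offsets.
    Nearest-neighbor sampling for simplicity; pixels whose source falls
    outside the frame are marked invisible (hypothetical convention)."""
    H, W, _ = features.shape
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    src_y = np.rint(ys + flow[..., 0]).astype(int)
    src_x = np.rint(xs + flow[..., 1]).astype(int)
    valid = (src_y >= 0) & (src_y < H) & (src_x >= 0) & (src_x < W)
    warped = np.zeros_like(features)
    warped[valid] = features[src_y[valid], src_x[valid]]
    return warped, valid

def pool_features(frames, flows):
    """Average the warped features per pixel, counting only the frames
    in which that pixel was visible -- so surface points seen in only a
    subset of frames still receive an aggregated feature."""
    acc = np.zeros_like(frames[0])
    count = np.zeros(frames[0].shape[:2])
    for feat, flow in zip(frames, flows):
        warped, valid = warp_features(feat, flow)
        acc += warped
        count += valid
    count = np.maximum(count, 1)  # avoid division by zero for unseen pixels
    return acc / count[..., None]
```

In practice the paper pools learned neural features (later decoded into reflectance maps) rather than raw values, and the flow itself comes from the deep alignment network; this sketch only illustrates the aggregation arithmetic.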


Download
Supplementary Material
Code
Bibtex
@article{Ye:2021:DRS,
author = {Ye, Wenjie and Dong, Yue and Peers, Pieter and Guo, Baining},
title = {Deep Reflectance Scanning: Recovering Spatially-varying Material Appearance from a Flash-lit Video Sequence},
month = {September},
year = {2021},
pages = {409--427},
journal = {Computer Graphics Forum},
volume = {40},
number = {6},
doi = {10.1111/cgf.14387},
}