PhotonSplat: Splatting Photons for Scene Reconstruction with High-Speed Camera Captures

Manuscript in progress
1Indian Institute of Technology, Madras 2Carnegie Mellon University, Pittsburgh

* indicates equal contribution.

Teaser Figure

In applications such as drone surveillance with high-speed camera motion, RGB captures tend to be extremely blurry, making it challenging to model the 3D structure accurately. To address this, we use a Single-Photon Camera (SPC), which captures high-frame-rate binary images that are free from motion blur. Our proposed method provides high-quality geometry reconstruction. We then colorize the grayscale rendered images using a single blurry RGB frame to achieve view-consistent colorization of the 3D structure. Finally, this process enables appearance editing through explicit control over the scene.
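To make the noise-vs-blur trade-off concrete, the sketch below simulates how consecutive 1-bit SPAD frames are averaged into grayscale images: a larger averaging window suppresses binary shot noise but accumulates motion blur. This is an illustrative sketch only; the function name and the window size are hypothetical, not values from the paper.

```python
import numpy as np

def average_binary_frames(binary_frames: np.ndarray, window: int) -> np.ndarray:
    """Average consecutive 1-bit SPAD frames into grayscale images.

    binary_frames: (T, H, W) array of {0, 1} photon detections.
    window: number of consecutive frames per averaged image. Larger windows
            reduce binary shot noise but accumulate motion blur.
    Returns: (T // window, H, W) float array in [0, 1].
    """
    t = (binary_frames.shape[0] // window) * window  # drop the ragged tail
    grouped = binary_frames[:t].reshape(-1, window, *binary_frames.shape[1:])
    return grouped.mean(axis=1)

# Example: 10,000 synthetic binary frames averaged in windows of 250
# yield 40 low-noise (but potentially motion-blurred) grayscale images.
frames = (np.random.rand(10_000, 64, 64) < 0.2).astype(np.float32)
gray = average_binary_frames(frames, window=250)
```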

Abstract

Neural rendering techniques can synthesize photorealistic novel-view images of scenes from a collection of multi-view imagery. However, they often fail when the input imagery is motion-blurred, a scenario that commonly arises under high-speed camera or object motion. In this paper, we advance neural rendering techniques in the high-speed camera-motion setting using single-photon avalanche diode (SPAD) arrays, an emerging sensing technology capable of capturing images at hundreds of thousands of frames per second. However, SPADs come with their own set of challenges: they produce binary frames that are inherently noisy, and averaging them yields blurry monochromatic images for which conventional 2D colorization methods produce view-inconsistent results. These limitations reduce the efficacy of SPAD data in downstream tasks such as depth estimation and segmentation. To address these challenges, we introduce PhotonSplat, a framework designed to reconstruct 3D scenes directly from SPAD binary images, effectively navigating the noise vs. blur trade-off. Our approach incorporates a 3D spatial filtering technique to reduce noise within novel-view reconstructions. Moreover, we extend this framework to support view-consistent colorization of grayscale reconstructions from a single blurred reference color image. Additionally, PhotonSplat facilitates simple editing tasks owing to its explicit scene representation. We further contribute PhotonScenes, a real-world multi-view dataset captured with SPAD sensors.
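As a rough illustration of 3D spatial filtering, the sketch below smooths noisy per-point grayscale values by averaging over k nearest neighbors in 3D, assuming point/splat centers are available. The paper's actual filter may differ; `spatial_filter_3d`, its parameters, and the choice of k are assumptions made for this example.

```python
import numpy as np
from scipy.spatial import cKDTree

def spatial_filter_3d(positions: np.ndarray, intensities: np.ndarray, k: int = 8) -> np.ndarray:
    """Smooth per-point intensities by averaging over k nearest 3D neighbors.

    positions: (N, 3) splat/point centers.
    intensities: (N,) noisy grayscale values attached to each point.
    Returns: (N,) filtered intensities.
    """
    tree = cKDTree(positions)
    _, idx = tree.query(positions, k=k)   # idx: (N, k) neighbor indices (self included)
    return intensities[idx].mean(axis=1)  # average each point with its neighborhood

# Example: denoise random intensities attached to a synthetic point cloud.
pts = np.random.randn(1_000, 3)
noisy = np.random.rand(1_000)
smooth = spatial_filter_3d(pts, noisy, k=8)
```

Filtering in 3D rather than in image space has the advantage that the smoothing is shared across views, so renders from different cameras stay consistent with one another.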

Novel View Synthesis Results on Synthetic Dataset

results gallery

Novel View Synthesis Results on Real Dataset

results gallery

View-Consistent Colorization from a Single RGB Image

results gallery