Differentiable 3D Scene Representations with Point-based Neural Methods

Author(s)
Borcsok, Barnabas Barney
Organizational Unit
School of Interactive Computing
Abstract
This thesis explores particle-based approaches to scene representation. We introduce moving patches by fitting geometry pixels ("gexels") of moving geometry images onto the scene geometry. We propose representing a signed distance field as a weighted sum of local functions attached to moving particles. We consider optimization frameworks for fitting both 2D and 3D target shapes and propose a splitting strategy to control particle density. Experiments on canonical 3D models demonstrate that our method captures both coarse and fine geometric structure with explainable local patches, offering a storage-efficient alternative to current representations. Finally, we outline avenues for future work, such as continuous neural patch parametrization, adaptive domain decomposition, and interactive editing. We hope to inspire further investigation into explainable, flexible, and differentiable explicit geometry paradigms that balance fidelity, efficiency, and editability.
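The abstract's central construction, a signed distance field expressed as a weighted sum of local functions attached to particles, can be sketched in miniature. The following is an illustrative 2D toy, not the thesis implementation: the linear per-particle model, the Gaussian blending weights, the bandwidth, and the unit-circle target are all assumed for the example.

```python
import numpy as np

# Hypothetical sketch: particles scattered on a unit circle, each carrying a
# local linear SDF model f_i(x) = n_i . (x - p_i) + d_i. The global field
# blends these local models with normalized Gaussian weights.

rng = np.random.default_rng(0)

num_particles = 64
angles = rng.uniform(0.0, 2.0 * np.pi, num_particles)
positions = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # p_i on the circle
normals = positions.copy()                                      # outward normals n_i
offsets = np.zeros(num_particles)                               # d_i = 0 on the surface

def sdf(x, bandwidth=0.5):
    """Evaluate the blended field at a query point x (shape (2,))."""
    diff = x[None, :] - positions                       # offsets to each particle
    local = np.einsum("nd,nd->n", normals, diff) + offsets
    w = np.exp(-np.sum(diff**2, axis=1) / (2.0 * bandwidth**2))
    return float(np.sum(w * local) / np.sum(w))         # partition-of-unity blend

# The blend should roughly reproduce |x| - 1 for the unit circle.
print(sdf(np.array([0.0, 0.0])))        # -> -1.0 (inside, at the center)
print(sdf(np.array([2.0, 0.0])) > 0.0)  # -> True (positive outside)
```

Because the weights are normalized, the field is a convex combination of the local models, so each particle only influences the region where its Gaussian weight dominates; this locality is what makes the patches individually interpretable.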
Date
2025-04-30
Resource Type
Text
Resource Subtype
Thesis