Probabilistic Directed Distance Fields for Ray-Based Shape Representations

Published

IEEE Transactions on Pattern Analysis and Machine Intelligence

Date

2025.07.31

Abstract

In modern computer vision, the optimal representation of 3D shape remains task-dependent. One fundamental operation applied to such representations is differentiable rendering, which enables learning-based inverse graphics approaches. Standard explicit representations are often easily rendered, but can suffer from limited geometric fidelity, among other issues. On the other hand, implicit representations generally preserve greater fidelity, but suffer from difficulties with rendering, limiting scalability. In this work, we devise Directed Distance Fields (DDFs), which map a ray or oriented point (position and direction) to surface visibility and depth. This enables efficient differentiable rendering, obtaining depth with a single forward pass per pixel, as well as higher-order geometry with only additional backward passes. Using probabilistic DDFs (PDDFs), we can model the inherent discontinuities in the underlying field. We then apply DDFs to single-shape fitting, generative modelling, and 3D reconstruction, showcasing strong performance with simple architectural components via the versatility of our representation. Finally, since the dimensionality of DDFs permits view-dependent geometric artifacts, we conduct a theoretical investigation of the constraints necessary for view consistency. We find a small set of field properties that are sufficient to guarantee a DDF is consistent, without knowing which shape the field is expressing.
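To make the core mapping concrete: a DDF takes an oriented point (a position and a unit direction) and returns surface visibility and depth along that ray. Below is a minimal toy sketch of this interface using an analytic sphere instead of a learned field; the function name `sphere_ddf` and the orthographic rendering setup are illustrative assumptions, not part of the paper's method.

```python
import numpy as np

def sphere_ddf(p, d, radius=1.0):
    """Toy analytic DDF for a sphere centred at the origin.

    Maps an oriented point (position p, unit direction d) to
    (visibility, depth), mimicking the interface of a learned DDF.
    Solves |p + t*d|^2 = radius^2 for the smallest t > 0.
    """
    b = np.sum(p * d, axis=-1)              # ray/sphere quadratic: t^2 + 2bt + c = 0
    c = np.sum(p * p, axis=-1) - radius**2
    disc = b * b - c                        # discriminant: negative means no hit
    hit = disc >= 0.0
    t = -b - np.sqrt(np.where(hit, disc, 0.0))  # nearer intersection
    vis = hit & (t > 0.0)                   # surface must lie in front of p
    return vis, np.where(vis, t, np.inf)

# "Rendering" with a DDF is one query per pixel: here, a tiny
# orthographic depth map looking down +z from the plane z = -3.
xs = np.linspace(-1.5, 1.5, 5)
origins = np.stack(np.meshgrid(xs, xs, indexing="ij") + [np.full((5, 5), -3.0)], axis=-1)
dirs = np.broadcast_to(np.array([0.0, 0.0, 1.0]), origins.shape)
vis_map, depth_map = sphere_ddf(origins, dirs)
```

The central ray `(0, 0, -3)` toward `+z` hits the unit sphere at depth 2, while rays offset by more than the radius report no visible surface; a learned DDF would replace the closed-form solve with a network evaluation, keeping the one-query-per-pixel cost.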