Multi-modal 3D Gaussian Splatting for SLAM

Date

2024

Abstract

From AR/VR to autonomous mobile robotics, Simultaneous Localization and Mapping (SLAM) is essential for tracking and scene understanding. 3D Gaussian Splatting (3DGS) offers a map representation capable of photorealistic reconstruction and real-time rendering of scenes using multiple posed cameras. By combining these two techniques, this thesis aims to show that a 3D Gaussian map representation is capable of accurate SLAM when given unposed RGB-D images and inertial measurements. The proposed method, MM3DGS, addresses the limitations of prior neural radiance field representations by enabling faster rendering, scale awareness, and improved trajectory tracking. In addition, a new multi-modal SLAM dataset, UT-MM, is collected from a mobile robot and is publicly released. Experimental evaluation on several scenes from the dataset shows that, with the proper sensor configuration, MM3DGS achieves a 3× improvement in tracking accuracy and a 5% improvement in photometric rendering quality compared to the current 3DGS SLAM state-of-the-art, while allowing real-time rendering of a high-resolution dense 3D map.
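The sketch below is a minimal, self-contained illustration of the frame-to-model tracking idea behind 3DGS-based SLAM as summarized above: the pose of an incoming RGB-D frame is optimized so that an image and depth map rendered from the Gaussian map match the observation. It is an assumption-laden toy, not the MM3DGS implementation; the simplified renderer blends projected Gaussian centers without depth sorting or alpha compositing, and all function names (se3_exp, render_gaussians, track_frame) are illustrative.

```python
# Illustrative sketch only: toy Gaussian-splat rendering + photometric/depth
# pose tracking in PyTorch. Not the actual MM3DGS code.
import torch

def hat(v):
    """Skew-symmetric matrix of a 3-vector (built differentiably)."""
    z = torch.zeros((), dtype=v.dtype)
    return torch.stack([
        torch.stack([z, -v[2], v[1]]),
        torch.stack([v[2], z, -v[0]]),
        torch.stack([-v[1], v[0], z]),
    ])

def se3_exp(xi):
    """Axis-angle + translation (6,) -> rotation (3,3) and translation (3,)."""
    omega, t = xi[:3], xi[3:]
    theta = torch.sqrt((omega * omega).sum() + 1e-12)
    K = hat(omega / theta)
    R = torch.eye(3, dtype=xi.dtype) + torch.sin(theta) * K + (1 - torch.cos(theta)) * (K @ K)
    return R, t

def render_gaussians(means, colors, opac, R, t, intr, H, W, sigma=1.5):
    """Toy differentiable splatting: project Gaussian centers and blend them with
    2D isotropic Gaussian weights (no depth sorting or alpha compositing)."""
    fx, fy, cx, cy = intr
    p = means @ R.T + t                      # world -> camera frame
    z = p[:, 2].clamp(min=1e-3)
    u = fx * p[:, 0] / z + cx
    v = fy * p[:, 1] / z + cy
    ys, xs = torch.meshgrid(torch.arange(H, dtype=means.dtype),
                            torch.arange(W, dtype=means.dtype), indexing="ij")
    d2 = (xs[None] - u[:, None, None]) ** 2 + (ys[None] - v[:, None, None]) ** 2
    w = opac[:, None, None] * torch.exp(-d2 / (2 * sigma ** 2))   # (N, H, W)
    wsum = w.sum(0) + 1e-8
    rgb = (w[..., None] * colors[:, None, None, :]).sum(0) / wsum[..., None]
    depth = (w * z[:, None, None]).sum(0) / wsum
    return rgb, depth

def track_frame(map_params, obs_rgb, obs_depth, intr, iters=100, lam_d=0.5):
    """Estimate one frame's camera pose against the Gaussian map by minimizing
    photometric + depth residuals of the rendered view."""
    means, colors, opac = map_params
    H, W = obs_depth.shape
    xi = torch.zeros(6, dtype=means.dtype, requires_grad=True)    # start at identity
    opt = torch.optim.Adam([xi], lr=1e-2)
    for _ in range(iters):
        opt.zero_grad()
        R, t = se3_exp(xi)
        rgb, depth = render_gaussians(means, colors, opac, R, t, intr, H, W)
        loss = (rgb - obs_rgb).abs().mean() + lam_d * (depth - obs_depth).abs().mean()
        loss.backward()
        opt.step()
    return se3_exp(xi.detach())

# Toy usage: build a random map, render a "ground-truth" frame from a small
# pose offset, then recover that pose by tracking.
torch.manual_seed(0)
N, H, W = 300, 48, 64
means = torch.randn(N, 3) * 0.5 + torch.tensor([0.0, 0.0, 3.0])
colors, opac = torch.rand(N, 3), torch.rand(N)
intr = (60.0, 60.0, W / 2, H / 2)
R_gt, t_gt = se3_exp(torch.tensor([0.02, -0.03, 0.01, 0.05, 0.0, -0.02]))
obs_rgb, obs_depth = render_gaussians(means, colors, opac, R_gt, t_gt, intr, H, W)
R_est, t_est = track_frame((means, colors, opac), obs_rgb.detach(), obs_depth.detach(), intr)
print("translation error:", torch.linalg.norm(t_est - t_gt).item())
```

In a full system such as the one described in the abstract, the same rendered-versus-observed residuals also drive map optimization (updating Gaussian means, covariances, colors, and opacities), and inertial measurements supply a scale-aware pose prior; those components are omitted here for brevity.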
