Abstract
We present a novel approach for photorealistic robot simulation that
integrates 3D Gaussian Splatting as a drop-in renderer within vectorized
physics simulators such as IsaacGym. This enables unprecedented speed --
exceeding 100,000 steps per second on consumer GPUs -- while maintaining high
visual fidelity, which we showcase across diverse tasks. We additionally
demonstrate its applicability in a sim-to-real robotics setting. Beyond
depth-based sensing, our results highlight how rich visual semantics improve
navigation and decision-making, such as avoiding undesirable regions. We
further showcase the ease of incorporating thousands of environments from
iPhone scans, large-scale scene datasets (e.g., GrandTour, ARKit), and outputs
from generative video models like Veo, enabling rapid creation of realistic
training worlds. This work bridges high-throughput simulation and high-fidelity
perception, advancing scalable and generalizable robot learning. All code and
data will be open-sourced for the community to build upon. Videos, code, and
data available at https://escontrela.me/gauss_gym/.
Authors (7)
Alejandro Escontrela
Justin Kerr
Arthur Allshire
Jonas Frey
Rocky Duan
Carmelo Sferrazza
+1 more
Submitted
October 17, 2025
Key Contributions
GaussGym integrates 3D Gaussian Splatting with vectorized physics simulators (like Isaac Gym) to achieve unprecedented simulation speed (>100k steps/sec) and high visual fidelity. It enables learning locomotion from pixels, improves sim-to-real transfer, and facilitates rapid creation of diverse training environments from various sources.
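To make the "drop-in renderer" idea concrete, here is a minimal, hypothetical sketch of how a Gaussian Splatting rasterizer could act as a batched camera sensor inside a vectorized rollout loop. This is not the GaussGym or Isaac Gym API: the class names, shapes, and the mocked renderer and simulator below are illustrative assumptions, intended only to show the pixels-in-the-loop structure (physics step for all environments, then one batched render, then a policy acting on images).

```python
# Hypothetical sketch (not the GaussGym API): a Gaussian Splatting renderer
# used as a batched "camera sensor" inside a vectorized physics rollout.
# All class and function names below are illustrative stand-ins.
import torch

class SplatRenderer:
    """Stand-in for a 3D Gaussian Splatting rasterizer.

    A real implementation would project a pre-trained set of 3D Gaussians
    into each environment's camera and alpha-composite them on the GPU;
    here we only mock the batched interface (poses in, images out).
    """
    def __init__(self, num_envs: int, height: int = 64, width: int = 64,
                 device: str = "cpu"):
        self.num_envs, self.h, self.w, self.device = num_envs, height, width, device

    def render(self, cam_poses: torch.Tensor) -> torch.Tensor:
        # cam_poses: (num_envs, 4, 4) world-from-camera transforms.
        # Returns one RGB observation per environment.
        assert cam_poses.shape == (self.num_envs, 4, 4)
        return torch.rand(self.num_envs, 3, self.h, self.w, device=self.device)

class VectorizedSim:
    """Stand-in for a vectorized physics simulator running many envs at once."""
    def __init__(self, num_envs: int, device: str = "cpu"):
        self.num_envs, self.device = num_envs, device

    def step(self, actions: torch.Tensor) -> torch.Tensor:
        # A real simulator would advance rigid-body dynamics here and return
        # each robot's camera pose; we return identity poses as placeholders.
        return torch.eye(4, device=self.device).expand(self.num_envs, 4, 4)

def rollout(num_envs: int = 1024, steps: int = 10) -> None:
    sim, cam = VectorizedSim(num_envs), SplatRenderer(num_envs)
    policy = torch.nn.Sequential(torch.nn.Flatten(),
                                 torch.nn.Linear(3 * 64 * 64, 12))
    obs = cam.render(sim.step(torch.zeros(num_envs, 12)))
    for _ in range(steps):
        actions = policy(obs)          # pixels -> joint targets
        cam_poses = sim.step(actions)  # physics step for all envs at once
        obs = cam.render(cam_poses)    # photorealistic frames for all envs

if __name__ == "__main__":
    rollout()
```

The key design point the sketch illustrates is that rendering stays batched and on-GPU alongside the physics step, which is what allows visual observations without sacrificing the throughput of vectorized simulation.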
Business Value
Dramatically accelerates robot training by enabling faster, more realistic simulations. This reduces development time and cost, and improves the reliability of robots deployed in the real world.