Sim2Real2Sim: Differentiable Physics-Based Modeling of Multisensory Objects

Talk
Ruohan Gao
Time: 10.18.2024, 11:00 to 12:00

We perceive the world not as a single giant entity but through the perception and manipulation of a wide variety of objects, which exist as bounded wholes and move on connected paths. While there has been significant progress by "looking"—recognizing objects from glimpses of their visual appearance or 3D shape—objects in the world are often modeled as silent and untouchable entities. In this talk, I will present how we model the multisensory signals of real-world objects through differentiable physics-based simulation. First, I will discuss how we model the multisensory behaviors of objects with a new dataset of neural objects and how we perform Sim2Real transfer by learning from them. Conversely, I will then introduce our differentiable inverse-rendering algorithms for Real2Sim applications, which infer a variety of physical properties of objects from their real-world observations. Together, these directions have the potential to endow a system with the ability to autonomously build its own multisensory simulation of its environment using only its raw sensory inputs.
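The Real2Sim direction can be illustrated with a toy example: if the forward simulation is differentiable, an unknown physical property can be recovered from observations by gradient descent on the simulated-versus-observed discrepancy. The sketch below is purely illustrative and not from the talk; the bouncing-ball model, the choice of restitution as the inferred parameter, and all function names are assumptions.

```python
# Toy Real2Sim sketch (illustrative only): recover a hypothetical
# coefficient of restitution by gradient descent through a
# differentiable forward model of a single bounce.

def simulate_bounce_height(e, drop_height):
    """Differentiable forward model: rebound height after one bounce.

    Kinetic energy at impact scales the rebound height by e**2.
    """
    return e ** 2 * drop_height

def fit_restitution(observed_height, drop_height, lr=0.1, steps=200):
    """Infer e from one observed rebound via gradient descent."""
    e = 0.5  # initial guess
    for _ in range(steps):
        residual = simulate_bounce_height(e, drop_height) - observed_height
        # Analytic gradient of the squared-error loss w.r.t. e:
        # d/de (e**2 * h - obs)**2 = 2 * residual * 2 * e * h
        grad = 2.0 * residual * 2.0 * e * drop_height
        e -= lr * grad
    return e

# "Observation" generated with a ground-truth e = 0.8
e_hat = fit_restitution(observed_height=0.8 ** 2 * 1.0, drop_height=1.0)
```

In practice the forward model is a full physics simulator and the gradients come from automatic differentiation rather than a hand-derived formula, but the fitting loop has the same shape.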