PhD Proposal: Feel, Think, Act: Tactile Perception for Robotics

Talk
Amir Hossein Shahidzadeh
Time: 02.06.2025, 11:00 to 12:30
Location: IRB-4109

Humans intuitively control their hands to manipulate objects with diverse properties, a skill that is difficult to articulate explicitly. For robots to achieve similar dexterity, they require rich sensory data and robust sensory-motor coupling, both made possible by recent advances in tactile sensing. My research develops decision-making algorithms, grounded in reinforcement learning, that adapt policies efficiently using multi-modal state representations, transforming implicit human goals into explicit, actionable subgoals.
In this talk, I will discuss perception tools derived from tactile sensors, such as 3D force vectors, and highlight the critical yet often-overlooked need for active tactile exploration policies that gather informative data within a limited budget of actions. Within our tactile-only object exploration framework, we define a state representation that encodes temporal tactile information and use bonus-based exploration inspired by Upper Confidence Bound (UCB) algorithms. We further extend this to guided bi-manual exploration, rewarding actions that accelerate the discovery of pose-related features, paving the way toward objectives such as pose estimation beyond mere reconstruction.
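As a rough illustration of the UCB-inspired bonus mentioned above, the sketch below shows count-based action selection for a touch-exploration loop. All names here (`UCBActionSelector`, the exploration weight `c`, per-action visit counts) are illustrative assumptions for exposition, not the implementation described in the talk.

```python
import numpy as np

# Minimal sketch (assumed, not the talk's implementation): a count-based
# UCB-style bonus for choosing the next touch action. Each candidate
# action's value estimate is augmented with a bonus that shrinks as the
# action is tried more often, pushing the policy toward unexplored touches.

class UCBActionSelector:
    def __init__(self, n_actions: int, c: float = 1.0):
        self.c = c                            # exploration weight (hypothetical hyperparameter)
        self.counts = np.zeros(n_actions)     # visit count per candidate action
        self.values = np.zeros(n_actions)     # running mean reward per action

    def select(self) -> int:
        t = self.counts.sum() + 1.0
        # UCB bonus c * sqrt(ln t / n_a); untried actions get infinite bonus,
        # so every action is probed at least once.
        bonus = np.where(
            self.counts == 0,
            np.inf,
            self.c * np.sqrt(np.log(t) / np.maximum(self.counts, 1)),
        )
        return int(np.argmax(self.values + bonus))

    def update(self, action: int, reward: float) -> None:
        # Incremental mean update of the chosen action's value estimate.
        self.counts[action] += 1
        self.values[action] += (reward - self.values[action]) / self.counts[action]
```

In the guided bi-manual setting described above, the `reward` passed to `update` would reflect how much a touch advances the discovery of pose-related features, rather than raw surface coverage alone.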