Tool-Mediated Perception of Liquid-like Substances Using Multiple Sensory Modalities

May 2, 2024
12:00pm EST
Cummings 302
Speaker: Si Liu - Quals talk
Host: Jivko Sinapov

Abstract

To effectively understand the properties of various objects, robots need to move beyond visual perception alone and also incorporate tactile, auditory, and proprioceptive feedback from interactions. This research investigates a robot's interactions with liquid-like substances (e.g., wheat, water) inside a container, using tools (e.g., a spoon, a chopstick) and a range of behaviors (e.g., stirring, poking) to gather multisensory data. Through this tool-mediated experimental approach, we employ multimodal deep learning to study how effectively audio, tactile, and haptic modalities can be integrated. We also examine how performance changes when the robot uses different tools and behaviors. Preliminary findings highlight the complexities and challenges of working with limited interaction data and of extracting features from each modality. This talk will discuss the lessons learned and propose future research directions focused on collecting new data, transferring knowledge to new tools and behaviors, and training the robot to learn exploratory behaviors automatically. These steps are critical for improving the robustness and adaptability of robotic systems in real-world tasks.
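
For illustration only, the sketch below shows one common way such modalities can be fused: a late-fusion classifier that encodes each sensory stream separately and concatenates the embeddings before predicting the substance in the container. This is not the speaker's actual model; the input shapes, channel counts, encoder sizes, and number of substance classes are all assumptions.

```python
# Minimal late-fusion sketch (assumed architecture, not the speaker's model):
# encode audio, tactile, and haptic time series from one tool-mediated behavior
# (e.g., stirring), concatenate the embeddings, and classify the substance.
import torch
import torch.nn as nn


class ModalityEncoder(nn.Module):
    """1-D convolutional encoder for a fixed-length multichannel time series."""

    def __init__(self, in_channels: int, embed_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over the time dimension
        )
        self.proj = nn.Linear(64, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        return self.proj(self.net(x).squeeze(-1))


class LateFusionClassifier(nn.Module):
    """Concatenate per-modality embeddings, then classify the substance."""

    def __init__(self, n_classes: int = 5, embed_dim: int = 64):
        super().__init__()
        self.audio = ModalityEncoder(in_channels=1, embed_dim=embed_dim)    # assumed mono audio
        self.tactile = ModalityEncoder(in_channels=3, embed_dim=embed_dim)  # assumed vibration axes
        self.haptic = ModalityEncoder(in_channels=6, embed_dim=embed_dim)   # assumed force/torque
        self.head = nn.Sequential(
            nn.Linear(3 * embed_dim, 128), nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, audio, tactile, haptic):
        z = torch.cat(
            [self.audio(audio), self.tactile(tactile), self.haptic(haptic)], dim=-1
        )
        return self.head(z)


if __name__ == "__main__":
    model = LateFusionClassifier()
    # One batch of synthetic interaction windows (8 trials, assumed sample counts).
    logits = model(
        torch.randn(8, 1, 4000),  # audio
        torch.randn(8, 3, 200),   # tactile
        torch.randn(8, 6, 200),   # haptic
    )
    print(logits.shape)  # torch.Size([8, 5])
```

Ablating one encoder at a time in a model like this is one simple way to probe the effectiveness of each modality, which is in the spirit of the comparisons described in the abstract.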