Enabling Robotic Dexterous Hands with Human-Like General Manipulation Capabilities
Enabling robotic dexterous hands with human-like general manipulation capabilities has long been a central research goal in embodied AI. However, owing to complex contact dynamics and the underactuated nature of dexterous hands, current manipulation skills are often confined to specific scenarios, objects, or tasks, and a considerable gap remains before truly general skills are achieved. To bridge this gap, we attempt to learn general dexterous manipulation skills from human hand-object interaction motion data and explore a cross-embodiment tracking control paradigm. By capturing extensive motion data of human hand-object interactions, we can learn a generative hand-object interaction motion planner; through a cross-embodiment tracking control scheme, this planner then enables dexterous hands to manipulate novel objects toward novel goals. In this talk, I will share our recent work in three areas: interaction motion data capture, generative interaction planning, and cross-embodiment tracking control, and demonstrate the strong potential of this paradigm for general dexterous manipulation tasks.
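As a rough illustration of the pipeline described above (planner proposes a human hand-object trajectory, retargeting maps it to the robot embodiment, a controller tracks it), the following Python sketch shows one minimal form of the idea. It is not the speaker's implementation: all names, dimensions, and the linear retargeting map are hypothetical placeholders, and the dynamics step stands in for a physics simulator.

    import numpy as np

    # Hypothetical sketch of the cross-embodiment tracking loop: a generative
    # planner proposes a human hand-object trajectory, a retargeting map
    # converts human keypoints to robot joint targets, and a PD controller
    # tracks those targets. All names and dimensions are illustrative.

    N_HUMAN_KEYPOINTS = 21   # typical human hand keypoint count (e.g., MANO)
    N_ROBOT_JOINTS = 16      # e.g., a 16-DoF dexterous hand

    def plan_interaction(horizon):
        """Stand-in for a generative hand-object interaction planner.
        Returns a (horizon, 21, 3) array of human hand keypoint positions."""
        return np.zeros((horizon, N_HUMAN_KEYPOINTS, 3))

    def retarget(keypoints, W):
        """Toy kinematic retargeting: a fixed linear map from flattened human
        keypoints to robot joint angles. Real retargeting would solve an
        optimization respecting the robot hand's kinematics and contacts."""
        return W @ keypoints.reshape(-1)

    def pd_torque(q, dq, q_ref, kp=5.0, kd=0.1):
        """PD tracking controller driving joints toward retargeted targets."""
        return kp * (q_ref - q) - kd * dq

    # --- tracking loop (placeholder unit-inertia dynamics) ---
    rng = np.random.default_rng(0)
    W = 0.01 * rng.standard_normal((N_ROBOT_JOINTS, N_HUMAN_KEYPOINTS * 3))
    q = np.zeros(N_ROBOT_JOINTS)
    dq = np.zeros(N_ROBOT_JOINTS)
    dt = 0.01

    for keypoints_t in plan_interaction(horizon=100):
        q_ref = retarget(keypoints_t, W)
        tau = pd_torque(q, dq, q_ref)
        dq += tau * dt   # a real system would step a physics simulator here
        q += dq * dt

Because the planner operates in the space of human hand-object motion, the same sketch applies to different robot hands by swapping the retargeting map, which is the essence of the cross-embodiment paradigm the talk describes.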