1Peking University
2Galbot
3Beijing Academy of Artificial Intelligence
* equal contributions
† corresponding author
For any grasp type in the GRASP taxonomy, any object, and any articulated hand, our pipeline efficiently synthesizes contact-rich, penetration-free, and physically plausible dexterous grasps, starting from only one human-annotated grasp template per hand and grasp type that specifies an initial hand pose and contact information.
Generalizable dexterous grasping with suitable grasp types is a fundamental skill for intelligent robots. Developing such skills requires a large-scale and high-quality dataset that covers numerous grasp types (i.e., at least those categorized by the GRASP taxonomy), but collecting such data is extremely challenging. Existing automatic grasp synthesis methods are often limited to specific grasp types or object categories, hindering scalability. This work proposes an efficient pipeline capable of synthesizing contact-rich, penetration-free, and physically plausible grasps for any grasp type, object, and articulated hand. Starting from a single human-annotated template for each hand and grasp type, our pipeline tackles the complicated synthesis problem in two stages: first optimize the object to fit the hand template, and then locally refine the hand to fit the object in simulation. To validate the synthesized grasps, we introduce a contact-aware control strategy that allows the hand to apply an appropriate force to the object at each contact point. These validated grasps can also serve as new grasp templates to facilitate future synthesis. Experiments show that our method significantly outperforms previous type-unaware grasp synthesis baselines in simulation. Using our algorithm, we construct a dataset containing 10.7k objects and 9.5M grasps, covering 31 grasp types in the GRASP taxonomy. Finally, we train a type-conditional generative model that successfully performs the desired grasp type from single-view object point clouds, achieving an 82.3% success rate in real-world experiments.
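The two-stage idea from the abstract (fit the object to a fixed hand template, then locally refine the hand) can be illustrated with a toy sketch. Everything here is an illustrative assumption, not the paper's implementation: the object is modeled as a sphere via a signed distance function, the "object pose" is just the sphere center, and the refinement simply slides contact points onto the surface along their normals.

```python
import numpy as np

def sphere_sdf(points, center, radius=1.0):
    """Signed distance from each point to a sphere surface (toy object model)."""
    return np.linalg.norm(points - center, axis=1) - radius

def fit_object_to_template(contacts, center, radius=1.0, lr=0.1, iters=200):
    """Stage 1 (sketch): move the object so its surface meets the template's
    fixed contact points, by gradient descent on the squared SDF residual."""
    for _ in range(iters):
        d = np.linalg.norm(contacts - center, axis=1)
        # d(0.5 * (d - r)^2)/d(center) = (d - r) * (center - contact) / d
        grad = ((d - radius) / np.maximum(d, 1e-9))[:, None] * (center - contacts)
        center = center - lr * grad.mean(axis=0)
    return center

def refine_hand(contacts, center, radius=1.0, step=0.5):
    """Stage 2 (sketch): locally adjust the hand by pushing each contact
    part of the way onto the nearby object surface along the sphere normal."""
    d = sphere_sdf(contacts, center, radius)[:, None]
    normals = (contacts - center) / np.linalg.norm(
        contacts - center, axis=1, keepdims=True)
    return contacts - step * d * normals

# Usage: three template contact points, object initially far from the hand.
contacts = np.array([[1.2, 0.0, 0.0], [0.0, 1.1, 0.0], [0.0, 0.0, 1.3]])
center = np.array([2.0, 2.0, 2.0])
center = fit_object_to_template(contacts, center)   # stage 1
contacts = refine_hand(contacts, center)            # stage 2
residual = np.abs(sphere_sdf(contacts, center)).max()
```

In the actual pipeline this plays out with a full articulated hand model and collision/physics checks in simulation; the sketch only conveys the optimize-then-refine structure.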
@article{chen2025dexonomy,
  title={Dexonomy: Synthesizing All Dexterous Grasp Types in a Grasp Taxonomy},
  author={Chen, Jiayi and Ke, Yubin and Peng, Lin and Wang, He},
  journal={Robotics: Science and Systems},
  year={2025}
}
If you have any questions, please feel free to contact Jiayi Chen at jiayichen@pku.edu.cn, Yubin Ke at 2200013213@stu.pku.edu.cn, or He Wang at hewang@pku.edu.cn.