DexGraspNet: A Large-Scale Robotic Dexterous Grasp Dataset for General Objects Based on Simulation

ICRA 2023


Ruicheng Wang1*    Jialiang Zhang1*    Jiayi Chen1,2    Yinzhen Xu1,2    Puhao Li2,3    Tengyu Liu2    He Wang1†

1Peking University    2Beijing Institute for General Artificial Intelligence    3Tsinghua University   

* Equal contributions    † Corresponding author



A visualization of DexGraspNet. DexGraspNet contains 1.32M ShadowHand grasps on 5355 objects, one order of magnitude more objects and two orders of magnitude more grasps than the previous dataset from DDG. It features diverse types of grasps that cannot be achieved using GraspIt!.


Abstract


Robotic dexterous grasping is the first step toward human-like dexterous object manipulation and is thus a crucial robotic technology. However, dexterous grasping remains much less explored than grasping with parallel grippers, partially due to the lack of a large-scale dataset. In this work, we present a large-scale robotic dexterous grasp dataset, DexGraspNet, generated by our proposed highly efficient synthesis method, which can be applied to any dexterous hand. Our method leverages a deeply accelerated differentiable force closure estimator and can therefore efficiently and robustly synthesize stable and diverse grasps at scale. We choose ShadowHand and generate 1.32 million grasps for 5355 objects, covering more than 133 object categories and containing more than 200 diverse grasps for each object instance, with all grasps validated by the Isaac Gym simulator. Compared to the previous dataset from Liu et al. generated by GraspIt!, our dataset has not only more objects and grasps but also higher diversity and quality. Through cross-dataset experiments, we show that training several dexterous grasp synthesis algorithms on our dataset significantly outperforms training them on the previous one.
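To give intuition for the differentiable force closure estimator mentioned above: a necessary condition for force closure is that unit contact forces along the (inward) surface normals can produce a net wrench of zero, so the norm of that net wrench serves as a differentiable surrogate energy to minimize. The sketch below illustrates only this core idea in NumPy; the function name is hypothetical, and the actual estimator used for DexGraspNet (following Liu et al.'s differentiable force closure) is considerably more elaborate, handling friction cones and being heavily accelerated for large-scale synthesis.

```python
import numpy as np

def force_closure_energy(contact_points, contact_normals):
    """Rough force-closure surrogate: the norm of the net wrench produced
    when each contact pushes with a unit force along its surface normal.
    A value near zero means the contact forces can cancel, which is a
    necessary condition for force closure. (Illustrative sketch only.)"""
    points = np.asarray(contact_points, dtype=float)
    normals = np.asarray(contact_normals, dtype=float)
    # Normalize contact normals to unit length.
    normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    forces = normals                           # unit force at each contact
    torques = np.cross(points, forces)         # torque about the origin
    net_wrench = np.concatenate([forces.sum(axis=0), torques.sum(axis=0)])
    return np.linalg.norm(net_wrench)

# Two antipodal contacts on a unit sphere: forces and torques cancel.
print(force_closure_energy([[1, 0, 0], [-1, 0, 0]],
                           [[-1, 0, 0], [1, 0, 0]]))  # → 0.0
# A single contact cannot cancel its own wrench.
print(force_closure_energy([[1, 0, 0]], [[-1, 0, 0]]))  # → 1.0
```

Because every operation here is differentiable, the same quantity written in an autodiff framework such as PyTorch can be used directly as an energy term when optimizing hand poses by gradient descent.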


Video




Qualitative results


Some diverse grasps on the objects from DexGraspNet.


Some grasps using different dexterous hands. From left to right: ShadowHand, MANO, Allegro.



Citation


@article{wang2022dexgraspnet,
  title={DexGraspNet: A Large-Scale Robotic Dexterous Grasp Dataset for General Objects Based on Simulation},
  author={Wang, Ruicheng and Zhang, Jialiang and Chen, Jiayi and Xu, Yinzhen and Li, Puhao and Liu, Tengyu and Wang, He},
  journal={arXiv preprint arXiv:2210.02697},
  year={2022}
}

Contact


If you have any questions, please feel free to contact Ruicheng Wang at wrc0326@stu.pku.edu.cn, Jialiang Zhang at jackzhang0906@126.com, or He Wang at hewang@pku.edu.cn.