Contact2Grasp: 3D Grasp Synthesis via Hand-Object Contact Constraint


Haoming Li¹
Xinzhuo Lin¹
Yang Zhou²
Xiang Li²
Yuchi Huo¹
Jiming Chen¹
Qi Ye¹

¹Zhejiang University

²OPPO US Research Center

IJCAI 2023

[Paper]
[Poster]


Abstract

3D grasp synthesis generates grasping poses for a given input object. Existing works tackle the problem by learning a direct mapping from objects to distributions of grasping poses. However, because physical contact is sensitive to small changes in pose, the highly nonlinear mapping from 3D object representations to valid poses is considerably non-smooth, leading to poor generation efficiency and restricted generality. To tackle this challenge, we introduce an intermediate variable for grasp contact areas to constrain grasp generation; in other words, we factorize the mapping into two sequential stages under the assumption that grasping poses are fully constrained given contact maps: 1) we first learn contact map distributions to generate potential contact maps for grasps; 2) we then learn a mapping from the contact maps to the grasping poses. Further, we propose a penetration-aware optimization that uses the generated contacts as a consistency constraint for grasp refinement. Extensive validation on two public datasets shows that our method outperforms state-of-the-art methods on grasp generation across various metrics.
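The two-stage factorization described above can be sketched as follows. This is a minimal, hypothetical NumPy illustration, not the paper's implementation: the learned networks are replaced by stand-in linear maps, and all shapes, names (`stage1_contact_map`, `stage2_grasp_pose`, `refine`), and the refinement update are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

N_POINTS = 256   # points sampled on the object surface (assumed)
POSE_DIM = 51    # e.g. a MANO-style hand pose: 45 joint angles + 6 global DoF

def stage1_contact_map(object_points, w1):
    """Stage 1 (stand-in): predict a per-point contact probability in [0, 1]."""
    logits = object_points @ w1              # (N, 3) @ (3, 1) -> (N, 1)
    return 1.0 / (1.0 + np.exp(-logits))     # sigmoid

def stage2_grasp_pose(object_points, contact_map, w2):
    """Stage 2 (stand-in): regress a grasp pose conditioned on the contact map."""
    feats = np.concatenate([object_points, contact_map], axis=1)  # (N, 4)
    return feats.mean(axis=0) @ w2           # pooled feature -> (POSE_DIM,)

def refine(pose, penetration, lam=0.1):
    """Penetration-aware refinement (stand-in): one gradient-like step that
    penalizes penetration while keeping the contact-conditioned pose."""
    return pose - lam * penetration

object_points = rng.standard_normal((N_POINTS, 3))
w1 = rng.standard_normal((3, 1))
w2 = rng.standard_normal((4, POSE_DIM))

contact = stage1_contact_map(object_points, w1)            # contact map
pose = stage2_grasp_pose(object_points, contact, w2)       # grasp pose
pose = refine(pose, penetration=np.zeros(POSE_DIM))        # refinement step
```

The key design point mirrored here is that the pose regressor never sees the object alone; it is always conditioned on a contact map, which is what smooths the otherwise non-smooth object-to-pose mapping.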


Video


BibTeX

@article{li2022contact2grasp,
  title={Contact2Grasp: 3D Grasp Synthesis via Hand-Object Contact Constraint},
  author={Li, Haoming and Lin, Xinzhuo and Zhou, Yang and Li, Xiang and Huo, Yuchi and Chen, Jiming and Ye, Qi},
  journal={arXiv preprint arXiv:2210.09245},
  year={2022}
}

Contact: Haoming Li, Qi Ye