Explore the groundbreaking GraspM3 dataset with visualizations of dexterous hand grasping simulations.
The GraspM3 dataset is a large-scale dataset for dexterous hand grasping, featuring over 8,000 objects and more than 1,000,000 grasping motion trajectories. It includes comprehensive semantic annotations, such as object categories, grasp quality, and contact details. All grasping motions were validated through simulation in NVIDIA Isaac Gym, which supports efficient large-scale parallel computation.
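To make the annotation structure concrete, the following is a minimal Python sketch of what a single grasp record might contain. The field names (`object_category`, `grasp_quality`, `contact_points`, `hand_qpos`) and array shapes are illustrative assumptions, not the dataset's actual schema.

```python
# Hypothetical sketch of one GraspM3-style grasp record.
# Field names and shapes are assumptions for illustration only.
import numpy as np

record = {
    "object_id": "mug_0042",             # assumed identifier format
    "object_category": "mug",            # semantic annotation: object category
    "grasp_quality": 0.87,               # semantic annotation: grasp quality score
    "contact_points": np.zeros((5, 3)),  # contact details, e.g. per-finger contact locations
    "hand_qpos": np.zeros((120, 24)),    # motion trajectory: T timesteps x hand DoFs
}

def summarize(rec):
    """Print a brief summary of a single grasp trajectory."""
    steps, dofs = rec["hand_qpos"].shape
    print(f"{rec['object_id']} ({rec['object_category']}): "
          f"{steps} steps, {dofs} DoFs, quality={rec['grasp_quality']:.2f}")

summarize(record)
```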
This section demonstrates batched grasping simulations, in which multiple objects and dexterous hands are simulated simultaneously; a minimal setup sketch follows.
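The sketch below shows how many parallel Isaac Gym environments can be created for this kind of batched simulation, each intended to hold one object and one hand. The environment count, spacing, and simulation parameters are illustrative assumptions, not GraspM3's actual configuration.

```python
# Minimal sketch of spinning up parallel Isaac Gym environments for
# batched grasp simulation; counts and parameters are illustrative.
from isaacgym import gymapi

gym = gymapi.acquire_gym()

sim_params = gymapi.SimParams()
sim_params.dt = 1.0 / 60.0
sim = gym.create_sim(0, 0, gymapi.SIM_PHYSX, sim_params)

num_envs = 64          # assumed environment count for illustration
envs_per_row = 8
spacing = 1.0
lower = gymapi.Vec3(-spacing, 0.0, -spacing)
upper = gymapi.Vec3(spacing, spacing, spacing)

envs = []
for _ in range(num_envs):
    # Each env would hold one object and one dexterous hand actor.
    env = gym.create_env(sim, lower, upper, envs_per_row)
    envs.append(env)

# Step all environments in parallel on the physics backend.
for _ in range(300):
    gym.simulate(sim)
    gym.fetch_results(sim, True)
```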
This section showcases detailed visualizations of single-object grasping, highlighting the interaction between the dexterous hand and individual objects.
Interactive HTML visualizations allow users to explore the simulated grasping processes directly in the browser, including the MANO hand and the Shadow Hand (top and bottom rows, respectively).