Abstract
We present DexCanvas, a large-scale hybrid real-synthetic human manipulation
dataset containing 7,000 hours of dexterous hand-object interactions seeded
from 70 hours of real human demonstrations, organized across 21 fundamental
manipulation types based on the Cutkosky taxonomy. Each entry combines
synchronized multi-view RGB-D, high-precision mocap with MANO hand parameters,
and per-frame contact points with physically consistent force profiles. Our
real-to-sim pipeline uses reinforcement learning to train policies that control
an actuated MANO hand in physics simulation, reproducing human demonstrations
while discovering the underlying contact forces that generate the observed
object motion. DexCanvas is the first manipulation dataset to combine
large-scale real demonstrations, systematic skill coverage based on established
taxonomies, and physics-validated contact annotations. The dataset can
facilitate research in robotic manipulation learning, contact-rich control, and
skill transfer across different hand morphologies.
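The abstract describes each entry as combining MANO hand parameters with per-frame contact points and force profiles. As a rough illustration only, a single frame annotation might be organized along these lines; the field names and layout here are assumptions for illustration, not the released DexCanvas schema (MANO's standard 48 pose and 10 shape parameters are the only fixed sizes):

```python
# Hypothetical sketch of a per-frame DexCanvas-style annotation; field names
# and structure are assumptions, not the dataset's actual schema.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ContactPoint:
    """One hand-object contact with a physically consistent force."""
    position: Tuple[float, float, float]   # contact location in object frame (m)
    force: Tuple[float, float, float]      # contact force vector (N)

@dataclass
class DexCanvasFrame:
    """Per-frame annotation combining mocap, hand model, and contacts."""
    timestamp: float                        # seconds since sequence start
    mano_pose: List[float]                  # 48 axis-angle MANO pose parameters
    mano_shape: List[float]                 # 10 MANO shape (beta) parameters
    object_pose: List[float]                # 7-DoF object pose (xyz + quaternion)
    contacts: List[ContactPoint] = field(default_factory=list)

# Example: one frame with a single fingertip contact pressing down with 1.5 N
frame = DexCanvasFrame(
    timestamp=0.033,
    mano_pose=[0.0] * 48,
    mano_shape=[0.0] * 10,
    object_pose=[0.0, 0.0, 0.1, 1.0, 0.0, 0.0, 0.0],
    contacts=[ContactPoint(position=(0.01, 0.0, 0.02), force=(0.0, 0.0, 1.5))],
)
print(len(frame.mano_pose))  # 48
```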
Authors (12)
Xinyue Xu
Jieqiang Sun
Jing (Daisy) Dai
Siyuan Chen
Lanjie Ma
+6 more
Submitted
October 17, 2025
Key Contributions
DexCanvas is a large-scale hybrid real-synthetic dataset for dexterous manipulation, containing 7,000 hours of interactions derived from 70 hours of human demonstrations across 21 manipulation types. It uniquely combines multi-view RGB-D, high-precision mocap with MANO hand parameters, and physically consistent contact force profiles. The paper also presents a real-to-sim pipeline using RL to train policies for an actuated MANO hand in simulation, reproducing human demonstrations and discovering contact forces.
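The real-to-sim pipeline trains RL policies that drive an actuated MANO hand to reproduce human demonstrations in simulation. A common way to score such tracking is an exponentiated-error reward; the sketch below assumes this form, and the specific error terms and weights are illustrative, not taken from the paper:

```python
# Hypothetical demonstration-tracking reward of the kind used in
# real-to-sim RL pipelines; weights and error terms are assumptions.
import math

def tracking_reward(sim_obj_pos, demo_obj_pos, sim_hand_q, demo_hand_q,
                    w_obj=5.0, w_hand=1.0):
    """Reward a policy for reproducing the demonstrated object and hand motion.

    sim_obj_pos / demo_obj_pos: (x, y, z) object positions in metres.
    sim_hand_q / demo_hand_q: hand joint angles in radians.
    """
    obj_err = math.dist(sim_obj_pos, demo_obj_pos)
    hand_err = sum((a - b) ** 2 for a, b in zip(sim_hand_q, demo_hand_q))
    # Exponentiated errors keep the reward in (0, 1] and peak at exact tracking.
    return math.exp(-w_obj * obj_err) * math.exp(-w_hand * hand_err)

# Perfect tracking of the demonstration yields the maximum reward of 1.0
r = tracking_reward((0.0, 0.0, 0.1), (0.0, 0.0, 0.1), [0.1, 0.2], [0.1, 0.2])
print(r)  # 1.0
```

Maximizing a reward of this shape pushes the simulated hand to match the observed object motion, and the contact forces the physics engine must apply to achieve it become the dataset's force annotations.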
Business Value
Accelerates the development of robots capable of complex manipulation tasks, opening the door to automation in manufacturing, logistics, and domestic assistance.