In this work, we introduce DROID (Distributed Robot Interaction Dataset), a diverse robot manipulation dataset comprising 76k demonstration trajectories (350 hours of interaction data), collected across 564 scenes and 86 tasks by 50 data collectors in North America, Asia, and Europe over the course of 12 months.
A separate dataset, jointly created by the University of Genoa and the Italian Institute of Technology, focuses on how object‑specific characteristics influence human grasping and placing actions. It contains 1,200 pick‑and‑place actions performed by 15 participants, captured with a multimodal setup comprising multiple cameras, a motion‑capture system, and wrist‑mounted inertial measurement units. By analyzing human kinematics, the dataset supports research on intent recognition and motion generation for object interaction in robotics.