The OCID Grasp dataset was created by the Institute of Computer Graphics and Vision at Graz University of Technology, Austria. It extends the original OCID dataset with 1,763 RGB‑D images carrying over 11.4 k object segmentation masks and more than 75 k manually annotated grasp candidates, with each object assigned to one of 31 categories. By combining semantic segmentation with grasp annotation, the dataset supports research on robot grasp detection in cluttered scenes.
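Grasp candidates in datasets of this kind are commonly stored as oriented rectangles, one rectangle per grasp, given as four (x, y) corner points. The sketch below (the exact file layout is an assumption, not confirmed by the source) parses such Cornell-style annotations and reduces each rectangle to the center, angle, and width representation typically used to train grasp detectors.

```python
import math

def parse_grasp_rects(lines):
    """Parse Cornell-style grasp annotations: every 4 consecutive
    lines give the (x, y) corners of one grasp rectangle."""
    coords = [tuple(map(float, ln.split())) for ln in lines if ln.strip()]
    grasps = []
    for i in range(0, len(coords) - 3, 4):
        c = coords[i:i + 4]
        cx = sum(p[0] for p in c) / 4.0   # rectangle center
        cy = sum(p[1] for p in c) / 4.0
        # Orientation and opening width from the first edge of the rectangle.
        dx, dy = c[1][0] - c[0][0], c[1][1] - c[0][1]
        grasps.append({
            "center": (cx, cy),
            "angle": math.atan2(dy, dx),
            "width": math.hypot(dx, dy),
        })
    return grasps

# Example: one axis-aligned rectangle, 40 px wide along x.
sample = ["100 100", "140 100", "140 120", "100 120"]
print(parse_grasp_rects(sample))
# → [{'center': (120.0, 110.0), 'angle': 0.0, 'width': 40.0}]
```

Detectors are then trained to regress these center/angle/width triples rather than the raw corner points.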
ACRONYM is a robot grasp planning dataset jointly created by NVIDIA and the University of Washington. It comprises 17.744 million parallel‑jaw grasp samples over 8,872 objects from ShapeNetSem, spanning 262 categories. Grasps were sampled in simulation and executed in NVIDIA's FleX physics engine in a gravity‑free environment, with each attempt recorded as a success or failure, yielding dense, physically grounded grasp labels. ACRONYM is intended to advance robotic grasping in complex settings, particularly as training data for learning‑based grasp algorithms targeting precise real‑world grasping.
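Conceptually, each ACRONYM sample pairs a 6‑DoF gripper pose (a 4×4 transform) with a binary simulated outcome. The record layout below is illustrative only (the released files are not in this format); it sketches the typical first step of consuming such data, splitting positives from negatives to build a training set for a learned grasp‑quality model.

```python
# Illustrative ACRONYM-style records: a 4x4 gripper pose plus a
# binary success label from physics simulation. Field names are
# assumptions for this sketch, not the official schema.

def identity_pose():
    """A placeholder 4x4 homogeneous transform (identity)."""
    return [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]

samples = [
    {"transform": identity_pose(), "success": True},
    {"transform": identity_pose(), "success": False},
    {"transform": identity_pose(), "success": True},
]

def split_by_outcome(samples):
    """Separate successful and failed grasps, e.g. to balance
    positives and negatives when training a grasp classifier."""
    pos = [s for s in samples if s["success"]]
    neg = [s for s in samples if not s["success"]]
    return pos, neg

pos, neg = split_by_outcome(samples)
print(len(pos), len(neg))  # → 2 1
```

Keeping the failed grasps is deliberate: negative samples are what let a learned model discriminate good poses from near misses.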