Explore high-quality datasets for your AI and machine learning projects.
HOI-dataset is a depth‑map dataset for hand‑part segmentation during hand–object interaction; download links are provided for the training and validation sets.
BEHAVE is a dataset that captures full‑body human‑object interactions in natural environments. It provides multi‑view RGB‑D frames together with corresponding 3D SMPL and object fittings, as well as annotated contacts between them.
The dataset comprises 38 sets of 30‑view RGB‑D video sequences, each accompanied by camera parameters, foreground masks, SMPL models, and assorted point‑cloud and mesh files. Every video is recorded at 4K resolution and 25 FPS and lasts between 1 and 19 seconds. All 30 viewpoints were captured with Azure Kinect devices in the same surrounding scene.
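The capture specs above (25 FPS, 30 views, clips of 1–19 seconds) make it easy to estimate how many frames a given sequence contains. A minimal sketch, using only the figures stated in the text (the constant names are illustrative, not part of any BEHAVE tooling):

```python
# Back-of-the-envelope sizing from the stated BEHAVE capture specs.
# Assumed figures, taken directly from the description above:
FPS = 25            # frames per second per camera
VIEWS = 30          # number of Azure Kinect viewpoints
MIN_SECONDS = 1     # shortest clip duration
MAX_SECONDS = 19    # longest clip duration

def frames_per_video(seconds: int, fps: int = FPS) -> int:
    """Frames in a single-view video of the given length."""
    return seconds * fps

# Per-view frame counts at the stated extremes.
print(frames_per_video(MIN_SECONDS))          # 25 frames (1 s clip)
print(frames_per_video(MAX_SECONDS))          # 475 frames (19 s clip)

# Each timestamp yields one frame per camera, so a full 19 s
# multi-view recording contains up to:
print(frames_per_video(MAX_SECONDS) * VIEWS)  # 14250 frames
```

This is only a rough upper bound per sequence; actual frame counts depend on each clip's exact duration.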
The MC‑EIU dataset, created by Inner Mongolia University and partner institutions, is a comprehensive multimodal dialogue dataset for joint emotion and intent understanding. It contains 4,970 dialogue video clips (56,012 utterances) covering 7 emotions and 9 intents, with text, acoustic, and visual modalities in both English and Mandarin. The dataset was built through data collection, preprocessing, and multi‑round annotation to ensure quality and diversity. MC‑EIU targets human‑computer interaction research, helping conversational systems better understand human needs and respond with empathy.