NEFER
We provide the NEFER dataset for neuromorphic event‑driven facial expression recognition. It consists of paired RGB and event‑camera videos of human faces, annotated with emotions, facial bounding boxes, and landmarks. Volunteers were recorded while viewing stimulus videos, each labeled with one of the seven universal emotions defined by Paul Ekman. Annotation followed a one‑annotator, two‑reviewer, one‑clinical‑expert verification workflow.
Description
Dataset Overview
Name: NEFER
Purpose: Neuromorphic event‑driven facial expression recognition
Content: Paired RGB and event videos of human faces, annotated with corresponding emotions and facial bounding boxes and landmarks.
Emotion Labels:
- Disgust
- Contempt
- Happiness
- Fear
- Anger
- Surprise
- Sadness
Additional Label: "None" for instances where volunteers perceived no emotion.
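For training a classifier on these labels, a fixed label‑to‑index mapping is typically needed. The ordering below simply follows the list above and is an assumption, not something specified by the dataset documentation:

```python
# Hypothetical label-to-index mapping; the ordering is an assumption,
# not taken from the NEFER documentation.
EMOTIONS = ["Disgust", "Contempt", "Happiness", "Fear",
            "Anger", "Surprise", "Sadness", "None"]  # "None" = no emotion perceived
LABEL_TO_ID = {name: i for i, name in enumerate(EMOTIONS)}
ID_TO_LABEL = {i: name for name, i in LABEL_TO_ID.items()}
```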
Dataset Structure:
- event_raw: raw event‑camera video folders
- event_frames: event frames obtained using time‑binary encoding [2]
- rgb_frames: RGB video frame folders
- annotations: multiple CSV files for training and validation, corresponding to the RGB and event data (expected emotions). Each file also has a "subjective" version containing the emotions reported by the users.
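One plausible reading of "time‑binary encoding" is to split the accumulation window into N temporal bins and set one bit per pixel for each bin that received at least one event. The sketch below illustrates that idea; the exact scheme used to produce the event_frames folder may differ, and the function name and signature are hypothetical:

```python
import numpy as np

def time_binary_encode(xs, ys, ts, height, width, n_bits=8):
    """Encode an event stream (pixel coords xs, ys and timestamps ts)
    into a single frame.

    The time span of the events is split into `n_bits` equal bins;
    bit i of a pixel's value is set if at least one event at that pixel
    fell into bin i. This is an assumed interpretation of NEFER's
    "time-binary encoding", not a confirmed implementation.
    """
    frame = np.zeros((height, width), dtype=np.uint8)
    t0, t1 = ts.min(), ts.max()
    span = max(int(t1 - t0), 1)  # avoid division by zero for a single timestamp
    # Map each event's timestamp to a bin index in [0, n_bits - 1].
    bins = np.minimum((ts - t0) * n_bits // span, n_bits - 1).astype(np.uint8)
    # OR each event's bit into its pixel; .at handles repeated indices correctly.
    np.bitwise_or.at(frame, (ys, xs), (1 << bins).astype(np.uint8))
    return frame
```

With 8 bins the result is a compact uint8 frame per window, which can be fed to a standard CNN like an ordinary grayscale image.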
Training Set Users: [01, 02, 04, 05, 06, 08, 09, 10, 11, 12, 13, 14, 15, 16, 21, 22, 23, 24, 25, 26]
Validation Set Users: [03, 07, 17, 19, 27, 28]
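The user‑level split above can be applied when reading the annotation CSVs. The sketch below assumes each row carries a `user` column identifying the volunteer; NEFER's actual CSV schema may use different column names:

```python
import csv

# Fixed subject-level split as listed in the dataset description.
TRAIN_USERS = {"01", "02", "04", "05", "06", "08", "09", "10", "11", "12",
               "13", "14", "15", "16", "21", "22", "23", "24", "25", "26"}
VAL_USERS = {"03", "07", "17", "19", "27", "28"}

def split_rows(csv_path):
    """Partition annotation rows into train/validation by user ID.

    Assumes a `user` column in the CSV; the real NEFER annotation
    schema may differ.
    """
    train, val = [], []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            uid = row["user"].zfill(2)  # normalize "1" -> "01"
            if uid in TRAIN_USERS:
                train.append(row)
            elif uid in VAL_USERS:
                val.append(row)
    return train, val
```

Splitting by subject rather than by clip avoids identity leakage between the training and validation sets.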
Additional Annotations: Facial landmarks and bounding boxes will be provided.
Download Link: Google Drive
Source
Organization: GitHub
Created: 4/13/2023