Explore high-quality datasets for your AI and machine learning projects.
We provide the NEFER dataset for neuromorphic event-driven facial expression recognition. It consists of paired RGB and event-camera videos of human faces, annotated with the corresponding emotion, facial bounding boxes, and landmarks. Volunteers were recorded while viewing stimulus videos, and each recording is labeled with one of the seven universal emotions defined by Paul Ekman. Annotation followed a verification workflow with one annotator, two reviewers, and one clinical expert. The data are intended for training AI models for automatic facial expression recognition.
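As a minimal sketch of how the seven-class label space might be represented in code, the mapping below lists Ekman's six basic emotions plus contempt; the exact label names and index order used by NEFER are assumptions here, so check the dataset documentation before relying on them.

```python
# Hypothetical label mapping for the seven universal emotions
# (Ekman's six basic emotions plus contempt). The actual label
# names and index order used by NEFER may differ.
EMOTIONS = [
    "anger", "contempt", "disgust", "fear",
    "happiness", "sadness", "surprise",
]

# Look up an integer class index from an emotion name.
LABEL_TO_INDEX = {name: i for i, name in enumerate(EMOTIONS)}


def index_to_label(idx: int) -> str:
    """Map an integer class index back to its emotion name."""
    return EMOTIONS[idx]
```

A fixed, documented ordering like this keeps class indices stable between training runs and evaluation scripts.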