Dataset Name | Brief Description |
--- | --- |
TILES-2018 | A longitudinal multimodal corpus of physiological and behavioral data from hospital clinical providers (N=212 over 10 weeks) for understanding stress, anxiety, well-being, etc. |
TILES-2019 | A longitudinal multimodal corpus of physiological, behavioral, and social interaction data from intensive care unit medical residents (N=57 over 3 weeks) for understanding stress, anxiety, well-being, etc. |
CEAP-360VR | Continuous Physiological and Behavioral Emotion Annotation Dataset for 360° Videos. 32 participants watched eight one-minute 360° video clips; the data includes physiological responses, head/eye movements, continuous valence-arousal annotations, and questionnaires (SSQ, IPQ, NASA-TLX). |
DAIC-WOZ | Clinical interviews designed to support the diagnosis of psychological distress conditions such as anxiety, depression, and PTSD. Data includes audio/video recordings and survey responses. |
ICT Rapport | Dyadic audio/visual interactions between human participants and either human or virtual agents. Data includes camera and microphone recordings of the human participants, demographics, and their ratings of agent performance, trust, and likeability, among others. |
GFT | Includes 172,800 video frames from 96 participants in 32 three-person groups during unscripted interactions. Data includes expert annotations of FACS occurrence and intensity, facial landmark tracking, and baseline results for various models. |
DynAMoS | The Dynamic Affective Movie Clip Database for Subjectivity Analysis (DynAMoS) is a curated collection of affective movie clips, accompanying metadata, and emotional-content ratings from a large group of anonymous research participants. |
EMAP | The Emotional Arousal Pattern (EMAP) dataset provides psychophysiological responses to affective stimuli. It includes neuro- and peripheral physiological data, as well as valence and arousal ratings from 145 participants. The dataset encompasses 3,434 trials with moment-by-moment emotional intensity ratings, along with EEG, heart rate, respiration, skin conductance, and blood volume measures. |
POPANE | The Psychophysiology of Positive and Negative Emotions (POPANE) database is a large, comprehensive psychophysiological dataset of elicited emotions. It comprises recordings of 1,157 cases from healthy individuals (895 participated in a single session and 122 in several sessions), collected across seven studies, pairing a continuous record of self-reported affect with several biosignals: electrocardiogram, impedance cardiogram, electrodermal activity, hemodynamic measures (e.g., blood pressure), respiration trace, and skin temperature. The studies experimentally elicited a wide range of positive and negative emotions, including amusement, anger, disgust, excitement, fear, gratitude, sadness, tenderness, and threat. |
Emognition Wearable Dataset 2020 | The Emognition dataset supports testing methods for emotion recognition (ER) from physiological responses and facial expressions. Data were collected from 43 participants who watched short film clips eliciting nine discrete emotions: amusement, awe, enthusiasm, liking, surprise, anger, disgust, fear, and sadness. Three wearables recorded physiological signals (EEG, BVP (2x), HR, EDA, SKT, ACC (3x), and GYRO (2x)) in parallel with upper-body video. After each film clip, participants completed two self-reports: (1) ratings of the nine discrete emotions and (2) ratings on three affective dimensions: valence, arousal, and motivation. The data support various ER approaches, e.g., multimodal ER, EEG- vs. cardiovascular-based ER, and transitions from discrete to dimensional representations. Technical validation indicated that the film clips elicited the targeted emotions and that the recorded signals were of high quality. |
GameVibe | GameVibe is a multimodal affect corpus featuring in-game behavioral observations and third-person affect traces for viewer engagement, supporting research in affective computing and streaming analytics. |
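Several of the datasets above pair continuously sampled physiological signals (e.g., EDA, heart rate, EEG) with continuous affect annotations recorded at a different rate. A recurring preprocessing step is aligning the two streams onto a common timeline. Below is a minimal Python sketch of that step using linear interpolation; the sampling rates, variable names, and values are illustrative assumptions, not the schema or API of any dataset listed here.

```python
import numpy as np

def align_annotations(signal_t, ann_t, ann_values):
    """Interpolate a continuous affect annotation trace (e.g., a
    valence or arousal rating stream) onto the timestamps of a
    faster physiological signal such as EDA or heart rate."""
    return np.interp(signal_t, ann_t, ann_values)

# Hypothetical example: a 4 Hz EDA stream and a 1 Hz arousal trace.
eda_t = np.arange(0.0, 10.0, 0.25)         # 4 Hz signal timestamps (s)
ann_t = np.arange(0.0, 10.0, 1.0)          # 1 Hz annotation timestamps (s)
arousal = np.sin(ann_t / 3.0)              # fake arousal values in [-1, 1]

arousal_at_eda = align_annotations(eda_t, ann_t, arousal)
assert arousal_at_eda.shape == eda_t.shape  # one annotation value per sample
```

Linear interpolation is only one reasonable choice; step-wise (zero-order hold) resampling or windowed averaging may better match how a given dataset's annotations were collected, so consult each dataset's paper before deciding.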
Share your affective computing dataset with the AAAC community
Register your affective computing dataset with us, improve its visibility, and share with the AAAC community. Simply email contact@aaac.world with the following information:
- A brief description (200 characters or fewer) of the dataset, highlighting its relevance to affective computing.
- A link to the dataset's homepage. The dataset's published paper should be easily accessible from this homepage.
For any problems, such as broken links or requests to update or remove your content, email help@aaac.world.