To facilitate the advancement of affective computing, datasets, software, and courses promoted by members of the AAAC community are listed below. Interested in sharing your content with AAAC? Click a tab below and scroll to the bottom for instructions on sharing.

Dataset Name Brief Description
TILES-2018 A longitudinal multimodal corpus of physiological/behavioral data from clinical providers (N=212 over 10 weeks) in a hospital for understanding stress, anxiety, well-being, etc.
TILES-2019 A longitudinal multimodal corpus of physiological, behavioral, and social interaction data from intensive care unit medical residents (N=57 over 3 weeks) for understanding stress, anxiety, well-being, etc.
CEAP 360VR Continuous Physiological and Behavioral Emotion Annotation Dataset for 360° Videos. 32 participants watched eight one-minute 360° video clips, and data includes physiological responses, head/eye movements, continuous valence-arousal annotations, and questionnaires (SSQ, IPQ, NASA-TLX).
DAIC-WOZ Clinical interviews designed to support the diagnosis of psychological distress conditions such as anxiety, depression, and PTSD. Data includes audio/video recordings and survey responses.
ICT Rapport Dyadic audio/visual communications between humans and either human or virtual agents. Data includes camera and microphone recordings of human participants, demographics, and their ratings of agent performance, trust, and likeability, among other measures.
GFT Includes 172,800 video frames from 96 participants in 32 three-person groups during unscripted interactions. Data includes expert annotations of FACS occurrence and intensity, facial landmark tracking, and baseline results for various models.
DynAMoS The Dynamic Affective Movie Clip Database for Subjectivity Analysis (DynAMoS) is a curated collection of affective movie clips, along with clip metadata and ratings of the clips' emotional content from a large group of anonymous research participants.
EMAP The Emotional Arousal Pattern (EMAP) dataset provides psychophysiological responses to affective stimuli. It includes neuro- and peripheral physiological data, as well as valence and arousal ratings from 145 participants. The dataset encompasses 3,434 trials with moment-by-moment emotional intensity ratings, along with EEG, heart rate, respiration, skin conductance, and blood volume measures.
POPANE The Psychophysiology of Positive and Negative Emotions (POPANE) database is a large and comprehensive psychophysiological dataset on elicited emotions. The database comprises recordings of 1,157 cases from healthy individuals (895 individuals participated in a single session and 122 in several sessions), collected across seven studies. Each recording pairs a continuous record of self-reported affect with several biosignals: electrocardiogram, impedance cardiogram, electrodermal activity, hemodynamic measures (e.g., blood pressure), respiration trace, and skin temperature. We experimentally elicited a wide range of positive and negative emotions, including amusement, anger, disgust, excitement, fear, gratitude, sadness, tenderness, and threat.
Emognition Wearable Dataset 2020 The Emognition dataset is dedicated to testing methods for emotion recognition (ER) from physiological responses and facial expressions. We collected data from 43 participants who watched short film clips eliciting nine discrete emotions: amusement, awe, enthusiasm, liking, surprise, anger, disgust, fear, and sadness. Three wearables recorded physiological data — EEG, BVP (2x), HR, EDA, SKT, ACC (3x), and GYRO (2x) — in parallel with upper-body videos. After each film clip, participants completed two types of self-reports: (1) ratings of the nine discrete emotions and (2) ratings of three affective dimensions: valence, arousal, and motivation. The obtained data facilitates various ER approaches, e.g., multimodal ER, EEG- vs. cardiovascular-based ER, and transitions from discrete to dimensional representations. The technical validation indicated that watching the film clips elicited the targeted emotions and confirmed the high quality of the signals.
GameVibe GameVibe is a multimodal affect corpus featuring in-game behavioral observations and third-person affect traces for viewer engagement, supporting research in affective computing and streaming analytics.
Share your affective computing dataset with the AAAC community

Register your affective computing dataset with us to improve its visibility and share it with the AAAC community. Simply email contact@aaac.world with the following information:

  • A brief description (about 200 characters) of the dataset, highlighting its relevance to affective computing.
  • A link to the dataset's homepage. The dataset's published paper should be clearly and easily accessible from this homepage.

For any problems, such as broken links, or to request updates or removal of your content, email help@aaac.world.

Software Name Brief Description
CARMA CARMA is a media annotation program that collects continuous (moment-by-moment) ratings while displaying audio and video files. It is designed to be highly user-friendly and easily customizable, and it supports annotation using a computer mouse, keyboard, or joystick.
DARMA DARMA is a media annotation program that collects continuous (moment-by-moment) ratings while displaying audio and video files. It is designed to be highly user-friendly and easily customizable, and it supports annotations using a joystick.
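Tools like CARMA and DARMA produce moment-by-moment rating traces that are usually resampled to a shared timeline and averaged across annotators before analysis. The sketch below (plain Python, no external dependencies) illustrates that common post-processing step; the trace format and sampling rates are illustrative assumptions, not the tools' actual export format.

```python
# Hypothetical post-processing of continuous (moment-by-moment) ratings,
# e.g. as produced by annotation tools such as CARMA or DARMA.
# The trace format below is an illustrative assumption, not the real output.

def resample(trace, src_hz, dst_hz):
    """Resample a rating trace (list of floats) via nearest-neighbor lookup."""
    n_out = int(len(trace) * dst_hz / src_hz)
    return [trace[min(int(i * src_hz / dst_hz), len(trace) - 1)]
            for i in range(n_out)]

def average_annotators(traces):
    """Element-wise mean across annotators (traces share one timeline)."""
    length = min(len(t) for t in traces)
    return [sum(t[i] for t in traces) / len(traces) for i in range(length)]

# Two annotators rated the same clip at different sampling rates.
a = [1.0, 2.0, 3.0, 4.0]                        # 2 Hz trace
b = [1.0, 1.0, 2.0, 2.0, 3.0, 3.0, 4.0, 4.0]    # 4 Hz trace

b_2hz = resample(b, src_hz=4, dst_hz=2)  # -> [1.0, 2.0, 3.0, 4.0]
consensus = average_annotators([a, b_2hz])
print(consensus)                          # -> [1.0, 2.0, 3.0, 4.0]
```

In practice, more robust aggregation (e.g., weighting annotators by inter-rater reliability) is often preferred over a simple mean.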
Share your affective computing software with the AAAC community

Register your affective computing software with us to improve its visibility and share it with the AAAC community. Simply email contact@aaac.world with the following information:

  • A brief description (about 200 characters) of the software or source code, highlighting its relevance to affective computing.
  • A link to the software or source code homepage. Any published papers associated with the software/code should be clearly and easily accessible from this page.

For any problems, such as broken links, or to request updates or removal of your content, email help@aaac.world.

Course Name Brief Description
Affective Computing Authored by Prof. Jainendra Shukla (IIIT Delhi) and Prof. Abhinav Dhall (IIT Ropar). Provides an overview of emotion theory, computational modeling of emotions, multimodal emotion analysis, related machine learning and/or signal processing techniques, and ethical implications.
Affective Computing: An Interdisciplinary Approach Authored by Prof. Jonathan Gratch (USC). Provides a comprehensive and interdisciplinary introduction to Affective Computing: i.e., computing that relates to, arises from, or deliberately influences emotions. Suitable for non-computer science students with some familiarity with computational methods.
Share your affective computing course with the AAAC community

Register your affective computing course with us to improve its visibility and share it with the AAAC community. Simply email contact@aaac.world with the following information:

  • A brief description (about 200 characters) of the course or course materials, including its relevance to affective computing. Both individual lectures and full curricula are acceptable.
  • A link to the course content homepage or download page. The content should have been used for outreach or to teach students at least once, and the instructor should have solicited feedback. If feedback was poor, we kindly request that you wait to register the course until an improved version is available.

For any problems, such as broken links, or to request updates or removal of your content, email help@aaac.world.