Emotion Recognition


This area focuses on developing techniques for emotion recognition from multiple modalities, such as facial expressions, speech, and body movements. These techniques enable a variety of applications, including fake-media detection, understanding cognitive engagement, and building socially aware robots that navigate and interact with humans.
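A common way to combine cues from multiple modalities is late (decision-level) fusion, where each modality produces class probabilities that are then merged. The sketch below illustrates this idea with weighted averaging; the modality names, weights, and scores are illustrative assumptions, not values taken from any of the papers listed here.

```python
# Minimal sketch of late (decision-level) fusion for multimodal emotion
# recognition. All numbers and modality names are illustrative.

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse_predictions(modality_scores, weights=None):
    """Combine per-modality class probabilities by weighted averaging.

    modality_scores: dict mapping modality name -> list of class
    probabilities (one per emotion, in the order of EMOTIONS).
    weights: optional dict mapping modality name -> reliability weight.
    """
    if weights is None:
        weights = {m: 1.0 for m in modality_scores}
    total = sum(weights[m] for m in modality_scores)
    fused = [0.0] * len(EMOTIONS)
    for m, scores in modality_scores.items():
        w = weights[m] / total
        for i, s in enumerate(scores):
            fused[i] += w * s
    return fused

def predict(modality_scores, weights=None):
    fused = fuse_predictions(modality_scores, weights)
    return EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]

# Example: face is confident about "happy", speech leans "neutral",
# gait weakly agrees with face; fusion resolves to "happy".
scores = {
    "face":   [0.70, 0.05, 0.05, 0.20],
    "speech": [0.30, 0.10, 0.10, 0.50],
    "gait":   [0.40, 0.20, 0.10, 0.30],
}
print(predict(scores))  # "happy"
```

In practice, the projects below replace this simple averaging with learned fusion (e.g., multiplicative or attention-based combinations), but the decision-level structure is the same.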


Project | Conference/Journal | Year
INTENT-O-METER: Determining Perceived Human Intent in Multimodal Social Media Posts using Theory of Reasoned Action | Under review | 2023
Video Manipulations Beyond Faces: A Dataset with Human-Machine Analysis | WACV Workshops | 2023
Show Me What I Like: Detecting User-Specific Video Highlights Using Content-Based Multi-Head Attention | ACM Multimedia | 2022
3MASSIV: Multilingual, Multimodal and Multi-Aspect Dataset of Social Media Short Videos | CVPR | 2022
Multimodal Emotion Recognition using Transfer Learning from Speaker Recognition and BERT-based Models | Odyssey | 2022
Learning Unseen Emotions from Gestures via Semantically-Conditioned Zero-Shot Perception with Adversarial Autoencoders | AAAI | 2022
DeepTMH: Multimodal Semi-Supervised Framework Leveraging Affective and Cognitive Engagement for Telemental Health | arXiv | 2021
HighlightMe: Detecting Highlights from Human-Centric Videos | ICCV | 2021
Affect2MM: Affective Analysis of Multimedia Content Using Emotion Causality | CVPR | 2021
Dynamic Graph Modeling of Simultaneous EEG and Eye-tracking Data for Reading Task Identification | ICASSP | 2021
Emotions Don't Lie: A Deepfake Detection Method using Audio-Visual Affective Cues | ACM Multimedia | 2020
Take an Emotion Walk: Perceiving Emotions from Gaits Using Hierarchical Attention Pooling and Affective Mapping | ECCV | 2020
EmotiCon: Context-Aware Multimodal Emotion Recognition using Frege's Principle | CVPR | 2020
M3ER: Multiplicative Multimodal Emotion Recognition Using Facial, Textual, and Speech Cues | AAAI | 2020
STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits | AAAI | 2020
EVA: Generating Emotional Behavior of Virtual Agents using Expressive Features of Gait and Gaze | ACM SAP | 2019
Identifying Emotions from Walking using Affective and Deep Features | arXiv | 2019
The Emotionally Intelligent Robot: Improving Social Navigation in Crowded Environments | IROS | 2019
Data-Driven Modeling of Group Entitativity in Virtual Environments | VRST | 2018
Classifying Group Emotions for Socially-Aware Autonomous Vehicle Navigation | CVPR Workshops | 2018
Aggressive, Tense or Shy? Identifying Personality Traits from Crowd Videos | IJCAI | 2017
SocioSense: Robot Navigation Amongst Pedestrians with Social and Psychological Constraints | IROS | 2017