Affective Agents in AR/VR


Overview

This area focuses on generating virtual agents with appropriate emotional expressiveness for a variety of human-agent interactions in social contexts, such as sharing and navigating the same space, holding conversations, and engaging human audiences.


Publications

Project | Conference/Journal | Year
Speech2AffectiveGestures: Synthesizing Co-Speech Gestures with Generative Adversarial Affective Expression Learning | ACM Multimedia | 2021
Text2Gestures: A Transformer-Based Network for Generating Emotive Body Gestures for Virtual Agents | IEEE VR | 2021
Generating Emotive Gaits for Virtual Agents Using Affect-Based Autoregression | ISMAR | 2020
FVA: Modeling Perceived Friendliness of Virtual Agents Using Movement Characteristics | ISMAR | 2019
Modeling Data-Driven Dominance Traits for Virtual Characters Using Gait Analysis | IEEE TVCG | 2019
Pedestrian Dominance Modeling for Socially-Aware Robot Navigation | ICRA | 2019
F2FCrowds: Planning Agent Movements to Enable Face-to-Face Interactions | PRESENCE | 2017
Generating Virtual Avatars with Personalized Walking Gaits Using Commodity Hardware | ACM Multimedia | 2017
PedVR: Simulating Gaze-Based Interactions between a Real User and Virtual Crowds | VRST | 2016