Overview
We present novel algorithms for identifying emotion, dominance, and friendliness characteristics of pedestrians, as well as for detecting deception from walking styles, based on their motion behaviors. We also propose models for conveying emotion, friendliness, and dominance traits in virtual agents. We present applications of our algorithms to simulate interpersonal relationships between virtual characters, facilitate socially-aware robot navigation, identify perceived emotions from videos of walking individuals, and increase the sense of presence in scenarios involving multiple virtual agents. We also present a dataset of videos of walking individuals, annotated with their gaits and perceived emotion labels.
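As an illustration of the kind of pipeline involved, the following is a minimal, hypothetical sketch of gait-based perceived-emotion classification: a few hand-crafted cues (walking speed, posture height, movement smoothness) are computed from 3D pose sequences and fed to an off-the-shelf classifier. The joint indices, feature choices, and classifier here are illustrative assumptions, not our published method.

```python
# Hypothetical sketch: gait features + a generic classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def gait_features(poses):
    """poses: (T, J, 3) array of 3D joint positions over T frames.
    Returns a small vector of posture/movement cues (all assumed features)."""
    root = poses[:, 0]                           # assumption: joint 0 is the pelvis/root
    step = np.diff(root, axis=0)                 # per-frame root displacement
    speed = np.linalg.norm(step, axis=1).mean()  # average walking speed
    head = poses[:, -1]                          # assumption: last joint is the head
    posture = (head[:, 1] - root[:, 1]).mean()   # head height above root (slouch cue)
    jerk = np.abs(np.diff(step, axis=0)).mean()  # movement smoothness cue
    return np.array([speed, posture, jerk])

# Toy usage on random data; real inputs would be pose sequences extracted
# from walking videos, paired with human-annotated perceived-emotion labels.
rng = np.random.default_rng(0)
X = np.stack([gait_features(rng.normal(size=(120, 16, 3))) for _ in range(64)])
y = rng.integers(0, 4, size=64)                  # e.g., happy / sad / angry / neutral
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(X[:3]))
```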
Currently, our efforts are focused on predicting perceived emotions from multiple modalities, such as faces, gaits, speech, and text, by investigating the correlations between these modalities. This direction should also enable us to infer or generate missing modalities.
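As a concrete illustration of combining modalities, the sketch below implements a simple weighted late fusion over per-modality class probabilities, renormalizing the weights when a modality is missing. The modality weights and four-class label set are illustrative assumptions, not a description of our actual models.

```python
# Hypothetical sketch: weighted late fusion that tolerates missing modalities.
import numpy as np

MODALITY_WEIGHTS = {"face": 0.3, "gait": 0.3, "speech": 0.2, "text": 0.2}  # assumed weights

def fuse(predictions):
    """predictions: dict mapping modality name -> class-probability vector,
    or None if that modality is unavailable for this sample."""
    avail = {m: p for m, p in predictions.items() if p is not None}
    total = sum(MODALITY_WEIGHTS[m] for m in avail)      # renormalize over present modalities
    fused = sum(MODALITY_WEIGHTS[m] / total * np.asarray(p) for m, p in avail.items())
    return fused / fused.sum()                           # valid probability distribution

# Example: speech is missing; fusion still yields fused class probabilities.
probs = {
    "face":   [0.6, 0.2, 0.1, 0.1],
    "gait":   [0.5, 0.3, 0.1, 0.1],
    "speech": None,
    "text":   [0.4, 0.3, 0.2, 0.1],
}
print(fuse(probs))  # fused probabilities over, e.g., happy / sad / angry / neutral
```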
We are also developing artificial intelligence tools to improve mental health diagnosis in telehealth behavioral health services during COVID-19.