Identifying Emotions from Walking using Affective and Deep Features


Abstract

We present a new data-driven model and algorithm to identify the perceived emotions of individuals based on their gaits. Using affective features computed from psychological findings and deep features learned using an LSTM, we classify the emotional state of a human into one of four emotions (happy, sad, angry, or neutral) with an accuracy of 74.10%. We also present the "EWalk (Emotion Walk)" dataset, which consists of videos of walking individuals with gaits and labeled emotions. To the best of our knowledge, this is the first gait-based model to identify perceived emotions from videos of walking individuals.
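The abstract describes a hybrid classifier: an LSTM encodes the gait sequence into deep features, which are combined with hand-crafted affective features before classification into four emotions. The following is a minimal sketch of such an architecture, assuming PyTorch; the class name, feature dimensions (16 joints x 3 coordinates per frame, 29 affective features), and label ordering are illustrative assumptions, not the authors' released code.

    # Hypothetical sketch: LSTM deep features + hand-crafted affective
    # features, concatenated and classified into four emotions.
    import torch
    import torch.nn as nn

    class GaitEmotionClassifier(nn.Module):
        def __init__(self, joint_dim=48, affective_dim=29,
                     hidden_dim=128, num_emotions=4):
            super().__init__()
            # LSTM over per-frame joint positions
            # (assumed 16 joints x 3 coordinates = 48 values per frame).
            self.lstm = nn.LSTM(joint_dim, hidden_dim, batch_first=True)
            # Linear head over concatenated deep + affective features.
            self.head = nn.Linear(hidden_dim + affective_dim, num_emotions)

        def forward(self, gait_seq, affective_feats):
            # gait_seq: (batch, frames, joint_dim)
            # affective_feats: (batch, affective_dim), e.g., posture and
            # movement cues derived from psychological findings.
            _, (h_n, _) = self.lstm(gait_seq)
            deep_feats = h_n[-1]  # final hidden state as the deep feature
            combined = torch.cat([deep_feats, affective_feats], dim=1)
            return self.head(combined)  # logits over the four emotions

    model = GaitEmotionClassifier()
    logits = model(torch.randn(8, 240, 48), torch.randn(8, 29))
    # Assumed label order: 0=happy, 1=sad, 2=angry, 3=neutral.
    predicted_emotion = logits.argmax(dim=1)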

Video

Paper

Identifying Emotions from Walking using Affective and Deep Features, arXiv 2019.
Tanmay Randhavane, Aniket Bera, Kyra Kapsaskis, Rahul Sheth, Kurt Gray, and Dinesh Manocha

@article{randhavaneidentifying,
  title={Identifying Emotions from Walking using Affective and Deep Features},
  author={Randhavane, Tanmay and Bera, Aniket and Kapsaskis, Kyra and Sheth, Rahul and Gray, Kurt and Manocha, Dinesh},
  journal={arXiv preprint},
  year={2019}
}