Lade, P and Balasubramanian, Vineeth N and Panchanathan, Sethuraman (2013)
Latent Facial Topics for affect analysis.
In: IEEE International Conference on Multimedia and Expo Workshops (ICMEW), 15-19 July 2013, San Jose, CA, USA.
Full text not available from this repository.
Abstract
Recent years have seen a growing need in the affective computing community to understand an emotion space beyond the seven basic expressions, leading to explorations of an emotion continuum spanned by dimensions such as valence and arousal. While there has been substantial research on identifying facial Action Units as building blocks for the basic expressions, there is a new need to discover fine-grained facial descriptors that can explain the variations in this continuum of emotions. We propose a methodology to extract Latent Facial Topics (LFTs) from facial videos by adapting Latent Dirichlet Allocation and supervised Latent Dirichlet Allocation topic models for facial affect analysis. In this work, we study the application of topic models to both discrete emotion recognition and continuous emotion prediction tasks. We show that meaningful and visualizable LFTs can be extracted and used successfully for emotion recognition. We report recognition results on the widely used Cohn-Kanade Plus (CK+) and AVEC 2012 FCSC challenge data sets, which show promise for both discrete and continuous emotion recognition problems.
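The abstract does not spell out the pipeline, but the general idea of treating a facial video as a "document" of quantized descriptors and fitting a topic model can be sketched as follows. This is a minimal, hypothetical illustration only, not the authors' method: it assumes local facial descriptors are quantized into visual words with k-means, uses scikit-learn's LatentDirichletAllocation in place of the paper's LDA/sLDA adaptation (scikit-learn has no supervised LDA, so a separate logistic-regression classifier stands in for the supervised variant), and runs on synthetic data.

# Hypothetical sketch of the topic-model idea: quantize frame-level facial
# descriptors into visual words, build per-video word-count "documents",
# fit LDA, and use the topic proportions (the Latent Facial Topics) as
# features for an emotion classifier. All data below are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for local facial descriptors:
# 200 videos, 50 frames each, one 64-dimensional descriptor per frame.
n_videos, n_frames, dim = 200, 50, 64
descriptors = rng.normal(size=(n_videos, n_frames, dim))
labels = rng.integers(0, 7, size=n_videos)  # 7 discrete emotion classes

# 1) Quantize descriptors into a visual-word vocabulary.
vocab_size = 100
kmeans = KMeans(n_clusters=vocab_size, n_init=10, random_state=0)
words = kmeans.fit_predict(descriptors.reshape(-1, dim)).reshape(n_videos, n_frames)

# 2) Build per-video word-count histograms (the "documents").
counts = np.zeros((n_videos, vocab_size), dtype=int)
for v in range(n_videos):
    np.add.at(counts[v], words[v], 1)

# 3) Fit LDA; per-video topic proportions play the role of Latent Facial Topics.
lda = LatentDirichletAllocation(n_components=10, random_state=0)
topics = lda.fit_transform(counts)

# 4) Use topic proportions as features for discrete emotion recognition.
clf = LogisticRegression(max_iter=1000).fit(topics, labels)
print("training accuracy on synthetic data:", clf.score(topics, labels))

For the continuous setting evaluated on AVEC 2012 (valence/arousal prediction), the same topic-proportion features could feed a regressor rather than a classifier; again, this is an assumption about one plausible setup, not a description of the paper's implementation.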