Our work “Cross-dataset emotion recognition from facial expressions through convolutional neural networks” has just been published in the
Journal of Visual Communication and Image Representation (#JVCI).
This work sheds light on fundamental questions about emotion recognition from facial expressions. Can someone's internal state be reliably inferred from their facial muscle movements alone? Is there a universal facial configuration for expressing basic emotions such as anger, disgust, fear, happiness, sadness, and surprise? In particular, we examine whether features learned from one group of people generalize to predicting another group's emotions.
We achieve state-of-the-art results by employing convolutional neural networks and data visualization techniques, outperforming even commercial off-the-shelf solutions from well-known tech companies under a cross-dataset protocol.
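The paper's CNN pipeline is not reproduced here, but the cross-dataset protocol itself, fitting a model on one dataset and evaluating it only on a different one, can be sketched with a toy nearest-centroid classifier on synthetic data (all data, names, and the domain-gap `shift` parameter below are hypothetical illustrations, not the paper's method):

```python
# Minimal sketch of a cross-dataset evaluation protocol: the classifier is
# fit on dataset A and tested on dataset B, so train and test samples never
# come from the same source (population).
import random

EMOTIONS = ["anger", "happiness", "sadness"]  # subset, for illustration

def make_dataset(seed, n=60, shift=0.0):
    """Synthetic stand-in for a facial-expression dataset: each sample is a
    2-D feature vector clustered around a per-emotion centroid; `shift`
    crudely models the domain gap between two datasets."""
    rng = random.Random(seed)
    data = []
    for label, center in enumerate([(0.0, 0.0), (3.0, 0.0), (0.0, 3.0)]):
        for _ in range(n // len(EMOTIONS)):
            x = center[0] + rng.gauss(0, 0.5) + shift
            y = center[1] + rng.gauss(0, 0.5) + shift
            data.append(((x, y), label))
    return data

def fit_centroids(train):
    # "Training": compute the mean feature vector per emotion label.
    sums = {}
    for (x, y), label in train:
        sx, sy, c = sums.get(label, (0.0, 0.0, 0))
        sums[label] = (sx + x, sy + y, c + 1)
    return {l: (sx / c, sy / c) for l, (sx, sy, c) in sums.items()}

def accuracy(centroids, test):
    # Predict the label whose centroid is nearest; score against ground truth.
    correct = 0
    for (x, y), label in test:
        pred = min(centroids,
                   key=lambda l: (x - centroids[l][0]) ** 2
                               + (y - centroids[l][1]) ** 2)
        correct += pred == label
    return correct / len(test)

# Cross-dataset protocol: fit on dataset A, evaluate on dataset B only.
dataset_a = make_dataset(seed=0)
dataset_b = make_dataset(seed=1, shift=0.4)  # a different "population"
centroids = fit_centroids(dataset_a)
print(f"cross-dataset accuracy: {accuracy(centroids, dataset_b):.2f}")
```

The point of the protocol is that accuracy measured this way reflects generalization across populations, not memorization of one dataset's subjects, which is why it is a stricter test than a within-dataset split.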