Using Automated Facial Expression Recognition Technology to Distinguish Between Cortical and Subcortical Facial Motor Control



Outcome: Researchers at UC San Diego, the University at Buffalo, and the University of Toronto have developed a computer vision system that distinguishes faked from genuine facial expressions of pain. The system outperformed human observers: even with training, observers achieved at most a 55 percent success rate, whereas the computer vision and pattern recognition system was accurate about 85 percent of the time.

Impact/benefits: This finding demonstrates that genuine and posed facial expressions can be distinguished by their dynamic signatures. It is therefore possible to distinguish two different types of neural control of the face: a cortical system that controls deliberate facial movements, and a subcortical system that controls spontaneous facial movements.

Background/Explanation: In highly social species such as humans, faces have evolved to convey rich information for social interaction, including expressions of emotion and pain. Two pathways in the brain control facial movement. One originates in the cortex and controls deliberate facial expressions; the other originates in deeper, subcortical areas and drives spontaneous facial expressions of felt emotions. The cortical system enables humans to simulate facial expressions of emotions they are not actually experiencing, and these simulations are convincing enough to deceive most observers. Machine vision, however, may be able to distinguish deceptive from genuine facial signals by identifying subtle differences between cortically and subcortically driven movements. The study showed that human observers could not discriminate real from faked expressions of pain better than chance and, even after training, improved only to a modest 55% accuracy. By contrast, a computer vision system that automatically measures facial movements and performs pattern recognition on those movements attained 85% accuracy. The machine system's superiority is attributable to its ability to differentiate the dynamics of genuine from faked expressions. By revealing the dynamics of facial action, machine vision thus has the potential to elucidate behavioral fingerprints of the neural control systems involved in emotional signaling.
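To make the approach concrete, here is a minimal toy sketch of the general idea: summarize the time course of a facial movement with a few dynamic features, then classify on those features. It is not the published pipeline (the study tracked many facial action units per video frame and used a trained pattern recognizer); the feature choices, the nearest-centroid rule, and all data below are illustrative assumptions.

```python
# Toy sketch (NOT the study's method): classify an expression as genuine vs.
# faked from the dynamics of a single facial action unit (AU) intensity
# time series. Hypothesis illustrated: deliberate (cortically driven)
# movements tend to have more abrupt onsets than spontaneous ones.

def dynamic_features(au_intensity):
    """Summarize an AU intensity time series with two simple dynamics:
    peak frame-to-frame velocity, and 'burstiness' (peak velocity relative
    to mean velocity; high values indicate a sudden, jerky onset)."""
    diffs = [au_intensity[i + 1] - au_intensity[i]
             for i in range(len(au_intensity) - 1)]
    peak_velocity = max(abs(d) for d in diffs)
    mean_velocity = sum(abs(d) for d in diffs) / len(diffs)
    burstiness = peak_velocity / mean_velocity
    return (peak_velocity, burstiness)

def nearest_centroid(train_feats, labels, sample):
    """Classify by Euclidean distance to each class's mean feature vector."""
    centroids = {}
    for lab in set(labels):
        feats = [f for f, l in zip(train_feats, labels) if l == lab]
        centroids[lab] = tuple(sum(c) / len(feats) for c in zip(*feats))
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda lab: dist(centroids[lab], sample))

# Synthetic training examples: a genuine expression ramps up smoothly,
# while a faked one jumps abruptly to full intensity.
genuine = [0.0, 0.1, 0.25, 0.45, 0.7, 0.9, 1.0]
faked   = [0.0, 0.0, 0.05, 0.9, 1.0, 1.0, 1.0]
train_feats = [dynamic_features(genuine), dynamic_features(faked)]
labels = ["genuine", "faked"]

probe = [0.0, 0.05, 0.2, 0.4, 0.65, 0.85, 1.0]  # smooth ramp
print(nearest_centroid(train_feats, labels, dynamic_features(probe)))
# -> genuine
```

The key point the sketch illustrates is that the discriminating information lives in *how* the movement unfolds over time, not in the final facial configuration, which is why static snapshots fool human observers while dynamic analysis does not.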


Bartlett, M., Littlewort, G., Frank, M., and Lee, K. (2014). Automatic decoding of facial movements reveals deceptive pain expressions. Current Biology 24(7), 738-743. doi: 10.1016/j.cub.2014.02.009.