Summary: Our ability to recognize emotion in others depends upon the speed at which facial expressions are produced.
Source: University of Birmingham
The speed at which we produce facial expressions plays an important role in our ability to recognise emotions in others, according to new research at the University of Birmingham.
A team in the University’s School of Psychology carried out research which showed that people tend to produce happy and angry expressions more rapidly, while sad expressions are produced more slowly.
The team found that the judgements we form about people’s facial expressions are closely tied to the speeds at which those expressions are produced, and to the speeds at which we would produce those expressions ourselves. The study is published in Emotion.
“Being able to recognise and interpret facial expressions is a vital part of social interaction,” explained lead author Dr Sophie Sowden. “While we understand the spatial characteristics of an expression – the way the mouth moves in a smile, for example – the speeds at which expressions are produced are often overlooked. The ability to pick up on and rapidly interpret these cues could also help people to judge facial expressions even when mask-wearing might limit other visual cues.”
Dr Sowden added: “Better understanding how people interpret this important visual cue could give us new insights into the diagnosis of conditions such as Autism Spectrum Disorder or Parkinson’s Disease. This is because patients with these conditions often recognise facial expressions differently, or exhibit expressions differently.”
In the study, the team asked people to create facial expressions directed at a camera, and used an open-source software program called OpenFace to track their facial movement. They measured the speed of movement in regions of the face known to be important in producing expressions, including around the eyebrows, the nose and the mouth, as well as across the face as a whole.
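As a rough illustration of this kind of measurement, the sketch below computes a mean movement speed for each facial region from tracked landmark positions. The region groupings and landmark indices here are hypothetical stand-ins, not the study's actual pipeline; real index groupings depend on the landmark scheme the tracker uses.

```python
import numpy as np

# Hypothetical landmark-index groupings for illustration only.
REGIONS = {
    "eyebrows": [0, 1, 2, 3],
    "nose": [4, 5],
    "mouth": [6, 7, 8, 9],
}

def region_speeds(landmarks, fps):
    """Mean landmark speed per facial region.

    landmarks: array of shape (n_frames, n_landmarks, 2) holding (x, y)
    positions per video frame; fps: frames per second.
    Returns a dict mapping region name to mean speed (units per second).
    """
    # Frame-to-frame Euclidean displacement of each landmark.
    step = np.linalg.norm(np.diff(landmarks, axis=0), axis=2)
    speeds = step * fps  # displacement per frame -> per second
    return {name: float(speeds[:, idx].mean()) for name, idx in REGIONS.items()}

# Toy example: 3 frames, 10 landmarks; only the "mouth" landmarks move,
# by 1 unit per frame, at 30 frames per second.
frames = np.zeros((3, 10, 2))
frames[1, 6:10, 0] = 1.0
frames[2, 6:10, 0] = 2.0
print(region_speeds(frames, fps=30))  # mouth: 30.0, others: 0.0
```

Averaging over all landmarks together would give the whole-face speed the article mentions.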
In the first part of the experiment, the researchers investigated the average speed at which participants produced different expressions. Participants were asked to produce ‘posed’ expressions, as well as expressions during speech, and spontaneous expressions were recorded in response to emotion-inducing videos. Interestingly, the results showed that differences in speed across emotions depend on the region of the face and the ‘type’ of expression being considered.
In a second phase of the study, the team captured schematic versions of facial expressions being produced and manipulated the speeds at which they played back. In this experiment, the researchers found that when an expression was sped up, people were more likely to judge it as happy or angry, whereas when it was slowed down, people were more likely to judge it as sad.
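One simple way to implement this kind of speed manipulation, sketched below under the assumption that an expression is stored as a landmark trajectory sampled at a fixed frame rate, is to resample the trajectory in time. This is an illustrative technique, not necessarily the method the authors used.

```python
import numpy as np

def resample_speed(trajectory, factor):
    """Return a trajectory played back `factor` times faster.

    trajectory: array of shape (n_frames, ...) sampled at a fixed frame
    rate. factor > 1 speeds the movement up (fewer output frames);
    factor < 1 slows it down (more output frames). Intermediate frames
    are filled in by linear interpolation.
    """
    n = trajectory.shape[0]
    n_out = max(2, int(round(n / factor)))
    old_t = np.arange(n)
    new_t = np.linspace(0, n - 1, n_out)
    flat = trajectory.reshape(n, -1)
    # Interpolate each coordinate channel at the new time points.
    out = np.stack(
        [np.interp(new_t, old_t, flat[:, j]) for j in range(flat.shape[1])],
        axis=1,
    )
    return out.reshape((n_out,) + trajectory.shape[1:])

# A 10-frame ramp from 0 to 9 played back twice as fast becomes a
# 5-frame trajectory covering the same start and end positions.
ramp = np.linspace(0, 9, 10).reshape(10, 1)
fast = resample_speed(ramp, 2.0)
print(fast.shape)  # (5, 1)
```

Played back at the original frame rate, the resampled trajectory traverses the same movement in half the time, which is the kind of speeded-up stimulus described above.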
As well as being important for early diagnosis of autism and Parkinson’s disease, the researchers believe the work could also be useful in a range of artificial intelligence applications such as facial recognition software.
About this psychology research news
Source: University of Birmingham
Contact: Beck Lockwood – University of Birmingham
Original Research: Open access.
“The Role of Movement Kinematics in Facial Emotion Expression Production and Recognition” by Sowden et al. Emotion
The Role of Movement Kinematics in Facial Emotion Expression Production and Recognition
The kinematics of people’s body movements provide useful cues about emotional states: for example, angry movements are typically fast and sad movements slow. Unlike the body movement literature, studies of facial expressions have focused on spatial, rather than kinematic, cues.
This series of experiments demonstrates that speed comprises an important facial emotion expression cue. In Experiments 1a–1c we developed (N = 47) and validated (N = 27) an emotion-induction procedure, and recorded (N = 42) posed and spontaneous facial expressions of happy, angry, and sad emotional states.
Our novel analysis pipeline quantified the speed of changes in distance between key facial landmarks. We observed that happy expressions were fastest, sad were slowest, and angry expressions were intermediate. In Experiment 2 (N = 67) we replicated our results for posed expressions and introduced a novel paradigm to index communicative emotional expressions.
Across Experiments 1 and 2, we demonstrate differences between posed, spontaneous, and communicative expression contexts. Whereas mouth and eyebrow movements reliably distinguished emotions for posed and communicative expressions, only eyebrow movements were reliable for spontaneous expressions.
In Experiments 3 and 4 we manipulated facial expression speed and demonstrated a quantifiable change in emotion recognition accuracy. That is, in a discovery (N = 29) and replication sample (N = 41), we showed that speeding up facial expressions promotes anger and happiness judgments, and slowing down expressions encourages sad judgments. This influence of kinematics on emotion recognition is dissociable from the influence of spatial cues.
These studies demonstrate that the kinematics of facial movements provide added value and an independent contribution to emotion recognition.