Seeing an emotion expressed in another person’s face initiates a range of subsequent lower- and higher-order processes. The current symposium addresses such emotion processing, in particular phenomena such as automatic facial responses to viewing facial emotional expressions, perceptual after-effects, and the detection and identification of observed emotion. These processes can be investigated with various behavioural and physiological methods, which will be considered in this symposium. The symposium includes empirical studies using psychophysiological measures (e.g. facial electromyography) alongside behavioural measures. Research on emotion processing from observed faces is relevant to many clinical populations, since these populations are characterised by deficits in the processes mentioned above. Research on emotion processing from faces in clinical populations thus aims at deepening our understanding of clinical conditions. Moreover, this research can lead to a better understanding of how emotion processing functions in the typical population. The symposium therefore features research on typical and atypical populations (e.g. autism spectrum disorders, post-traumatic stress disorder/trauma). Contributors: Dr Tanja S.H. Wingenbach: ‘Distinct automatic facial muscle response patterns to various observed facial emotional expressions’; Dr Sylwia Hyniewska (email@example.com): ‘Affective decoding in naturalistic settings’; Dr Chris Ashwin (University of Bath, firstname.lastname@example.org): ‘Differences in perceptual aftereffects to angry and happy facial expressions of emotion in adults with autism’; Prof Monique C. Pfaltz (University Hospital Zurich, Monique.Pfaltz@usz.ch): ‘Negativity bias in recognition of neutral facial expressions in individuals with PTSD and child maltreatment’.
Finally, Prof Wataru Sato (Kyoto University; email@example.com) will serve as discussant of the symposium, presenting ‘Typical and atypical detection of emotional facial expressions’ and relating the research findings of all speakers to each other.
Individuals with Borderline Personality Disorder, who show a high prevalence of trauma and child maltreatment, tend to perceive neutral facial expressions as negative. Here, we explored whether individuals with posttraumatic stress disorder (PTSD) show a negative bias in the recognition of neutral expressions and whether child maltreatment is linked to this bias. Methods: Thirty-nine participants with PTSD, 44 traumatized controls (TC), and 35 non-traumatized healthy controls (HC) watched 300 one-second movies showing neutral and emotional expressions and indicated whether a neutral expression or one of 9 emotional facial expressions was presented. Results: Individuals with PTSD performed more poorly than TC and HC in recognizing neutral expressions. They misinterpreted neutral expressions as anger and contempt more often than HC and TC did. Comparisons of statistical model fits suggest that childhood maltreatment, especially sexual abuse, plays a more important role in the recognition of neutral expressions than a diagnosis of PTSD. Higher levels of self-reported child maltreatment were linked to more pronounced misinterpretations of neutral expressions as anger, fear, sadness, and contempt. We are currently conducting a separate study in individuals with and without child maltreatment to assess whether our findings hold in study groups defined by the presence or absence of childhood maltreatment rather than by the presence or absence of PTSD. Conclusion: Traumatic experiences, especially in childhood, may shape the interpretation of neutral facial expressions. A negative response bias for neutral expressions may lead to interpersonal problems, augmented feelings of threat, and a negative self-image.
That watching facial emotional expressions leads to subtle facial muscle activation in observers in line with the valence of the observed expressions is a well-documented phenomenon (‘facial mimicry’). However, published research on automatic facial mimicry has mostly included only the corrugator and zygomaticus muscles when investigating subtle responses to observed facial emotional expressions, allowing only for valence-based differentiation between emotion categories. Evidence for facial mimicry being an emotion-specific response is rather limited, as the few published studies including other muscles have shown inconclusive results. The current study included five facial muscle sites (corrugator, zygomaticus, levator, depressor, frontalis) and 10 emotion categories (anger, disgust, fear, sadness, surprise, happiness, pride, embarrassment, contempt, neutral) to investigate whether ‘facial mimicry’ is specific to the observed emotion category. Facial electromyography (EMG) was recorded from 84 participants. Because facial muscle activation overlaps across emotion categories (e.g. corrugator activation for negative-valence emotions), the EMG responses per muscle site were combined into response patterns across muscles. The pattern for each emotion category was contrasted with those of all other emotion categories included in the study. Results showed that, after correction for multiple comparisons, most emotion categories had distinct facial EMG response patterns within and across valence categories. The current study thus demonstrates that facial mimicry is an emotion-specific phenomenon and not simply valence-based.
It has been argued that we decode others’ emotional states and/or associated appraisals of ongoing events from particular sets of facial action units (AUs). Few studies, however, have systematically tested the relationships between the decoding of emotions/appraisals and sets of AUs, and the results reported so far are mixed. Furthermore, no study has analysed the decoding of spontaneous facial behaviour observed in naturalistic settings. We asked participants (N = 122) to judge facial expressions filmed unobtrusively in real-life situations. Participants were asked to decode emotions (e.g. anger) and appraisals (e.g. suddenness). The AUs observed in the videos were annotated by certified experts using the Facial Action Coding System. We explored the relationships between emotion/appraisal decoding and AUs using stepwise multiple regression analyses. The results revealed that all the rated emotions and appraisals were associated with sets of AUs. The profiles of the regression equations showed AUs both consistent and inconsistent with those in theoretical proposals. The results confirm our hypothesis that the decoding of emotions and appraisals from facial expressions is based on the perception of sets of AUs. It seems, however, that the profiles of such AU sets may differ from previous theoretical suggestions.
Autism spectrum disorders (ASD) are characterised by difficulties in social interaction and communication alongside repetitive and restricted behaviour. The social difficulties often include problems identifying and understanding the emotional states of others. Emotion adaptation paradigms are one approach to investigating the perception of facial expressions: a facial expression image is presented for an extended period of time, followed by the presentation of a prototypical face without any expression. People typically perceive an after-effect when judging the prototypical face, which appears to be of the opposite valence to the adapted expression. For example, after adapting to a happy expression, people typically perceive a negative-valence facial expression after-effect. The present study aimed to test emotional adaptation to negative- and positive-valence facial expressions. We recruited 21 adults with ASD and 21 adult control participants, who all completed a baseline emotion recognition task followed by an emotion adaptation task that included typical angry or happy expressions as well as 100% and 50% anti-images. Results showed that the ASD group had poorer recognition of both angry and happy expressions at baseline, but no group differences were found for valence judgements of facial emotion after-effects to typical angry or happy expressions. However, the ASD group showed atypical perceptual after-effects when adapted to both angry and happy anti-images, perceiving stronger negative after-effects than controls for anti-angry expressions and weaker after-effects than controls for anti-happy images. The results show strengths and difficulties in perceptual after-effects to anti-image expressions in adults with ASD.