The neuropeptide oxytocin plays a prominent role in social and emotional cognition. Findings suggest that exogenous intranasal oxytocin administration facilitates emotion recognition in humans, but individual and contextual differences may have moderating effects. A major caveat in this line of work is that it is predominantly based on young males, which limits current knowledge and the potential for generalizability across gender. To uncover potential gender effects, the present study included younger and older men and women. Using a randomized, double-blind, placebo-controlled, within-subjects design, we investigated the effects of a single dose of 40 IU intranasal oxytocin, administered 40 minutes prior to MRI scanning, on emotion recognition of dynamic positive and negative stimuli in 32 men (mean age 45.78 years, SD 22.87) and 39 women (mean age 47.87, SD 47.87). Preliminary analyses show that oxytocin reduced brain activity during exposure to negative (relative to positive) stimuli in women, while increasing brain activity in the dorsomedial prefrontal cortex in men. These findings suggest that the effects of oxytocin on emotion recognition may be related to emotion regulation and mentalization processes, and that oxytocin is related to potential sex differences in these processes. The results also raise the concern that the previous oxytocin literature on emotion recognition may be biased, as there appear to be gender-differential effects of oxytocin on brain activity across adulthood that have been underestimated. In the next stage of the present study, we will investigate the interaction effects among treatment, sex, age, and presentation modality.
In his influential 1884 paper ‘What is an emotion?’, William James famously claimed that if we imagine a ‘full’ emotion episode and subtract all the bodily feelings it involves, there is no emotion left. The conclusion extracted from this thought experiment is well known: emotions are nothing but perceptions of one’s own bodily changes. This is known as the “subtraction argument”, and it remains a central argument in contemporary neo-Jamesian accounts of emotion (Deonna and Teroni, 2017; Prinz, 2003). The other main line of evidence for the theory, which involves the study of cases of impaired interoception (Laird, 2014), is inconclusive, as it has been argued that the brain can simulate the feeling of bodily changes in the absence of actual bodily changes (Damasio, 1994). Here, I present results from a pre-registered study showing that, contrary to James’ intuitions in the subtraction argument, most participants (77%) considered that their emotions would persist after the bodily feelings were subtracted from them. Participants’ responses were independent of individual differences in interoception (PBCS) and cognitive reflection (CRT). Beyond showing the invalidity of the subtraction argument, I argue that these results on people’s understanding of emotion could be relevant to the question of whether bodily feelings are actually necessary for emotion. My main argument rests on the fact that studies investigating the bodily correlates of emotion largely depend on participants’ emotional self-reports, and these in turn depend on participants’ understanding of emotion.
Forming an accurate spatial representation of the world is essential for survival. Yet, the extent to which a nasal whiff of scent can automatically orient visual spatial attention remains poorly understood in humans. In a series of seven studies, we investigated whether trigeminal stimuli (i.e., CO2 and eucalyptol) can automatically capture visual attention. We chose these stimuli because they are particularly prone to automatically capture attention, and used them as lateralized cues in a variant of a visual spatial cueing paradigm. In valid trials, trigeminal cues and visual targets were presented on the same side, whereas in invalid trials they were presented on opposite sides. To characterize the dynamics of this cross-modal attentional capture, we manipulated the interval between the trigeminal cues and the visual targets (from 580 to 1870 ms). Reaction times in trigeminal valid trials were shorter than in all other trials, but only when this interval was around 680 or 1170 ms for CO2 and around 610 ms for eucalyptol. This result indicates that olfactory trigeminal stimuli can automatically capture human visual attention. We discuss the importance of considering the dynamics of this cross-modal attentional capture.
Emotional stimuli are known to be interpreted quickly and automatically, even if they are not consciously seen. It is, however, still debated how specifically these stimuli can be processed under masked presentation conditions (i.e., valence only, or more fine-grained categories), and which processes are involved. We developed an implicit behavioral measure, the emotion misattribution procedure (Rohr, Degner, & Wentura, 2015) - an adaptation of the affect misattribution procedure - to investigate these questions. Participants’ task is to classify neutral target faces according to the emotion category they allegedly show (joy, fear, anger, or sadness). Preceding these neutral targets, clearly emotional prime stimuli are presented. With this paradigm, we have already shown that masked emotional faces can produce misattribution effects beyond valence (Rohr, Degner, & Wentura, 2015). Moreover, we found evidence that physiological processes, as indexed by facial muscle responses, influence the choice of the behavioral response (Rohr, Folyi, & Wentura, 2018). Building on this earlier work, the present research investigates the involvement of physiological processes in more detail. To this end, we assessed facial muscle activity in addition to behavioral responses in the masked misattribution task with five emotions; that is, participants had to decide which of five emotions (i.e., joy, anger, fear, disgust, sadness) was allegedly shown by a neutral target face. Results show that the masked primes trigger behavioral as well as facial muscle responses. The pattern of results is most compatible with a sensorimotor simulation view.
Pavlovian aversive conditioning is a fundamental form of learning that helps organisms survive in their environment. Past research has suggested that organisms are predisposed to preferentially learn to fear stimuli that posed threats to survival over the evolution of the species. Here, we sought to determine whether stimuli that are relevant to the organism’s concerns beyond biological and evolutionary considerations can also be preferentially conditioned to threat, and whether such preferential learning is modulated by inter-individual differences in affect and motivation. To do so, we experimentally manipulated the goal-relevance of initially neutral stimuli in a spatial cueing task, and subsequently used them as conditioned stimuli in a differential Pavlovian aversive conditioning paradigm, while examining the influence of participants’ achievement motivation. Results indicate that achievement motivation modulated Pavlovian aversive learning to goal-relevant versus goal-irrelevant stimuli. Compared with participants with lower achievement motivation, participants with high achievement motivation more readily acquired a conditioned response to goal-relevant relative to goal-irrelevant stimuli, reflecting a learning bias. Taken together, these findings suggest that stimuli that are relevant to the organism can induce faster Pavlovian aversive conditioning despite holding no inherent biological or evolutionary significance, and that the occurrence of such a learning bias hinges upon inter-individual differences in the organism’s concerns, such as achievement motivation.
Background: Everyday exposure to emotional situations of varied intensity ranges from pleasant to traumatic in nature. To form an appropriate response in such situations, emotional processing needs to be prioritized, which in turn may affect ongoing cortical network activities. The neural underpinnings of such interactions between emotional processing and resting state networks (RSNs) are still unclear. Objective: To investigate the cortical sources active during emotional processing using quantitative EEG. Methods: Healthy subjects (n = 60; mean age 26.7 ± 3.0 years) were exposed to high-arousing (negative and positive) and low-arousing (neutral) pictures from the International Affective Picture System (IAPS). High-density 128-channel EEG was recorded during picture viewing, and cortical sources were estimated using sLORETA in seven frequency bands defined by the individual alpha frequency (IAF). The sources in each band and condition were compared with the respective baseline eyes-open condition using statistical non-parametric mapping (SnPM). Results: Negative emotions caused early activation of the insula and suppression of the default mode network (DMN) in the upper alpha (UA) band, along with activation of the pain matrix in the gamma band. Positive emotion showed early suppression of the DMN in the UA band and late activation of the insula, whereas neutral pictures elicited early activation of the DMN in the delta band and suppression of the insula. Conclusion: Emotional conditions can be distinguished on the basis of frequency-specific activation of cortical networks at different time windows. This knowledge could be used to compare normal and altered emotional profiles in borderline mood disorders and in neurological and psychiatric disorders.
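The IAF-anchored band definition described above can be sketched computationally. The following is a minimal illustration, assuming a simple periodogram-based peak search and Klimesch-style bands defined relative to the alpha peak; the function names and the exact band offsets are illustrative assumptions, not the study's actual definitions:

```python
import numpy as np

def individual_alpha_frequency(eeg, fs, search=(7.0, 13.0)):
    """Estimate the IAF as the peak frequency of a simple periodogram
    within the alpha search range (eeg: 1-D signal, fs: sampling rate)."""
    n = len(eeg)
    psd = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    mask = (freqs >= search[0]) & (freqs <= search[1])
    return freqs[mask][np.argmax(psd[mask])]

def iaf_bands(iaf):
    """IAF-anchored band boundaries in Hz (offsets are illustrative)."""
    return {
        "theta":       (iaf - 6.0, iaf - 4.0),
        "lower alpha": (iaf - 4.0, iaf),
        "upper alpha": (iaf,       iaf + 2.0),
    }

# Synthetic 30-s trace with a 10.2 Hz alpha rhythm plus noise
fs = 256
t = np.arange(0, 30, 1.0 / fs)
rng = np.random.default_rng(1)
eeg = np.sin(2 * np.pi * 10.2 * t) + 0.5 * rng.normal(0, 1, t.size)
iaf = individual_alpha_frequency(eeg, fs)
```

On such a synthetic trace, the recovered peak sits at the simulated alpha frequency, and each subject's bands then shift with his or her own alpha peak rather than using fixed canonical boundaries.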
Inherently neutral or unfamiliar objects have been shown to acquire gain or loss associations through learning, leading to changes in behavioral and neural responses to these objects. However, the mechanisms underlying these learning processes are still unclear. The current study tested 24 participants in a learning paradigm in which pseudowords were associated with monetary gain, monetary loss, or no outcome (8 pseudowords per category). In a learning session, participants had to classify, via a manual response, which outcome category each pseudoword belonged to. In a test session on the subsequent day, participants were asked to distinguish the previously associated words from novel distractors in an old/new decision task. The design was fully counterbalanced. Event-related potentials (ERPs), pupil diameter, and behavioral responses were measured during the learning and test sessions. Changes in these measures over time were investigated using a moving-window technique. The resulting models show that, at the behavioral level, gain associations are learned fastest, differing from loss and neutral associations in the first half of the learning session, while all associations were learned equally well during the second half of the learning phase. At the neural level, the P1 amplitude in ERPs also differed between gain-associated and neutral stimuli early in the learning session. These findings suggest that gain association facilitates learning behavior and the related neural responses very early during learning, possibly reflecting an attentional enhancement of reward-related stimuli.
Individuals with autism spectrum disorder (ASD), who are characterized by deficits in social communication and interaction, often struggle with quick and adequate facial emotion recognition. Considering the abundant mixed behavioral results on emotion processing in ASD, we combined fast periodic visual stimulation with electroencephalography to examine the implicit neural sensitivity of school-aged boys with ASD versus matched controls to brief changes in facial expression. By periodically presenting neutral faces at 6 Hz and expressive faces at 1.2 Hz, we can determine the implicit neural sensitivity for detecting expressive faces within a stream of neutral faces by quantifying the periodic brain response at the oddball frequency. A fixation cross orienting the participants’ attention towards the eyes or mouth allows us to investigate which facial feature enhances rapid emotion detection and whether this differs between ASD and controls. The processing styles employed for expression detection (feature-based vs. global) were examined by presenting the faces upright and inverted. Results show lower neural sensitivity in ASD for fearful and angry faces, suggesting a rather specific emotion detection deficit in ASD instead of a general one. Reduced responses for inverted faces indicate that inversion affects emotion detection in both groups, hampering global processing. Both groups benefited most from the mouth information to rapidly detect emotional faces. We will complement these results with eye tracking data from two explicit tasks to examine whether the mouth is indeed the most important facial feature for emotion processing and whether this differs for implicit versus explicit and low-level versus higher-level expression processing.
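The oddball quantification described above is typically done in the frequency domain. The following is a minimal sketch, assuming a simple FFT amplitude spectrum and a signal-to-noise ratio defined against neighboring frequency bins; the function name, bin counts, and the synthetic signal are illustrative assumptions, not the study's actual pipeline:

```python
import numpy as np

def oddball_snr(signal, fs, target_freq, n_neighbors=10):
    """SNR of the spectral amplitude at target_freq, relative to the
    mean amplitude of neighboring bins (immediately adjacent bins excluded)."""
    n = len(signal)
    amps = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    target = int(np.argmin(np.abs(freqs - target_freq)))
    lo = np.arange(target - n_neighbors - 1, target - 1)
    hi = np.arange(target + 2, target + n_neighbors + 2)
    return amps[target] / amps[np.concatenate([lo, hi])].mean()

# Synthetic 60-s trace: 6 Hz base response, 1.2 Hz oddball response, noise
fs = 256
t = np.arange(0, 60, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = (0.5 * np.sin(2 * np.pi * 6.0 * t)     # base stimulation response
       + 0.25 * np.sin(2 * np.pi * 1.2 * t)  # oddball (expression change)
       + rng.normal(0, 1.0, t.size))         # broadband noise

snr_oddball = oddball_snr(eeg, fs, 1.2)
```

With a clear oddball response, the SNR at 1.2 Hz (and at the base frequency of 6 Hz) stands well above 1, while frequencies carrying no periodic response hover around 1; a group difference in this ratio is what a lower "neural sensitivity" would correspond to in such an analysis.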
Updating is an executive function responsible for monitoring working memory representations for task relevance and replacing them when necessary. It has been shown that participants’ emotional traits are related to the efficiency of updating emotional stimuli, in accordance with the principle of emotional congruence. However, the relationship between mood and emotional updating has not been studied. The present study examined the role of participants’ mood and their emotion regulation strategies in updating emotional stimuli. Updating was measured with affective versions of the n-back task, using emotional words and facial expressions as stimuli. Positive, negative, and neutral moods were induced using autobiographical memories and IAPS pictures. In Study 1, we hypothesized that a participant’s mood would facilitate updating of emotional stimuli of the same valence. In all experimental conditions, updating of emotional stimuli was more efficient than updating of neutral stimuli. In both happy and sad moods, emotional stimuli were updated more efficiently than in neutral states. In Study 2, we expected a positive relationship between the impact of negative mood on affective updating and emotion regulation. We found a positive correlation between affective updating and the reappraisal strategy of emotion regulation. However, the impact of mood on updating was not related to emotion regulation. The results show that mood and emotion regulation affect the updating of emotional stimuli, but their influence is not necessarily consistent with the principle of emotional congruence. The issue of causal relationships between mood, updating, and emotion regulation should be addressed in further studies.