Symposium

Neurorobotic models of emotion

July 11th, Room 4
Marwen Belkaid (Sorbonne Université)

Neurorobotics has emerged as the scientific field interested in embodied neural systems. Inherently interdisciplinary, this line of research combines methods and techniques from computational neuroscience, machine learning and robotics. It serves two complementary goals: designing efficient machines inspired by natural cognition, and understanding the brain through embodied intelligent machines. There has been significant interest and effort in the field in modeling emotion-related phenomena. Mirolli and colleagues (2010) proposed a detailed model of the affective regulation of body, brain, and behavior mediated by the amygdala. Krichmar (2013) used an architecture mimicking neuromodulatory interactions between dopamine and serotonin to control the anxious and curious behaviors of a robot. Khan and Cañamero (2018) modeled the role of the hormone oxytocin in behavior adaptation during social interactions through touch. Boucenna and colleagues (2014) showed the emergence of facial expression recognition and social referencing in a robot by reproducing parent-infant interaction. Belkaid and colleagues (2018) highlighted the importance of modeling the emotional modulation of behavior to foster autonomy in robotic cognitive architectures. This symposium aims to bring together researchers interested in how neurorobotic models of emotion can help understand human emotions and improve robots' autonomy and social capabilities. Of particular interest are models that investigate the mechanisms underlying the emergence of emotion in natural and artificial organisms.

Emotion as a collection of metacontrol mechanisms in natural and artificial systems 

Marwen Belkaid

Natural and artificial organisms engage a variety of processes to control their interactions with the environment. From a computational perspective, these first-order processes are governed by a set of parameters that determine their functioning. Metacontrol, in turn, refers to the mechanisms that modify these parameters, allowing the system to adapt its behavior to different situations. To illustrate this, I will show results suggesting that mice exhibit a form of metacontrol in a decision-making experiment. I will also present robotic experiments in which metacontrol mechanisms help solve the task. I will argue that metacontrol is one of the major features of emotion and that this conceptual framework is useful both for understanding emotional phenomena and for guiding the design of cognitive architectures for autonomous machines.
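
To make the distinction concrete, the following Python sketch (an illustration of the general idea, not the model presented in the talk) implements a first-order action-selection process governed by a single parameter, a softmax temperature, and a metacontrol step that adjusts this parameter from an internal appraisal of recent outcomes; all names and values are assumptions chosen for illustration.

```python
import numpy as np

# Minimal sketch: a first-order action-selection process parameterized by a softmax
# temperature, and a metacontrol step that adjusts this parameter according to an
# internal appraisal signal (here, the recent reward rate).

rng = np.random.default_rng(0)

def select_action(q_values, temperature):
    """First-order process: softmax action selection governed by one parameter."""
    logits = np.asarray(q_values) / temperature
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return rng.choice(len(q_values), p=probs)

def metacontrol(temperature, recent_rewards, low=0.05, high=2.0):
    """Metacontrol: modify the first-order parameter according to how well things go.
    Poor recent outcomes -> higher temperature (more exploration); good outcomes -> lower."""
    reward_rate = np.mean(recent_rewards) if recent_rewards else 0.0
    target = high - (high - low) * np.clip(reward_rate, 0.0, 1.0)
    return 0.9 * temperature + 0.1 * target  # smooth adjustment of the parameter

# Toy usage: a two-armed bandit where arm 1 pays off more often.
q, temperature, rewards = np.zeros(2), 1.0, []
for t in range(200):
    a = select_action(q, temperature)
    r = float(rng.random() < (0.3 if a == 0 else 0.8))
    q[a] += 0.1 * (r - q[a])                                # first-order learning
    rewards.append(r)
    temperature = metacontrol(temperature, rewards[-20:])   # second-order adaptation
```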

Embodying Affect in Autonomous Interactive Robots 

Lola Cañamero

One of the key contributions that robots can make to emotion research is the possibility of implementing, testing, extracting and analyzing assumptions and consequences, and of assessing the scope and "usefulness" of different conceptualizations, models and theories of emotions. Beyond computational models, robots make it possible to investigate how theoretical models "behave" when they are situated in, and in interaction with, real (versus simulated) physical and social environments. Since 1995, my research has taken a "strong" approach to embodiment that models "internal" as well as "external" affective embodiment. In this talk, I will argue that such affective embodiment can ground various cognitive and interactional skills that autonomous and social robots need, and show how it can shed light on affect in humans. My specific model is biologically inspired and builds on a synthetic physiology regulated using principles stemming from embodied AI, cybernetics (e.g., homeostasis and allostasis), ethology, neuroscience and dynamical systems. It provides a blueprint for a "core affective self" that endows robots with internal values (their own or acquired) and motivations that drive adaptation and learning through interactions with the physical and social environment. I will illustrate how this model has been used in my group over many years to ground a broad range of affective, cognitive and social skills, such as adaptive decision making (action selection), learning (e.g., self-learning of affordances, learning to explore novel environments, learning to coordinate social interaction), development (e.g., development of attachment bonds, epigenetic development), and social interaction (e.g., robot companions for children).
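
As a rough illustration of the homeostatic principle mentioned above (and not of Cañamero's actual architecture), the sketch below shows how deficits of simulated physiological variables from their setpoints can act as drives, with the most urgent drive selecting the corrective behavior; the variables, behaviors and numerical values are hypothetical.

```python
from dataclasses import dataclass

# Illustrative sketch only: a homeostatically regulated "synthetic physiology" where
# deficits from setpoints produce drives, and the strongest drive selects the behavior
# (winner-take-all action selection).

@dataclass
class Variable:
    name: str
    value: float
    setpoint: float
    decay: float          # how fast the variable drifts from its setpoint per step

    def deficit(self):
        return abs(self.setpoint - self.value)

def step(physiology, behaviors):
    # Physiological drift (e.g., energy is consumed over time).
    for v in physiology:
        v.value -= v.decay
    # Drive intensity = deficit; the most urgent drive wins and triggers its behavior.
    winner = max(physiology, key=lambda v: v.deficit())
    behaviors[winner.name](winner)   # executing the behavior corrects the deficit
    return winner.name

# Hypothetical variables and corrective behaviors.
physiology = [Variable("energy", 0.9, 1.0, 0.05), Variable("temperature", 0.8, 1.0, 0.02)]
behaviors = {
    "energy": lambda v: setattr(v, "value", min(v.setpoint, v.value + 0.2)),       # "feed"
    "temperature": lambda v: setattr(v, "value", min(v.setpoint, v.value + 0.1)),  # "seek warmth"
}

for t in range(10):
    print(t, step(physiology, behaviors))
```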

Robots with a ‘heart’ 

Marco Mirolli, Luca Simione & Domenico Parisi

Human emotions can be considered the mental manifestations of physical events happening inside the body: that’s why we say that emotions are ‘felt’. Current computational models of emotions, including robotic ones, are usually not ‘embodied’ enough: the control systems of current robots interact with the external environment through external sensors and actuators, but there is typically no internal environment to interact with. In sharp contrast, animals, including humans, have not only an external environment but also a quite complex internal one, which includes organs and systems with which their control systems, i.e. their brains, interact: they continuously receive information from internal sensors (e.g. regarding hunger, temperature, heartbeats, hormonal and immune states…) and send efferent outputs that modify the body (e.g. blood vessels, respiration and heart rate, circulating hormones…). Emotions crucially depend on these interactions between the brain and the internal environment. To investigate them, we developed an evolutionary robotics model in which robots have a simulated ‘heart’: an internal state that influences the amount of energy that reaches the actuators and hence the speed at which the robot moves. Our robots live in an environment with food, predators, and sexual partners, so they have to arbitrate between different motivations while also dosing their energy appropriately, since if it runs out, they die. Our simulations represent a first attempt to study the relationships between the brain and the internal environment in the context of robotics, and hence to develop robots that can really feel emotions.
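
The following toy simulation is only a sketch of the general mechanism described above, not the authors' evolutionary model: an internal ‘heart’ state gates how much energy reaches the actuators, so mobilization speeds the robot up but drains its energy, and the robot dies when energy runs out. The update rules and thresholds are assumptions made for illustration.

```python
import random

# Bare-bones sketch: an internal 'heart' state that gates how much energy reaches
# the actuators, trading off speed against energy consumption; the robot dies when
# its energy runs out.

class Robot:
    def __init__(self):
        self.energy = 1.0
        self.heart = 0.5          # internal 'heart' state in [0, 1]

    def act(self, threat_nearby, food_nearby):
        # Simple motivational arbitration: threats raise the heart state, rest lowers it.
        if threat_nearby:
            self.heart = min(1.0, self.heart + 0.3)   # mobilize energy to flee
        elif food_nearby:
            self.heart = min(1.0, self.heart + 0.1)   # moderate effort to approach food
        else:
            self.heart = max(0.1, self.heart - 0.1)   # recover
        speed = self.heart                             # energy delivered to actuators
        self.energy -= 0.05 * speed                    # moving costs energy
        if food_nearby and not threat_nearby:
            self.energy = min(1.0, self.energy + 0.2)  # eating replenishes energy
        return speed

    @property
    def alive(self):
        return self.energy > 0.0

robot, step = Robot(), 0
while robot.alive and step < 50:
    robot.act(threat_nearby=random.random() < 0.2, food_nearby=random.random() < 0.3)
    step += 1
print("survived", step, "steps")
```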

Emotional interaction as a way to regulate robot behavior 

Sofiane Boucenna

In this work, we study how emotional interactions with a social partner can bootstrap increasingly complex behaviors such as social referencing. Our idea is that social referencing, as well as facial expression recognition, can emerge from a simple sensory-motor system involving emotional stimuli. Without knowing that the other is an agent, the robot is able to learn complex tasks if the human partner has some "empathy" or at least "resonates" with the robot's head (low-level emotional resonance). Hence, we advocate the idea that social referencing can be bootstrapped from a simple sensory-motor system not dedicated to social interactions.
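
As a minimal sketch of this bootstrapping idea (under simplifying assumptions, not the architecture actually used in the study), the code below associates the robot's expressed internal emotion with the visual features of a partner's resonating facial expression through a simple error-corrective associative update, so that expression recognition emerges without any explicit notion of a social agent; the feature encoding and learning rule are hypothetical.

```python
import numpy as np

# Minimal illustration of the general idea: the robot expresses an internal emotional
# state, the human partner 'resonates' by mirroring it, and an associative layer links
# the seen face to the expressed emotion, yielding expression recognition.

rng = np.random.default_rng(1)
n_emotions, n_visual = 4, 16
W = np.zeros((n_emotions, n_visual))            # associative weights: visual features -> emotion

def face_features(emotion):
    """Stand-in for visual features of the partner's face mirroring a given emotion."""
    proto = np.zeros(n_visual)
    proto[emotion * 4:(emotion + 1) * 4] = 1.0
    return proto + 0.1 * rng.standard_normal(n_visual)

# Learning phase: the robot expresses an emotion, the partner mirrors it,
# and the association between seen face and expressed emotion is reinforced.
for _ in range(500):
    e = rng.integers(n_emotions)                # robot's internal / expressed emotion
    v = face_features(e)                        # partner's resonating facial expression
    target = np.eye(n_emotions)[e]
    W += 0.01 * np.outer(target - W @ v, v)     # delta-rule associative update

# Recognition phase: the learned association now classifies novel expressions.
test = face_features(2)
print("recognized emotion:", int(np.argmax(W @ test)))
```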