Phil/Psych 446, Cognitive Modelling, Week 9

Emotional cognition

Why has Cognitive Science tended to ignore emotions?

Many philosophers, AI researchers, and even psychologists have taken emotions to be peripheral to cognition.

Most philosophers and AI researchers have believed that emotional thought is inferior to rational thought.

Emotions involve feelings and consciousness, which are hard to study scientifically.

Some research on emotions

The Emotion Home Page

MIT Affective Computing Laboratory

University of Birmingham Cognition and Affect Project

The Geneva Emotion Research Group

Kismet: A robot for social interactions

Why emotions are important to cognition

People's thinking is often emotional (diary studies by Keith Oatley and others).

Basic emotions (controversial list): happiness, sadness, anger, fear, disgust, surprise.

Thinking can have emotional outputs, e.g. thinking that you've won the lottery can make you happy.

Thinking can have emotional inputs, e.g. being happy at the thought of going to a hockey game may make you decide to go to the game.

Even stronger: emotion may permeate human thinking and be inseparable from it. Is this good or bad?

Antonio Damasio (Descartes' Error): people with brain damage that interferes with the effects of emotion on cognition make bad decisions.

Empathy is an important part of understanding other people.

Computational approaches to emotion

Goals of cognitive modelling of emotions:

Model reasoning about emotions.

Model the interactions of emotion and cognition.

Semantic networks

Add nodes to the network for happy, sad, etc.

Use associations as retrieval cues, as in Gordon Bower's idea of mood-dependent memory.
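A toy sketch of this idea follows; the node names, weights, and boost factor are illustrative assumptions, not part of Bower's model. Emotion nodes are linked to concept nodes, and the current mood boosts retrieval of memories associated with it.

```python
# Toy semantic network with emotion nodes ("happy", "sad") linked to
# concept nodes by weighted associations. Mood-dependent retrieval is
# sketched by letting the current mood boost associated memories.
# All node names and weights are illustrative assumptions.

network = {
    "happy":   {"party": 0.6, "sunshine": 0.4},
    "sad":     {"funeral": 0.6, "rain": 0.4},
    "weekend": {"party": 0.5, "funeral": 0.5, "rain": 0.3},
}

def retrieve(cue, mood, top_n=3):
    """Score memories associated with the cue, boosted by the current mood."""
    scores = {}
    for node, weight in network.get(cue, {}).items():
        mood_boost = network.get(mood, {}).get(node, 0.0)
        scores[node] = weight + 0.5 * mood_boost
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]

print(retrieve("weekend", mood="happy"))   # "party" ranks first
print(retrieve("weekend", mood="sad"))     # "funeral" ranks first
```

The same cue ("weekend") retrieves different memories depending on the mood node, which is the flavor of mood-dependent memory the lecture mentions.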

Limitations:

Rule-based systems

Construct rules to describe emotions, e.g. rules that link typical situations to the emotions they produce (a sketch follows below).
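One way such rules might look, sketched as simple condition-action pairs; the particular rules are invented for illustration and are not from the lecture.

```python
# Toy rule-based sketch: rules map described situations to emotions.
# These rules support reasoning ABOUT emotions rather than emotional
# reasoning itself (the limitation noted below).

RULES = [
    # (condition over a set of facts, inferred emotion)
    (lambda f: "goal_achieved" in f,                "happiness"),
    (lambda f: "goal_blocked" in f,                 "anger"),
    (lambda f: "loss" in f,                         "sadness"),
    (lambda f: "threat" in f and "escape" not in f, "fear"),
]

def infer_emotions(facts):
    """Forward-chain once over the rule set and collect inferred emotions."""
    return [emotion for condition, emotion in RULES if condition(facts)]

print(infer_emotions({"goal_achieved"}))            # ['happiness']
print(infer_emotions({"threat", "goal_blocked"}))   # ['anger', 'fear']
```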

Limitation: models reasoning about emotion, but not emotional reasoning.

Frame systems

Construct frames to represent typical emotions.

Example: a frame for happiness (sketched below).
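A sketch of what such a frame might contain; the slot names and fillers are assumptions for illustration, not a frame given in the lecture.

```python
# Sketch of a frame for happiness, represented as a Python dict.
# Slot names and fillers are illustrative assumptions.
happiness_frame = {
    "is_a": "emotion",
    "valence": "positive",
    "typical_causes": ["goal achieved", "good news", "pleasant surprise"],
    "typical_expression": ["smiling", "laughing"],
    "action_tendencies": ["approach", "continue current activity"],
    "related_emotions": ["joy", "contentment"],
}

def describe(frame):
    """Print the frame's slots and fillers."""
    for slot, filler in frame.items():
        print(f"{slot}: {filler}")

describe(happiness_frame)
```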

Analogy

To a structure (such as an instance frame) representing a stored case or situation, attach an emotional tag. Then analogical inference can be used to transfer emotions from one situation to a similar one. For example, if you liked movie X, and movie Y is similar to X, then you will probably like Y.
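A toy sketch of the movie example; the cases, similarity measure, and threshold are assumptions made for illustration.

```python
# Toy sketch of analogical transfer of an emotional tag: if a new case is
# similar enough to a stored case, transfer the stored case's valence to it.
# Cases are feature sets; similarity is Jaccard overlap.

def similarity(a, b):
    """Jaccard similarity between two feature sets."""
    return len(a & b) / len(a | b)

# Stored case: movie X, which you liked (positive valence tag).
movie_x = {"features": {"comedy", "ensemble_cast", "1990s"}, "valence": +1.0}

# New case: movie Y, similar to X but with no emotional tag yet.
movie_y = {"features": {"comedy", "ensemble_cast", "2000s"}}

sim = similarity(movie_x["features"], movie_y["features"])
if sim > 0.4:                                        # threshold is arbitrary
    movie_y["valence"] = sim * movie_x["valence"]    # transfer a scaled tag

print(f"similarity={sim:.2f}, predicted valence for Y={movie_y.get('valence', 0):.2f}")
```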

Empathy is also analogical transfer of emotion: I understand how you are feeling by imagining myself in a similar situation and then transferring my emotional state to you.

Analogies can also be used to generate emotions, for example in jokes. For analogical jokes and further discussion, see P. Thagard and C. Shelley, "Emotional Analogies."

Dynamic systems

Instead of trying to model emotions verbally, use systems of equations to describe changes in emotional states.
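A minimal sketch of the dynamical-systems idea: emotional state as variables evolving under coupled difference equations. The particular equations and parameters are invented for illustration and do not correspond to any published model.

```python
# Toy dynamical system: two state variables, arousal and valence, updated
# by coupled difference equations. A positive stimulus drives both upward;
# when the stimulus stops, both decay back toward zero.

def step(arousal, valence, stimulus, dt=0.1):
    """One Euler step of a toy emotional-state system."""
    d_arousal = -0.5 * arousal + abs(stimulus)   # arousal decays, driven by stimulus intensity
    d_valence = -0.3 * valence + 0.5 * stimulus  # valence decays, driven by stimulus sign
    return arousal + dt * d_arousal, valence + dt * d_valence

arousal = valence = 0.0
for t in range(100):
    stimulus = 1.0 if t < 40 else 0.0            # a positive event, then nothing
    arousal, valence = step(arousal, valence, stimulus)
    if t % 25 == 0:
        print(f"t={t}: arousal={arousal:.2f}, valence={valence:.2f}")
```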

HOTCO: A neural network model of emotional coherence

HOTCO stands for "hot coherence." It extends the coherence models described in week 8 to model emotional cognition.

The main description of HOTCO is in P. Thagard, Coherence in Thought and Action, MIT Press, 2000, ch. 6. A briefer account is in the articles "Emotional Analogies" and "Why OJ Wasn't Convicted."

Structures

Elements (e.g. concepts, propositions, images) are represented by units in a localist neural network.

Units have excitatory and inhibitory links between them, and activations that represent the acceptability of the element that the unit represents.

But they also have valences, which represent positive and negative emotions.

Units can also have excitatory and inhibitory valence links, which represent emotional connections.
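A possible data-structure sketch for these elements, not the actual HOTCO code: each unit carries an activation and a valence, and links (activation or valence) carry signed weights. The example unit names are assumptions.

```python
# Sketch of HOTCO-style structures: each unit has an activation
# (acceptability) and a valence (emotional value); links carry signed
# weights and can be activation links or valence links.
from dataclasses import dataclass

@dataclass
class Unit:
    name: str
    activation: float = 0.01   # acceptability of the represented element
    valence: float = 0.0       # positive or negative emotional value

@dataclass
class Link:
    i: str                     # source unit name
    j: str                     # target unit name
    weight: float              # positive = excitatory, negative = inhibitory
    kind: str = "activation"   # "activation" or "valence"

units = {name: Unit(name) for name in ("guilty", "innocent", "alibi", "sympathy")}
links = [
    Link("guilty", "innocent", -0.2),                   # competing hypotheses inhibit each other
    Link("alibi", "innocent", 0.1),                     # evidence supports a hypothesis
    Link("sympathy", "innocent", 0.1, kind="valence"),  # an emotional connection
]
```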

Procedures

Just as activations spread through a neural network, so do valences.

Valences spread along valence links, and also along activation links.

Overall emotional assessment is measured by the valence of a unit after the network has settled.

The valence of a unit j, Vj, is calculated by an equation very like the one used for updating activations (week 8):

Vj(t+1) = Vj(t)(1-d) + NETj(max - Vj(t)) if NETj > 0,
Vj(t+1) = Vj(t)(1-d) + NETj(Vj(t) - min) otherwise.

Here d is a decay parameter (e.g. .05) that decrements each unit's valence on every cycle, min is the minimum valence (e.g. -1), and max is the maximum valence (e.g. 1).

Based on the weight Wij between each unit i and j, we can calculate NETj , the net valence input to a unit, by:

NETj = SUMi WijAi(t)Vi(t).

Note that this makes the valence input to a unit analogous to a calculation of expected utility (total valence) based on probability (activation) and utility (valence), although the calculation is done much more locally.
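A sketch of the valence-updating procedure in code, following the equations above. The small example network, its weights, activations, and initial valences are assumptions for illustration; this is not the actual HOTCO code, and activations are treated as already settled.

```python
# Sketch of HOTCO-style valence updating: valences spread over weighted
# links, with the net input to a unit combining the linked units'
# activations and valences (the "expected utility"-like calculation).
import numpy as np

d, vmin, vmax = 0.05, -1.0, 1.0     # decay, minimum valence, maximum valence

# Example network: 3 units with symmetric valence weights W, settled
# activations A, and initial valences V (all values are assumptions).
W = np.array([[ 0.0, 0.3, -0.2],
              [ 0.3, 0.0,  0.0],
              [-0.2, 0.0,  0.0]])
A = np.array([0.8, 0.6, 0.4])       # activations (acceptability)
V = np.array([0.0, 0.5, -0.5])      # valences (emotional inputs on units 1 and 2)

for cycle in range(200):
    net = W @ (A * V)                             # NETj = SUM_i Wij * Ai * Vi
    grow = np.where(net > 0, vmax - V, V - vmin)  # which branch of the update applies
    V = np.clip(V * (1 - d) + net * grow, vmin, vmax)

print("settled valences:", np.round(V, 3))
```

Unit 0 starts with no valence but ends up positive, because it is excitatorily linked to a positively valenced unit and inhibitorily linked to a negatively valenced one: its overall emotional assessment emerges from the settled network.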

Applications

Intuition

Gut feelings are intuitions that emerge from emotional parallel constraint satisfaction.

Trust

The decision to trust or not to trust someone usually involves an emotional judgment based on explanatory, conceptual, and analogical coherence.

Emotional analogies

Empathy is a kind of analogical transfer that creates links between the source and the target that allow valences to spread.

Persuasion works the same way.

Metacoherence

Overall emotional reactions such as happiness, sadness, and surprise require activation of nodes that compute the general state of the network.

Happiness: activated units have positive valence.

Sadness: activated units have negative valence.

Surprise: units have undergone sharp changes in activation or valence.
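A sketch of how such metacoherence signals might be computed from the settled network; the measures and thresholds are assumptions, not the actual HOTCO code.

```python
# Sketch of metacoherence signals: summary nodes that look at the overall
# state of the network after it settles.
import numpy as np

def metacoherence(activations, valences, prev_activations, prev_valences):
    """Summarize the network's overall emotional state."""
    active = activations > 0.0                     # units that ended up accepted
    happiness = float(np.mean(valences[active] > 0)) if active.any() else 0.0
    sadness   = float(np.mean(valences[active] < 0)) if active.any() else 0.0
    surprise  = float(np.max(np.abs(activations - prev_activations))
                      + np.max(np.abs(valences - prev_valences)))
    return {"happiness": happiness, "sadness": sadness, "surprise": surprise}

# Example with assumed values: most activated units carry positive valence.
a = np.array([0.8, 0.6, -0.4])
v = np.array([0.7, 0.5, -0.6])
print(metacoherence(a, v,
                    prev_activations=np.array([0.1, 0.1, 0.1]),
                    prev_valences=np.zeros(3)))
```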

Ethics of Artificial Intelligence

If computers can think, should we let them?

GAGE: Spiking neural network model of cognitive-affective integration.

See Wagar & Thagard (2004).

Key brain areas:

Prefrontal cortex: responsible for reasoning.
Ventromedial PFC (VMPFC): connects input from the sensory cortices with the amygdala and other emotion-related areas.
Amygdala: processes emotional signals, especially fear; receives somatic input.
Nucleus accumbens: processes emotional signals, especially reward.
Hippocampus: crucial for memory formation.

How GAGE explains the Phineas Gage case:

Damasio: Effective decision making depends on integration of cognitive information with somatic markers.
Damage to VMPFC prevents this integration.
GAGE shows a plausible mechanism for integration that is disrupted by VMPFC damage.

Processes:

Emotional valence requires coordination of VMPFC, amygdala, hippocampus and nucleus accumbens.
Coordination occurs through temporally coordinated spiking patterns; analogy: musicians in a band keeping time with one another (a toy sketch follows below).
Output: a positive or negative attitude toward a contemplated action.
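A toy coincidence-detection sketch of the coordination idea, not the actual Wagar and Thagard spiking model: a downstream "nucleus accumbens" unit responds only when its VMPFC and amygdala inputs spike within the same narrow time window. All rates, window sizes, and labels are assumptions.

```python
# Toy sketch of temporal coordination: cognitive (VMPFC) and somatic
# (amygdala) spike trains drive a downstream unit only when they coincide,
# so uncoordinated inputs (as after VMPFC damage) have much less effect.
import random

random.seed(0)
T = 1000  # timesteps

def random_train(rate):
    """Independent random spike train."""
    return {t for t in range(T) if random.random() < rate}

def jittered_copy(train, jitter=2):
    """A spike train locked to another one, with small random jitter."""
    return {min(T - 1, t + random.randint(0, jitter)) for t in train}

def coincidences(a, b, window=3):
    """Count spikes in a that have a spike in b within `window` timesteps."""
    return sum(1 for t in a if any(u in b for u in range(t - window, t + window + 1)))

vmpfc = random_train(0.05)               # "cognitive" input
amygdala_locked = jittered_copy(vmpfc)   # somatic input coordinated with VMPFC
amygdala_free = random_train(0.05)       # somatic input not coordinated

print("coordinated inputs drive the output:  ", coincidences(vmpfc, amygdala_locked))
print("uncoordinated inputs drive the output:", coincidences(vmpfc, amygdala_free))
```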

Future Applications:

Camerer on neuroeconomics

Assignment

Modelling assignment 5 is due March 31. See assignments.


Phil/Psych 446

Computational Epistemology Laboratory.

Paul Thagard

This page updated March 14, 2005