(from M. Bazerman, Judgment in Managerial Decision Making)
1. Define the problem.
2. Identify the criteria.
3. Weight the criteria.
4. Generate alternatives.
5. Rate each alternative on each criterion.
6. Compute the optimal decision.
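Bazerman's six steps can be sketched as a simple weighted-sum calculation (a minimal illustration; the job-offer decision, criteria, weights, and ratings are all invented for the example):

```python
# Hypothetical decision: choosing between two job offers.
# Criteria, weights, and ratings are invented for illustration.
criteria_weights = {"salary": 0.5, "location": 0.3, "growth": 0.2}  # step 3

alternatives = {                                                    # step 4
    "Offer A": {"salary": 7, "location": 9, "growth": 5},           # step 5: ratings 0-10
    "Offer B": {"salary": 9, "location": 4, "growth": 8},
}

# Step 6: compute each alternative's weighted score and pick the best.
scores = {
    name: sum(criteria_weights[c] * rating for c, rating in ratings.items())
    for name, ratings in alternatives.items()
}
best = max(scores, key=scores.get)
print(scores)   # Offer A ≈ 7.2, Offer B ≈ 7.3
print(best)     # Offer B
```

The critiques that follow apply to exactly this kind of computation: in practice the weights are malleable, the ratings uncertain, and the problem definition itself contestable.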
1. Bounded rationality
2. Criteria are malleable, hard to compare.
3. Uncertainty and risk.
4. Neglect role of emotion in decision making.
5. Framing affects the decision problem. (see class 9b).
1. Maximize expected utility.
Expected utility (action) = Sum over outcomes of probability(outcome) × utility(outcome).
2. Utility ≠ money: diminishing marginal utility.
3. But we often don't know probabilities and utilities.
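The formula and the diminishing-marginal-utility point can be illustrated together (the gamble and dollar amounts are invented; the concave square-root utility is just one conventional way to model diminishing marginal utility):

```python
import math

def expected_utility(outcomes, utility=lambda x: x):
    """EU(action) = sum over outcomes of P(outcome) * U(outcome)."""
    return sum(p * utility(x) for p, x in outcomes)

# A hypothetical gamble: 50% chance of $100, 50% chance of $0,
# versus a sure $50.
gamble = [(0.5, 100), (0.5, 0)]
sure = [(1.0, 50)]

# With utility = money, the two are equal in expectation:
print(expected_utility(gamble))             # 50.0
print(expected_utility(sure))               # 50.0

# With diminishing marginal utility (concave sqrt), the sure thing wins:
print(expected_utility(gamble, math.sqrt))  # 5.0
print(expected_utility(sure, math.sqrt))    # ≈ 7.07
```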
1. Framing: structuring the question.
2. Gathering intelligence.
3. Coming to conclusions.
4. Learning from feedback.
1. Beliefs about what actions are possible.
2. Beliefs about what goals are relevant.
3. Beliefs about how various actions contribute to various goals:
Ask: What are the likely outcomes of your potential actions?
So error tendencies from the first half of the class are relevant to decision making, e.g. spurious causal theories, motivated inference, communication distortions, and bogus authority.
Plunging in (Russo, ch. 1)
Beginning to gather information and reach conclusions without thinking the issue through or thinking about how the decision should be made.
Frames are mental structures that people create to simplify and organize the world. They include all the ingredients of Bazerman's steps in rational decision making: problem definition, criteria (goals), and alternatives (options). They also include:
1. Not weighting equal costs consistently, e.g. losing a ticket vs. losing $30.
This can also be a yardstick error.
2. Sunk costs: failure to exclude the cost of past investment from calculation of future benefits.
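The sunk-cost point can be shown with invented numbers: money already spent is the same whichever option is chosen, so it should drop out of the comparison of future costs and benefits.

```python
# Hypothetical project decision; all dollar figures invented.
sunk = 500             # already spent on Project X; irrecoverable either way

# Expected future value = future benefit - remaining future cost.
finish_x = 800 - 400   # finish X: spend $400 more for an $800 benefit
switch_y = 900 - 300   # abandon X, spend $300 on Y for a $900 benefit

# The rational comparison looks forward only: Y wins (600 > 400).
assert switch_y > finish_x

# The sunk-cost error charges the past $500 against switching
# ("we'd waste what we already spent"), which reverses the choice:
biased_y = switch_y - sunk
assert biased_y < finish_x   # 100 < 400: the error keeps us on X
```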
Framing losses as more important than gains. E.g. building a road not to lose lives versus building a road to save lives.
What kind of highway to build, A or B?
1. A. Save 2,000 lives.*
B. 1/3 probability of saving 6,000 lives, but 2/3 probability of saving no lives.
2. C. Lose 2,000 lives.
D. 2/3 probability of losing no lives, but 1/3 probability of losing 6,000 lives.*
People prefer A (go for sure thing for gains) and D (take a risk to keep from losing).
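A quick calculation shows why this preference pattern is a framing effect: within each pair the options have the same expected number of lives, so nothing but the gain/loss wording distinguishes them.

```python
def expected_value(outcomes):
    """Probability-weighted sum over (probability, value) pairs."""
    return sum(p * v for p, v in outcomes)

# Gain frame (lives saved):
A = expected_value([(1.0, 2000)])               # sure thing
B = expected_value([(1/3, 6000), (2/3, 0)])     # risky option

# Loss frame (lives lost, as negatives):
C = expected_value([(1.0, -2000)])              # sure thing
D = expected_value([(2/3, 0), (1/3, -6000)])    # risky option

# Each pair has the same expectation (2,000 lives either way), yet
# people prefer A in the gain frame and D in the loss frame.
print(A, B, C, D)
```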
It is crucial whether the problem is framed in terms of losses or gains, e.g. the mug (endowment effect) experiments.
Kahneman & Tversky's prospect theory.
Losses loom larger than gains.
We will pay a premium to avoid losses (risk seeking) that we won't pay to achieve equivalent gains (risk averse).
People hate pay cuts much more than they are bothered by foregone increases.
E.g. a planned $1,000 raise cancelled versus a $1,000 pay cut.
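Prospect theory locates this asymmetry in the shape of the value function: concave for gains, convex and roughly twice as steep for losses. A sketch using Tversky & Kahneman's (1992) median parameter estimates; the $1,000 figures are just the raise/pay-cut example:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, convex and
    steeper for losses (lam > 1 is loss aversion). alpha and lam are
    Tversky & Kahneman's (1992) median estimates."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# Same $1,000, different sign: the pay cut hurts 2.25x more than the
# foregone raise pleases.
foregone_raise = prospect_value(1000)   # ≈ 437 (value of the gain forgone)
pay_cut = prospect_value(-1000)         # ≈ -983
print(-pay_cut / foregone_raise)        # ≈ 2.25
```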
Metaphors and analogies are a powerful source of understanding (see K. Holyoak and P. Thagard, Mental Leaps), but they can also be very misleading.
Marriage is a ...? Partnership, battlefield, prison, contract, etc.
Quebec separation would be like ...? The U.S. Civil War, Czechoslovakia, Yugoslavia, marital divorce, etc.
1. Know your own frames. Use frame analysis worksheet.
2. Know the frames of others, i.e. how they view the decision situation.
3. Reframe by generating alternative frames and selecting the best.
4. Challenge your own frame by seeking other opinions, role-playing your adversaries and others, brainstorming, trying other analogies and metaphors, and monitoring the changing fit of your frames.
Frame blindness (Russo, ch. 2)
Tendency to solve the wrong problem because your mental framework prevents you from seeing the best options and important objectives.
Inconsistent weighting of costs (Russo, ch. 2)
Tendency to understand costs and losses differently in different situations, even when the costs and losses should be the same.
Sunk costs (Russo, ch. 2)
Tendency to make decisions on the basis of past investment rather than expected future value.
Framing losses as more important than gains (Russo, ch. 2)
Tendency to become risk seeking in order to avoid losses.
Bad metaphors or analogies (Russo, ch. 2)
Tendency to frame a decision using metaphors or analogies that give a misleading understanding of the problem situation.
Lack of frame control (Russo, ch. 3)
Tendency to define the problem in only one way or to be unduly influenced by the frames of others.
Updated March 12, 2003