Phil 145, Week 2

2a Data misinterpretation

Thinking Strategy: Causal Reasoning

A and B occur together.

A occurs before B.

Changing A changes B.

A is a better explanation of B than other possible causes.

Therefore, A causes B.

Examples: sex causes pregnancy, HIV causes AIDS.

Bad Causal Reasoning

Spurious causal theories

A causes B (belief acquired from social prejudice and other unreliable sources)

So A and B occur together.


Being a woman causes mechanical incompetence.

So there's another bad woman driver.

Note how this can interact with other error tendencies such as the Clustering Illusion and Confirmation Bias (below).


A is similar to B.

So A is causally related to B.


Red hair is similar to fire which is hot.

So redheads are hot-tempered.

Portfolio Instructions

For your portfolio: try to find examples of these in the reasoning you encounter in your friends, on TV, in newspapers, etc. For each error tendency, provide:

1. brief description of the error tendency;

2. brief description of the observed thinking;

3. explanation of how the observed thinking illustrates the error tendency.

Confirmation bias

Example: Do students who drink a lot of alcohol get poor grades?

                   Poor grades                      Good grades
 Drinks a lot      don't just focus on this cell    look for counterexamples
 Drinks a little   also relevant                    also relevant

Need to consider all 4 cells.
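The four-cell point can be made concrete with made-up counts (the numbers below are purely illustrative, not data). Looking only at the top-left cell makes heavy drinking seem linked to poor grades; comparing the *rate* of poor grades across rows shows no association:

```python
# Hypothetical counts for the drinking/grades question (illustrative only).
#                   poor grades   good grades
# drinks a lot          30            70
# drinks a little       15            35

a_poor, a_good = 30, 70   # drinks a lot
b_poor, b_good = 15, 35   # drinks a little

# Top-left cell alone: "30 heavy drinkers with poor grades!" sounds confirming.
# The right comparison is the rate of poor grades in each row.
rate_heavy = a_poor / (a_poor + a_good)
rate_light = b_poor / (b_poor + b_good)

print(rate_heavy, rate_light)  # 0.3 0.3 -- same rate, so no association
```

The 30 confirming cases impress us only because we neglect the other three cells.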

We tend to be over-impressed by confirming information; this helps sustain stereotypes. If we hold a bad causal theory that A causes B, we will notice only the A's that are B's.

The bias operates both in what we notice and in what further information we seek. It also operates in memory.

Tendency for people to find whatever is presented as evidence for their own position. This holds for cold cognition as well as hot: the bias does not require emotional investment.

Beliefs tend to persevere once in existence. This is not all bad: you can't be changing your mind all the time.

The problem of absent data:

Sometimes the relevant statistics don't exist, e.g. low GRE students in grad school. We never get to observe what would have happened if we had acted differently, because students with low GREs are not admitted.

             High GRE            Low GRE
 Do well     lots of examples    no information
 Do poorly                       no information

Another example, free trade:

                 unemployment up             unemployment down
 free trade
 no free trade   no information available    no information available


Self-fulfilling prophecies.

E.g. identifying certain students as smart.

Self-fulfilling prophecies are a special case of hidden-data phenomena.

They turn little effects into big effects, e.g. a student labelled as weak becomes despondent and stops studying.

Reasoning Strategy: Correlation

A and B tend to occur together.

Not-A and not-B tend to occur together.

A and not-B do not tend to occur together.

Not-A and B do not tend to occur together.

Therefore, A and B are positively correlated.

Consider all four cells:

          B                                Not B
 A        don't just focus on this cell    look for counterexamples
 Not A
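One way to make the four-cell check concrete is to compare how often B occurs when A is present versus when A is absent. The counts below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Hypothetical 2x2 counts:
#            B     not-B
#   A        40      10
#   not-A    20      30

ab, a_notb = 40, 10
nota_b, nota_notb = 20, 30

# Frequency of B among the A cases vs. the not-A cases.
p_b_given_a = ab / (ab + a_notb)               # 0.8
p_b_given_nota = nota_b / (nota_b + nota_notb)  # 0.4

# Positive correlation: B is more frequent when A is present.
print(p_b_given_a > p_b_given_nota)  # True
```

The comparison uses all four cells; the top-left cell by itself cannot establish a correlation.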



Error Tendencies

Confirmation bias (Gilovich ch. 3)

Tendency to seek information that supports your views and to ignore information that contradicts them.

The problem of absent data (Gilovich ch. 3)

Tendency to be over-confident about conclusions despite the absence of relevant information.

Self-fulfilling prophecies (Gilovich ch. 3)

Tendency for expectations to affect the world in ways that make the expectation true.

Gambler's fallacy (Schick, ch. 3)

Tendency to view chance as a self-correcting process in which a deviation in one direction is corrected in the opposite direction, e.g. expecting tails after a string of heads.
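A quick simulation shows why the gambler's fallacy is a fallacy: a fair coin has no memory, so even right after a run of three heads, the next flip is still heads about half the time. (A sketch; the streak length and flip count are arbitrary choices.)

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

# Simulate fair coin flips; True = heads.
flips = [random.random() < 0.5 for _ in range(100_000)]

# Collect the flip that follows every run of three heads.
after_streak = [flips[i + 3]
                for i in range(len(flips) - 3)
                if flips[i] and flips[i + 1] and flips[i + 2]]

# If chance were "self-correcting," tails would dominate here. It doesn't.
frac_heads = sum(after_streak) / len(after_streak)
print(round(frac_heads, 2))  # close to 0.5: no correction toward tails
```

The deviation is never "corrected"; it is merely diluted by later flips.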


2b Biased Evaluation of Ambiguous Data

Scientific bias.

Question: Is this class biased against other modes of thought? It assumes science is the best way of knowing.

Contrast new age, meditation (Natural Law Party), magic.

Justify bias: choose the most reliable, intersubjective methods. Truth?

E.g. Velikovsky: Worlds in Collision.

Fraud in science does happen. So does incompetence. But it's still the best way to go.

Ambiguity and bias


People tend to interpret the ambiguous in the direction of the rest of their knowledge. Look for coherence. Mixed body of evidence can take people in either direction.

People may spend MORE time looking at the negative, but spend that time figuring out how to discount it.

Science uses blinded observers to help to overcome this.


People see vague information as uncannily descriptive.

e.g. astrology.

Compare negative horoscopes: because they are less vague and less flattering, they fit less well with motivated reasoning.

Fortune tellers: tell vague story that is made coherent by the listener.


Do people remember successes and forget failures?

Asymmetries can cause differential recall:

1. Hedonic: notice what is unpleasant, e.g. chores.

Everyone thinks they do more of the work.

Bread always falls jelly side down. Murphy's law.

1-sided outcomes.

2. Pattern: e.g. making a great play in the field and then coming up to bat; the pattern is memorable.

3. Definitional: e.g. only recover after hitting rock bottom.

4. Base-rate departures: events that depart from the base rate stand out, e.g. an unexpected cancer remission.

Conclusion: people tend to see what they expect to see, as well as what they want to see.

Overconfidence (Russo, ch. 4)

Question: Do you think that people tend to have too much confidence in their opinions, too little, or about right?

Psychological experiments find that people tend to put too much trust in their own opinions.

Watson (attributed, possibly apocryphal): the world market will only ever need a handful of computers.

Gates (attributed, almost certainly apocryphal): 640K of memory ought to be enough for anybody.

Overconfidence encourages confirmation bias.

Overconfidence can also encourage insufficient anchor adjustment, i.e. getting stuck with an initial biased estimate.

Overconfidence also encourages hindsight bias, believing after the fact that you were right all along.

Corrective strategies:

Error Tendencies

Ambiguity (Gilovich, ch. 4)

Tendency to interpret ambiguous (more than one meaning) information in ways that fit our preconceptions.

Vagueness (Gilovich, ch. 4)

Tendency to interpret vague (no clear meaning) information in ways that fit our preconceptions.

Asymmetric recall (Gilovich, ch. 4)

Tendency to remember only one side of a situation, e.g. the unpleasant side.

Overconfidence in your judgment (Russo, ch. 4)

Tendency to fail to collect key factual information because of being too sure of assumptions and opinions.

Insufficient anchor adjustment (Russo, ch. 4)

Tendency to let an arbitrary starting point bias a final answer.

Hindsight bias (Russo, ch. 8)

Tendency to misremember your earlier attitudes based on later knowledge of outcomes.

Page updated Jan. 13, 2003
