1. Sensory experience: vision, etc.
2. Thoughts.
3. Feelings: pains, emotions.
4. Ourselves
Possible answers to why we are conscious:
1. Side effect of complex processing.
2. Useful for teaching.
3. Useful interrupt for parallel processing: resolve disputes.
4. Social: need to understand mental states of other people.
1. Dualism: consciousness is a spiritual state.
2. Identity theory: consciousness is a brain state.
3. Functionalism:
Consciousness is a big problem for functionalism, since it seems that we could have a computational system that does everything people do yet is not conscious at all. E.g. program the population of China to pass symbols around.
4. Eliminative materialism: consciousness is one of those features of folk psychology that will be eliminated as science develops.
5. Integrative materialism: Put together the best possible theory of consciousness that takes seriously phenomenology (individual experience), psychology (behavioral experiments), and neuroscience, perhaps within a computational framework. Owen Flanagan, Consciousness Reconsidered.
6. Mysterious materialism (McGinn): Consciousness is somehow material, but we will never be able to figure out how.
7. Consciousness is central to understanding the mind, but is somehow to be understood neurologically and non-computationally. John Searle, The Rediscovery of the Mind.
1. Finite modalities: 6 senses and stream of thought.
2. Unity: experiences are tied together.
3. Intentionality: we are conscious of things.
4. Subjective feeling
5. "Only a being that could have conscious intentional states could have intentional states at all." p. 132.
6. Figure-ground, gestalt structure.
7. Familiarity.
8. Overflow: reference beyond immediate content.
9. Center and periphery.
10. Boundary conditions.
11. Mood.
12. Pleasure/unpleasure dimension.
1. The unconscious: not a part of folk psychology?
2. Freud: many actions the result of unconscious (repressed?) desires.
3. Behaviorism: consciousness and unconsciousness both rejected as unscientific.
4. Modern cognitive science: most thinking is unconscious.
5. Searle (p. 152): "The notion of an unconscious mental state implies accessibility to consciousness."
David Chalmers, etc.:
I can imagine a being just like us physically but without consciousness.
So brains are not essential to consciousness.
University of Arizona Center for Consciousness Studies
David Chalmers on consciousness
Representational theories of consciousness
Cognitive science has been too concerned with internal representations and computations, and has neglected the role that interactions with the world play in human thought.
Moderate embodiment thesis: Language and thought are inextricably shaped by embodied action.
Extreme embodiment thesis: Embodiment refutes the computational-representational approach to mind.
1. Heidegger, Being and Time, 1927. See Dreyfus's commentary Being-in-the-World, 1991.
2. Rejection of view that knowledge can be made explicit. What matters is the non-representable background.
Emphasis not on beliefs, but on skills and practices.
Common-sense knowledge is knowledge-how, not knowledge-that.
Holism: knowledge is interconnected.
3. Philosophy = study of being.
Dasein: everyday human existence.
Existing = being-in-the-world. Vs. Descartes.
4. Unlike a computer, a human is always in a situation, attuned to the context.
1. Dreyfus, a Berkeley philosopher, is the main critic of AI.
2. Made some useful criticisms of the early hubris of AI:
Neglect of importance of background knowledge, context, common sense (cf. Lenat), holistic knowledge, skills.
What Computers Can't Do, 1972, 1979, 1992.
3. Concluded that AI is completely wrong-headed because it neglects the importance of context and situation, the body, and human needs.
4. Endorsed by Winograd and Flores, 1986: influence on AI work on natural language understanding, planning (Agre and Chapman)
5. Influence at Institute for the Learning Sciences: Clancey.
6. 1992: Dreyfus endorses the PDP approach
- likes its deemphasis of rules
- worries that it is not flexible enough to appreciate background
1. Grant that Dreyfus has pointed to aspects of knowledge neglected by AI.
2. But note that he has not appreciated the flexibility of AI in devising new ways to deal with knowledge, e.g. images, cases, frames (concepts).
3. Moreover, he has no real alternative theory.
4. Cognitive science needs many ideas and levels of explanation to be integrated in a whole: integrative materialism.
1. Brooks and Mackworth on robots: make robots responsive to the environment; don't build everything in (see the sketch after this list).
2. Gibson and Neisser on affordances: we don't have to represent the whole external world, because we can interact with it.
3. External representations: use diagrams and other kinds of external memory.
4. Situated action view of learning and problem solving.
Suchman et al: anthropological approach to workplace, learning situations. Ethnomethodology.
5. The importance of the body: Mark Johnson on The Body in the Mind.
Cf. image schemas.
Note also social aspect of situation - next class.
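A minimal sketch (in Python; not part of the original notes) of what Brooks-style reactive control in point 1 looks like: the robot acts on its current sensor readings rather than consulting a built-in model of the world. The sensor and motor functions are hypothetical stand-ins for real hardware.

    # Illustrative sketch only: a reactive agent with layered behaviors,
    # in the spirit of Brooks-style robotics. No internal world model is built;
    # each control cycle reacts directly to what is currently sensed.
    import random

    def sense():
        """Hypothetical stand-in for a sensor: distance to nearest obstacle (metres)."""
        return random.uniform(0.0, 2.0)

    def act(command):
        """Hypothetical stand-in for sending a motor command."""
        print(command)

    def control_step(obstacle_distance):
        """Layered rules: the higher-priority behavior overrides the lower ones."""
        if obstacle_distance < 0.3:      # avoidance has highest priority
            return "turn-away"
        if obstacle_distance < 1.0:      # slow down when an obstacle is near
            return "slow-forward"
        return "forward"                 # default wandering behavior

    if __name__ == "__main__":
        for _ in range(5):               # a few control cycles
            act(control_step(sense()))

The point of the sketch is only that coherent behavior can emerge from layered reactions to the environment, without everything being built in as explicit internal representations.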
5. Searle's Chinese room argument.
If I am merely an input-output device for symbol processing, then I don't understand Chinese.
Similarly, computers have syntax only, not semantics: their symbols get their meaning only from human users, not from relation to the world.
Computers lack intentionality, the property of being about the world.
Responses to Searle:
1. Ignore it? No: the points about the importance of the body and the physical environment are too convincing. We don't want disembodied minds.
2. Expand CRUM? Perhaps add new kinds of representations to deal with perceptual and visual images.
Note: Dreyfus endorses connectionism in 3rd edition of What Computers Can't Do. So maybe his complaint wasn't against CRUM, but against symbolic versions thereof.
3. Supplement CRUM? Add new ideas about the nature of interaction with the environment, e.g. control theory (Mackworth). Have robots that interact with the world and learn about it, thereby acquiring intentionality.
4. Abandon CRUM? No: the world challengers do not have a fully worked-out alternative theory of mind, and have little to say about the diverse kinds of problem solving and learning that rules, concepts, analogs, etc. can explain.
Computational Epistemology Laboratory.
This page updated Nov. 17, 2015.