Lecture notes

Phil/Psych 256
March 18, 1997

Physical Environments:

	Being-in-the-world

	Ecological Psychology

	Robotics

	Relationship to CRUM

Q: What is being-in-the-world?

	1. Heidegger argued against a "technological" understanding 
	of mind (CRUM) (cf. Dreyfus)

	2. Heidegger emphasized 

		- action over representation
		- skills and practices over procedures
		- embeddedness over modeling

	3. Unlike a computer, a human always acts in a situation

	4. Describe mind in copulas, participles, and gerundives:

	- "He believes the world is round" -->
	  "He is that the world is round"
	- a hammer is simply part of hammering
	- being-in-the-world (Dasein)

	5. Dreyfus suggests neural networks overcome these points

Q: What is ecological psychology?

	1. Gibson argued that perception is direct rather 
	than hypothetical

	2. We perceive "structural invariants" rather than 
	"signals," e.g.,

		- age and strain in face recognition
		- Omaha drum ceremony

	3. Action is governed by environmental "affordances," 
	rather than representational schemas (instructions), e.g.,

		- piano playing

	Affordances change with expertise.

Q: What is GOFAIR?

	Good Old Fashioned Artificial Intelligence and Robotics 
	(Mackworth):

	- assume agents are alone in the world
	- assume the environment is static and deterministic 
	(predictable)
	- assume agents are "omniscient":
		- their beliefs are correct
		- their knowledge is complete/perfect
	- assume actions are discrete and serial

	Omniscient Fortune Teller Assumptions 
	(OFTA - Mackworth)

	Mackworth and Brooks argue that these assumptions are 
	unjustified

Q: Can robotics address these limitations?

	1. Switch from "off-line" to "on-line" processing

	2. Couple perception tightly with action
	("bottom-up" rather than "top-down" design)

	3. Work from simple behaviours to intelligent behaviours

	4. Work in a dynamic environment

	5. Take risks and correct errors

	Navigation requires these abilities:

		- locomotion
		- obstacle/threat avoidance
		- goal seeking

	Humans do more than navigate
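
	The on-line, tightly coupled design sketched above can be
	illustrated with a toy sense-act loop: behaviours are layered
	from obstacle avoidance down to plain locomotion, and no world
	model is built. The behaviour names, threshold, and sensor
	values here are invented for illustration.

```python
def reactive_step(obstacle_ahead: bool, goal_bearing: float) -> str:
    """Choose one action per sense-act cycle, directly from percepts."""
    if obstacle_ahead:               # highest-priority behaviour: avoidance
        return "turn-away"
    if abs(goal_bearing) > 0.1:      # next priority: orient toward the goal
        return "turn-toward-goal"
    return "move-forward"            # default behaviour: locomotion

# One simulated run: the robot first avoids, then orients, then moves.
actions = [reactive_step(True, 0.5),
           reactive_step(False, 0.5),
           reactive_step(False, 0.0)]
```

	Because each cycle maps percepts straight to an action, the
	controller works in a dynamic environment: if the world changes,
	the next cycle simply reacts to the new percepts.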

Q: What about embedded language?

	1. There are many "bodily" metaphors, e.g.,

		- the head of the corporation
		- grasp an idea, kick it around
		- pick up information
		- the long arm of the law

	2. These may reflect the role of our bodies in our thinking 
	(Lakoff & Johnson)

	3. Humanoid bodies may be required for robots to develop 
	similar representations, "cognobotics" (Brooks)

Q: Can CRUM handle physical environments?

	A1. Expand:

		- add image representations, which may answer objections 
		about embodiment

		- subtract some representations of the external world, 
		relying instead on external representation

		- increase the use of non-deterministic (probabilistic) 
		procedures and representations, allowing for 
		action + correction

		- provide computers/robots with intrinsic motivations

	A2. Supplement:

		- neuroscience may obviate the need to view all brain 
		processes in computational terms, restricting CRUM to 
		"high-level" cognition
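
	The "action + correction" idea under A1 can be illustrated with
	a toy feedback loop: instead of computing a perfect plan in
	advance, the agent acts with a noisy effect and then corrects
	from the sensed error. The noise model, gain, and numbers are
	invented for illustration.

```python
import random

def move_with_correction(position, target, steps=20, gain=0.5):
    """Each step: act (with random error), sense the new error, correct."""
    rng = random.Random(0)           # fixed seed so the run is repeatable
    for _ in range(steps):
        noise = rng.uniform(-0.1, 0.1)   # every action is imperfect
        position += gain * (target - position) + noise
    return position

final = move_with_correction(0.0, 10.0)
# final ends up near the target even though no single action was exact
```

	The point is that risk-taking plus error correction can replace
	exact prediction: imperfect actions, repeatedly corrected,
	still converge on the goal.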

Thursday:
		- social environment
		- Durfee

Phil/Psych 256
March 20, 1997

Social Environments:

	Biology and social intelligence

	Anthropology

	Distributed Artificial Intelligence

	Relationship to CRUM

Q: Is human intelligence intrinsically social?

	1. "Machiavellian monkey" hypothesis (Jolly)

	2. Encephalization Quotient (Jerison)

	3. Gene/meme analogy (Dawkins)

Q: What is culture?

	A1. A shared set of schemata for conducting relationships, 
	e.g., romance, marriage, kinship, friendship
		- role playing (Moffat)

	A2. Social institutions or conventions, e.g., transactions, 
	education, marriage (Quinn) 

	A3. Theories about culture itself, e.g., religion, morality, 
	castes... (Evans-Pritchard)

Example: The American model of marriage (Quinn)

	1. A romantic relationship:
		- sharedness
		- mutual benefit
		- lastingness

	2. A voluntary relationship:
		- compatibility
		- difficulty
		- effort
		- success/failure
		- risk

	3. An institution:
		- utilitarian exchange
		- contract

	4. A religious entity:
		- duty to raise a family
		- cheating is a sin

Example: Cognition in the wild (Hutchins)

	Naval navigation as a distributed task:

	1. Individuals are assigned responsibility and authority 
	for goals and subgoals (hierarchical/vertical)

	2. Roles/relationships provide varying latitude for 
	behaviour (horizontal)

	3. Cognitive artifacts facilitate distributed computation, 
	e.g., maps, logbooks, compasses, GPS, radar...

	4. Vertical organization increases confirmation bias, 
	horizontal organization increases indecision

Q: What is a "social network"?

	Simple analogy between networks and social agents (Hutchins)

	Network					Agent
	------------				-----------
	pattern of links			who talks to whom
	pattern of weights			what they talk about
	activation of links			persuasiveness 	
	network settling			reaching consensus	
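
	The analogy above can be made concrete with a toy "settling"
	computation: agents hold numerical opinions, and each round
	every agent moves part way toward the group average until the
	network settles into consensus. The update rule and all numbers
	here are invented for illustration.

```python
def settle(opinions, weight=0.5, steps=50):
    """Each step, every agent moves a fraction toward the group average."""
    ops = list(opinions)
    for _ in range(steps):
        avg = sum(ops) / len(ops)                 # "what they talk about"
        ops = [o + weight * (avg - o) for o in ops]  # persuasion step
    return ops

# Three agents with different opinions settle on a shared value.
final = settle([0.0, 1.0, 0.5])
# after settling, all agents hold (nearly) the same opinion
```

	The `weight` parameter plays the role of persuasiveness: a
	higher weight means agents shift more per exchange, so the
	network settles faster.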

Q: What social skills do computational agents need? (Durfee)

	A1. Share tasks with other agents, e.g., contracting, 
	bidding, and share results (cognitive artifacts)

	A2. Resolve conflicting views or preferences, and resource 
	contention (institutions)

		- be polite (conventions)
		- set schedules
		- play by the rules

	A3. Be rational (theory of mind)

		- be committed (?)

	A4. Be considerate (horizontal organization)

	A5. Be responsible (vertical organization)

	A6. Make friends (roles)
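
	Task sharing by contracting and bidding (A1) can be sketched as
	a minimal announce-bid-award round, loosely in the spirit of
	contract-net task allocation. The agent names and cost figures
	are hypothetical.

```python
def award_task(task: str, bids: dict) -> str:
    """A manager announces a task; each agent bids its estimated cost;
    the contract is awarded to the lowest bidder. The task string is
    just the announcement label in this sketch."""
    return min(bids, key=bids.get)

bids = {"agent-a": 7.0, "agent-b": 3.0, "agent-c": 5.0}
winner = award_task("survey-sector-4", bids)
# winner is "agent-b", the lowest-cost bidder
```

	Resolving resource contention (A2) would sit on top of rounds
	like this one, e.g., conventions about who may bid and when.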

Q: Can CRUM handle social environments?

	A1. Yes, by ignoring them.  Agents can just be selfish.

	A2. Yes, by providing the appropriate schemata, 
	conventions, etc.

	A3. Yes, by offloading representations into the social world

	A4. No.  Knowledge is a social construct - CRUM is a cruel 
	joke (Latour & Woolgar)

Next week:

	- Dynamic systems (van Gelder)
	- Mathematical knowledge
