1. 4-4 searle’s chinese room
2. the thought experiment
   • show flash animation
3. why there is no “mind”
   • all the computer has is syntax, that is, all the computer has is a set of rules to follow (a minimal sketch of such rule-following appears after this slide)
   • the computer has no semantics; there is no meaning in any of the symbols (or groups of symbols) that the computer manipulates
   • since we do have meaning, and since the computer doesn’t, the computer cannot have a mind in the way that we do, and this is what we are interested in when we ask whether or not a computer can think
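   a minimal sketch of what “a set of rules to follow” can look like, assuming a toy python rule book (the phrases and rules below are invented for illustration; nothing in the program represents what any symbol means):

   # a hypothetical rule book pairing input symbol strings with output symbol strings;
   # the program only matches shapes, never meanings
   RULE_BOOK = {
       "你好吗？": "我很好，谢谢。",    # "how are you?" -> "i am fine, thanks."
       "你喜欢汉堡吗？": "喜欢。",      # "do you like hamburgers?" -> "yes."
   }

   def chinese_room(question: str) -> str:
       # follow the rules; fall back to a stock reply when no rule applies
       return RULE_BOOK.get(question, "请再说一遍。")  # "please say that again."

   print(chinese_room("你好吗？"))  # produces the paired reply without any understanding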
4. systems reply
   • most common reply
   • the man in the room does not understand chinese. but the man is only a part, a central processing unit (CPU), in a larger system. the larger system includes the memory (scratchpads) containing intermediate states and the instructions: the complete system that is required for answering the chinese questions. while the man running the program does not understand chinese, the system as a whole does.
   • there is a category error here. the part of the brain responsible for processing language doesn’t understand language either. intentional states don’t belong to areas of the brain; they belong to people. in the same way, to say that the man in the room does not understand chinese attributes the property of understanding to the wrong category of thing
5. robot reply
   • we know about objects because we stand in a certain causal relation to them. we know what a hamburger is because we have seen one, probably tasted one, and, in short, have all types of causal relations to it
   • if the chinese room were in a robot body with all sorts of sensory devices, then the robot would stand in the appropriate causal relations such that it would understand the language
6. brain simulator reply
   • consider a computer that operates in quite a different manner than the usual AI program with scripts and operations on strings of linguistic symbols. suppose instead the program simulates the actual sequence of nerve firings that occur in the brain of a native chinese speaker when that person understands chinese: every nerve, every firing. since the computer then works the very same way as the brain of a native chinese speaker, processing information in just the same way, it will understand chinese.
   • there is also the possibility that searle has targeted the wrong computational strategy. if our brains don’t manipulate discrete symbols, then we shouldn’t try to get a computer to think by doing so. if our brain is a connectionist system, a vector transformer, then it seems that the system is just so complex that our intuitions fail us when we attempt to move from the part to the whole. (a toy sketch of the vector-transformer idea follows this slide.)
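   to make the contrast with rule matching concrete, a minimal sketch of the “vector transformer” picture, assuming arbitrary layer sizes, random weights, and a tanh nonlinearity (all illustrative assumptions, not a model of any actual brain):

   import numpy as np

   rng = np.random.default_rng(0)
   W1 = rng.standard_normal((8, 4))   # hypothetical connection weights, input -> hidden
   W2 = rng.standard_normal((3, 8))   # hypothetical connection weights, hidden -> output

   def transform(x):
       # each layer of units fires as a nonlinear function of its weighted input
       hidden = np.tanh(W1 @ x)
       return np.tanh(W2 @ hidden)

   # an activation pattern goes in, another comes out; no discrete symbol
   # is looked up anywhere in the process
   print(transform(np.array([1.0, 0.0, -0.5, 0.25])))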
7. other minds reply
   • we rely on behavior to determine that regular chinese-speaking people understand chinese. as this is the accepted criterion, why should it not apply to mechanical devices as well as to those made of meat?
   • searle displays a certain kind of prejudice against machines and treats them unfairly
8. general issues
   • searle claims that “Syntax is not sufficient for semantics.”
   • but no one said it was. a computer running a program is not the same as a program sitting on a shelf, either on a disk or written on paper. a computer is a causal system that changes state in accord with a program. chalmers offers a parody of searle’s argument in which it is reasoned that recipes are syntactic, syntax is not sufficient for crumbliness, cakes are crumbly, so implementation of a recipe is not sufficient for making a cake.
   • searle relies heavily on intentionality, but intentionality is not well understood. how intentionality arises, and to what we should attribute it, is highly debatable.
