The wrong stuff: Chinese rooms and the nature of understanding

Philosophical Investigations 11 (October):279-99 (1988)
Abstract

Searle's Chinese Room argument is a general argument purporting to prove that machines cannot have mental states in virtue of their programming. I claim that the argument expresses powerful but mistaken intuitions about understanding and the first-person point of view. A distinction is drawn between a competence sense and a performance sense of 'understanding texts'. It is argued that the Chinese Room intuition looks for a special experience (performance) of comprehension, whereas artificial intelligence is attempting to explain the knowledge (competence) required to understand texts. Moreover, a dilemma is sketched for the argument: either Searle has not identified the appropriate subject of understanding, or he may understand after all. Finally, I question the underlying assumption that the general definition of mental states requires a projectable-by-us first-person point of view.
