The Chinese room revisited: artificial intelligence and the nature of mind

Dissertation, KU Leuven (2007)

Abstract

Charles Babbage began the quest to build an intelligent machine in the nineteenth century. Although he completed neither the Difference Engine nor the Analytical Engine, he was aware that using mental language to describe the functioning of such machines was figurative. Reversing this cautious stance, Alan Turing put forward two decisive ideas that helped give birth to Artificial Intelligence: the Turing machine and the Turing test. Nevertheless, a philosophical problem arises from regarding the simulation of intelligence, and the make-believe it invites, as sufficient to establish that programmed computers are intelligent and have mental states, especially given the nature of mind and its characteristic first-person viewpoint. The origin of Artificial Intelligence is undoubtedly linked to the accounts that inspired John Searle to coin the term strong AI, that is, the view that simply equates computers and minds. By emphasising the divergence between algorithmic processes and intentional mental states, the Chinese Room thought experiment shows that, since the mind is embodied and able to realise when linguistic understanding takes place, mental states require material implementation. This point directly conflicts with accounts that reduce the mind to the functioning of a programmed computer. The experience of linguistic understanding, with its typical quale, raises further important philosophical issues. Searle's theory of intentionality holds that intentional mental states have conditions of satisfaction and appear in semantic networks; thus people know when they understand and what terms are about. In contrast, a number of philosophers maintain that consciousness is only an illusion that plays no substantial biological role. Consciousness, however, is a built-in feature of the system. Moreover, neurological evidence suggests that conscious mental states, qualia and emotions enhance survival chances and form an important part of the phenomenal side of mental life and its causal underpinnings. This opens an important gap between simulating a mind and replicating the properties that make mental states and consciousness possible. On this score, the Turing test and the evidence it offers clearly overestimate simulation and verisimilar make-believe, since such evidence is insufficient to establish that programmed computers have mental lives. In summary, this dissertation criticises views which hold that programmed computers are minds and that minds are nothing but computers. The arguments in favour of this equation all fail to properly reduce the mind and its first-person viewpoint. Accordingly, the burden of proof still lies with the advocates of strong AI and with those willing to deny fundamental parts of the mind in order to make room for machine intelligence.

Links

PhilArchive




Similar books and articles

Reaping the whirlwind. [REVIEW] L. Hauser - 1993 - Minds and Machines 3 (2):219-237.
Chinese room argument. Larry Hauser - 2001 - Internet Encyclopedia of Philosophy.
Minds, machines and Searle. Stevan Harnad - 1989 - Journal of Experimental and Theoretical Artificial Intelligence 1 (4):5-25.
Searle's Chinese room argument. Larry Hauser - unknown - Field Guide to the Philosophy of Mind.
Alan Turing's Concept of Mind. Rajakishore Nath - 2020 - Journal of the Indian Council of Philosophical Research 37 (1):31-50.
Turing test: 50 years later. Ayse Pinar Saygin, Ilyas Cicekli & Varol Akman - 2000 - Minds and Machines 10 (4):463-518.

Analytics

Added to PP
2020-04-08

Downloads
53 (#294,453)

6 months
13 (#182,749)


Author's Profile

Rodrigo González
University of Chile

Citations of this work

No citations found.


References found in this work

No references found.
