===Consciousness===
Building on his views about intentionality, Searle presents an account of consciousness in his book ''The Rediscovery of the Mind'' (1992). He argues that much of modern philosophy, starting with [[behaviorism]] (an early but influential scientific view, succeeded by many later accounts that Searle also dismisses), has tried to deny the existence of consciousness, with little success. In ''Intentionality'', he parodies several alternative theories of consciousness by replacing their accounts of intentionality with comparable accounts of the hand:

:No one would think of saying, for example, "Having a hand is just being disposed to certain sorts of behavior such as grasping" (manual behaviorism), or "Hands can be defined entirely in terms of their causes and effects" (manual [[functionalism (philosophy of mind)|functionalism]]), or "For a system to have a hand is just for it to be in a certain computer state with the right sorts of inputs and outputs" (manual [[Turing machine]] functionalism), or "Saying that a system has hands is just adopting a certain stance toward it" (the [[intentional stance|manual stance]]) (p. 263).

Searle argues that philosophy has been trapped by a [[false dichotomy]]: on the one hand, the world consists of nothing but objective particles in fields of force; on the other hand, consciousness is clearly a subjective first-person experience. Searle says simply that both are true: consciousness is a real subjective experience, caused by the physical processes of the brain (a view he suggests might be called ''[[biological naturalism]]'').

====Ontological subjectivity====
Searle has argued<ref>Searle, J. R.: ''The Mystery of Consciousness'' (1997), pp. 95–131.</ref> that critics such as [[Daniel Dennett]],<ref>''[https://www.nybooks.com/articles/1982/06/24/the-myth-of-the-computer-an-exchange/ The Myth of the Computer: An Exchange]'' by Daniel C. Dennett, reply by John R. Searle, ''[[The New York Review of Books]]'', June 24, 1982 issue.</ref> who he claims insist that discussing subjectivity is unscientific because science presupposes objectivity, are making a [[category error]]. The goal of science is to establish and validate statements that are ''epistemically'' objective, i.e., whose truth can be discovered and evaluated by any interested party, but such statements need not be ''ontologically'' objective.

Searle calls any [[value judgment]] epistemically ''subjective''. Thus, "[[Mount McKinley|McKinley]] is prettier than [[Mount Everest|Everest]]" is epistemically subjective, whereas "McKinley is higher than Everest" is epistemically objective. The latter statement is evaluable, indeed falsifiable, by an understood ("background") criterion for mountain height, such as "the summit is so many meters above sea level"; no such criteria exist for prettiness.

Beyond this distinction, Searle thinks there are certain phenomena, including all conscious experiences, that are ''ontologically'' subjective, i.e., that can exist only as subjective experience. For example, a doctor's note that a patient suffers from back pain is an epistemically ''objective'' claim: it counts as a medical diagnosis only because the existence of back pain is "an objective fact of medical science".<ref>Searle, J. R.: ''The Mystery of Consciousness'' (1997), p. 122.</ref> The pain itself, however, is ''ontologically subjective'': it is experienced only by the person having it.
Searle goes on to affirm that "where consciousness is concerned, the existence of the appearance ''is'' the reality".<ref>Searle, J. R.: ''The Mystery of Consciousness'' (1997), p. 112.</ref> His view that the epistemic and ontological senses of objective/subjective are cleanly separable is crucial to his self-proclaimed [[biological naturalism]], because it allows epistemically objective judgments such as "That object is a pocket calculator" to pick out agent-relative features of objects; such features are, on his terms, ontologically subjective, unlike, say, "That object is made mostly of plastic".

====Artificial intelligence====
{{See also|Chinese room|philosophy of artificial intelligence}}
Biological naturalism implies that if humans want to create a conscious being, they will have to duplicate whatever physical processes the brain goes through to cause consciousness. Searle thereby means to contradict what he calls "[[Chinese room#Strong AI|Strong AI]]", defined by the assumption that "the appropriately programmed computer really is a mind, in the sense that computers given the right programs can be literally said to ''understand'' and have other cognitive states."<ref name=":2" />

In 1980, Searle presented the "[[Chinese room]]" argument, which purports to prove the falsity of strong AI.<ref name=":2">[http://www.bbsonline.org/Preprints/OldArchive/bbs.searle2.html "Minds, Brains and Programs"] {{webarchive|url=https://web.archive.org/web/20010221025515/http://www.bbsonline.org/Preprints/OldArchive/bbs.searle2.html |date=2001-02-21}}, ''The Behavioral and Brain Sciences'', 3, pp. 417–424 (1980).</ref> A person who knows no Chinese is in a room with two slits, a book of instructions, and some scratch paper. Someone outside the room slides Chinese characters in through the first slit; the person in the room follows the instructions in the book, transcribes characters onto the scratch paper as directed, and slides the resulting sheet out through the second slit. To people outside the room, it appears that the room speaks Chinese: they have slid Chinese statements into one slit and received valid Chinese responses from the other, yet the "room" does not understand a word of Chinese.
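The room's procedure amounts to rule-governed symbol matching, which a minimal sketch can make concrete (this illustration is not from Searle's paper; the tiny rule table, placeholder phrases, and function name are hypothetical):

<syntaxhighlight lang="python">
# A minimal sketch of the room's rulebook: purely syntactic lookup.
# Nothing in the program encodes what any symbol means; it only
# matches and copies uninterpreted strings. The rule table below is
# a hypothetical stand-in for Searle's book of instructions.

RULEBOOK = {
    "你好吗?": "我很好, 谢谢.",          # "How are you?" -> "Fine, thanks."
    "今天天气怎么样?": "今天天气很好.",  # "How is the weather?" -> "It is fine."
}

def room(message: str) -> str:
    """Return whatever reply the rulebook dictates for this input."""
    # A default reply ("Please say that again.") covers inputs the
    # table does not list, keeping the outward illusion intact.
    return RULEBOOK.get(message, "请再说一遍.")

if __name__ == "__main__":
    # To an observer outside the room the exchange looks competent,
    # yet only string matching has occurred.
    print(room("你好吗?"))
</syntaxhighlight>

On Searle's view, scaling the table up, or replacing it with any program however sophisticated, changes nothing essential: the manipulation remains syntactic.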
This suggests, according to Searle, that no computer can ever understand Chinese or English, because, as the [[thought experiment]] suggests, producing appropriate responses to Chinese input does not entail 'understanding' Chinese: all that the person in the thought experiment, and hence a computer, is able to do is execute certain syntactic manipulations.<ref>{{Cite web|url=http://globetrotter.berkeley.edu/people/Searle/searle-con4.html|title=Conversation with John Searle, p.4 of 6|website=globetrotter.berkeley.edu}}</ref><ref name="Roberts">{{cite journal |last1=Roberts |first1=Jacob |title=Thinking Machines: The Search for Artificial Intelligence |journal=Distillations |date=2016 |volume=2 |issue=2 |pages=14–23 |url=https://www.sciencehistory.org/distillations/magazine/thinking-machines-the-search-for-artificial-intelligence |access-date=March 22, 2018 |archive-url=https://web.archive.org/web/20180819152455/https://www.sciencehistory.org/distillations/magazine/thinking-machines-the-search-for-artificial-intelligence |archive-date=August 19, 2018 |url-status=dead}}</ref> [[Douglas Hofstadter]] and [[Daniel Dennett]], in their book ''[[The Mind's I]]'', criticize Searle's view of AI, particularly the Chinese room argument.<ref>Hofstadter, D. (1981), "Reflections on Searle", in Hofstadter, D.; Dennett, D. (eds.), ''The Mind's I'', New York: Basic Books, pp. 373–382.</ref> [[Stevan Harnad]] argues that Searle's "Strong AI" is really just another name for [[Functionalism (philosophy of mind)|functionalism]] and [[computationalism]], and that these positions are the real targets of his critique.<ref>[http://cogprints.org/4023/ Harnad, Stevan (2001)], "What's Wrong and Right About Searle's Chinese Room Argument", in Bishop, M.; Preston, J. (eds.), ''Essays on Searle's Chinese Room Argument'', Oxford University Press.</ref>

Functionalists argue that consciousness can be defined as a set of informational processes inside the brain. It follows that anything carrying out the same informational processes as a human would also be conscious. Thus, if humans wrote a conscious computer program, they could run it on, say, a system of ping-pong balls and beer cans, and that system would be equally conscious, because it ran the same informational processes.

Searle argues that this is impossible, contending that consciousness is a physical property, like digestion or fire. No matter how good a simulation of digestion is built on a computer, it will not digest anything; no matter how well it simulates fire, nothing will get burnt. By contrast, informational processes are ''observer-relative'': observers pick out certain patterns in the world and consider them information processes, but information processes are not things-in-the-world themselves. Since they do not exist at a physical level, Searle argues, they cannot have ''causal efficacy'' and thus cannot cause consciousness. There is no physical law, Searle insists, that recognizes the equivalence between a personal computer, a series of ping-pong balls and beer cans, and a pipe-and-water system, all implementing the same program.<ref>Searle 1980</ref>
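The functionalist premise that Searle targets, that one and the same informational process can be realized in any substrate, can likewise be sketched (hypothetical names and transition table; an illustration of multiple realizability, not of either side's full argument):

<syntaxhighlight lang="python">
# A hypothetical illustration of multiple realizability: one abstract
# transition table "run" on two unrelated substrates. Functionalists
# say both realize the same process; Searle replies that the sameness
# exists only relative to an observer who interprets the physics as
# implementing the table.

TRANSITIONS = {("idle", "coin"): "ready", ("ready", "push"): "idle"}

def run(substrate, state, inputs):
    """Step the abstract machine; the substrate label plays no role
    in the computation, which is precisely the functionalist point."""
    for symbol in inputs:
        state = TRANSITIONS.get((state, symbol), state)
        print(f"{substrate}: {symbol} -> {state}")
    return state

if __name__ == "__main__":
    run("silicon computer", "idle", ["coin", "push"])
    run("ping-pong balls and beer cans", "idle", ["coin", "push"])
</syntaxhighlight>

Both traces are identical at the level of the table; Searle's objection is that nothing in either physical system, independent of an interpreter, makes a state "idle" or an event a "coin".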