==== Brain simulator reply ====
Suppose that the program simulated in fine detail the action of every neuron in the brain of a Chinese speaker.<ref>{{Harvnb|Searle|1980|pp=7–8}}; {{Harvnb|Cole|2004|pp=12–13}}; {{Harvnb|Hauser|2006|pp=3–4}}; {{Harvnb|Churchland|Churchland|1990}}.</ref>{{efn|The brain simulation reply has been made by [[Paul Churchland]], [[Patricia Churchland]] and [[Ray Kurzweil]].{{sfn|Cole|2004|p=12}}}} This strengthens the intuition that there would be no significant difference between the operation of the program and the operation of a live human brain. Searle replies that such a simulation does not reproduce the important features of the brain—its causal and intentional states. He is adamant that "human mental phenomena [are] dependent on actual physical–chemical properties of actual human brains."{{sfn|Searle|1980|p=13}} Moreover, he argues:

{{blockquote|[I]magine that instead of a monolingual man in a room shuffling symbols we have the man operate an elaborate set of water pipes with valves connecting them. When the man receives the Chinese symbols, he looks up in the program, written in English, which valves he has to turn on and off. Each water connection corresponds to a synapse in the Chinese brain, and the whole system is rigged up so that after doing all the right firings, that is after turning on all the right faucets, the Chinese answers pop out at the output end of the series of pipes. Now where is the understanding in this system? It takes Chinese as input, it simulates the formal structure of the synapses of the Chinese brain, and it gives Chinese as output. But the man certainly doesn't understand Chinese, and neither do the water pipes, and if we are tempted to adopt what I think is the absurd view that somehow the conjunction of man and water pipes understands, remember that in principle the man can internalize the formal structure of the water pipes and do all the "neuron firings" in his imagination.{{sfn|Searle|1980|p={{Page needed|date=January 2019}}}}}}

===== China brain =====
What if we ask each citizen of China to simulate one neuron, using the telephone system to simulate the connections between [[axon]]s and [[dendrite]]s? In this version, it seems obvious that no individual would have any understanding of what the brain might be saying.<ref>{{Harvnb|Cole|2004|p=4}}; {{Harvnb|Hauser|2006|p=11}}.</ref>{{efn|Early versions of this argument were put forward in 1974 by [[Lawrence Davis (scientist)|Lawrence Davis]] and in 1978 by [[Ned Block]]. Block's version used walkie talkies and was called the "Chinese Gym". Paul and Patricia Churchland described this scenario as well.{{sfn|Churchland|Churchland|1990}}}} It is also obvious that this system would be functionally equivalent to a brain, so if consciousness is a function, this system would be conscious.

===== Brain replacement scenario =====
In this scenario, we are asked to imagine that engineers have invented a tiny computer that simulates the action of an individual neuron. What would happen if we replaced the neurons of a brain with these devices one at a time? Replacing one would clearly do nothing to change conscious awareness. Replacing all of them would create a digital computer that simulates a brain. If Searle is right, then conscious awareness must disappear during the procedure (either gradually or all at once). Searle's critics argue that there would be no point during the procedure when he can claim that conscious awareness ends and mindless simulation begins.<ref>{{Harvnb|Cole|2004|p=20}}; {{Harvnb|Moravec|1988}}; {{Harvnb|Kurzweil|2005|p=262}}; {{Harvnb|Crevier|1993|pp=271 and 279}}.</ref>{{efn|An early version of the brain replacement scenario was put forward by [[Clark Glymour]] in the mid-70s and was touched on by [[Zenon Pylyshyn]] in 1980. [[Hans Moravec]] presented a vivid version of it,{{sfn|Moravec|1988}} and it is now associated with [[Ray Kurzweil]]'s version of [[transhumanism]].}}{{efn|Searle does not consider the brain replacement scenario as an argument against the CRA; however, in another context, Searle examines several possible outcomes, including the possibility that "you find, to your total amazement, that you are indeed losing control of your external behavior. You find, for example, that when doctors test your vision, you hear them say 'We are holding up a red object in front of you; please tell us what you see.' You want to cry out 'I can't see anything. I'm going totally blind.' But you hear your voice saying in a way that is completely outside of your control, 'I see a red object in front of me.' [...] [Y]our conscious experience slowly shrinks to nothing, while your externally observable behavior remains the same."{{sfn|Searle|1992}}}} (See [[Ship of Theseus]] for a similar thought experiment.)