Imagine a native English speaker who knows no Chinese locked in a room full of boxes of Chinese symbols (a database) together with a book of instructions for manipulating the symbols (the program). Imagine that people outside the room send in other Chinese symbols which, unknown to the person in the room, are questions in Chinese (the input). And imagine that by following the instructions in the program the man in the room is able to pass out Chinese symbols which are correct answers to the questions (the output). The program enables the person in the room to pass the Turing Test for understanding Chinese but he does not understand a word of Chinese.

Now consider the room to be our brain, with the person replaced by a chain of neurons. The visual Chinese symbols are converted into a series of action potentials which are transmitted by a chain of interconnected neurons, and this gives rise to conscious understanding in our brain. But no individual neuron has any understanding of Chinese or any idea what these symbols mean; it just opens and closes ion channels in response to a neurotransmitter and passes the action potential on to the next neuron. The same thing happens even across the whole network of neurons. This is analogous to the person following a set of instructions.
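The rule book in the thought experiment is essentially a lookup table. A minimal sketch in Python, with an invented two-entry "rule book" purely for illustration (the strings are not meant as real Chinese conversation):

```python
# Toy "Chinese room": the operator matches the incoming symbols against a
# rule book and passes out the prescribed symbols. At no point does the
# procedure consult any model of meaning. The rules below are made up.

RULE_BOOK = {
    "你好吗": "我很好",          # if you see these squiggles, pass out these
    "你叫什么名字": "我叫房间",   # another arbitrary rule
}

def room(input_symbols: str) -> str:
    """Follow the instructions; emit a fixed default if no rule matches."""
    return RULE_BOOK.get(input_symbols, "对不起")

print(room("你好吗"))  # a "correct answer" produced with zero understanding
```

The point carries over: everything the operator does is mechanical symbol matching, which is why Searle denies that passing the test implies understanding.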
In the human mind, according to my belief anyway, there is a conceptualization of what "weather" is, and what "nice" and "awful" are. — god must be atheist
You don't just jump from "a single neuron" to "full human consciousness" like that. — Outlander
It seems to me that to solve this riddle, we need a concise definition of "understanding". Both the man and the neuron have no understanding of Chinese, yet the brain will understand Chinese; hence the room should too. — debd
The program enables the person in the room to pass the Turing Test for understanding Chinese but he does not understand a word of Chinese.

Then the Turing Test isn't very good at determining some system's understanding of some symbol-system.
Hence it appears to me that consciousness is a property of whole systems and not of their isolated parts; this has already been put forward as the systems reply to Searle. — debd
Am I reading this right..? Are you using the Chinese room argument to suggest that individual neurons aren't conscious? — Kenosha Kid

Am I reading this right..? Are you suggesting that we have billions of conscious entities within one brain? I wonder, which neuron in my brain is my consciousness?
How would the Chinese room deal with nonsense? How would it translate A Spaniard in the Works? — Banno

Is the problem that the sentence is actually nonsense, or that there are no instructions for translating such an arrangement of scribbles? What does it mean for some string of scribbles to be nonsense?
In A Nice Derangement of Epitaphs, Davidson argues that language is not algorithmic.
Searle is arguing much the same thing with the Chinese room. — Banno
I think that consciousness or understanding or perception at a particular point of time is the function of the structural and physiological state of the neuronal network at that point in time. — debd
Now consider the room to be our brain, with the person replaced by a chain of neurons. — debd
There are other variants of the thought experiment that are an even better fit for this, such as Ned Block's Chinese Nation thought experiment, where a large group of people performs a neural network computation simply by calling a list of phone numbers. The counterintuitive result here is that a functionalist would have to say that the entire system thinks, understands language, feels pain, etc. - whatever it is that it is functionally simulating - even though it is very hard to conceive of e.g. the Chinese nation as a single conscious entity.
But I think this people-as-computer-parts gimmick is a red herring. Of course a part of a system is not equivalent to the entire system - that was never in contention. A wheel spoke is not a bicycle either. The real contention here is whether something that is not a person - a computer, for example - can have a functional equivalent of consciousness.
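Block's setup can be sketched as a toy computation in which each "citizen" implements a single neuron and only ever relays numbers. The wiring and weights below are invented for illustration (a three-person network computing XOR); nothing here is from Block's own presentation:

```python
# Toy "Chinese Nation": each person implements one neuron. A person
# receives numbers by "phone", applies a fixed weighted-sum rule, and
# phones the result onward. No individual knows what is being computed.

def make_person(weights, bias):
    """One citizen: apply a fixed rule to incoming calls, nothing more."""
    def person(inputs):
        total = sum(w * x for w, x in zip(weights, inputs)) + bias
        return 1.0 if total > 0 else 0.0  # step activation
    return person

# Wire three people into a tiny network that computes XOR of two inputs.
p_hidden1 = make_person([1.0, 1.0], -0.5)   # fires if x OR y
p_hidden2 = make_person([1.0, 1.0], -1.5)   # fires if x AND y
p_output  = make_person([1.0, -2.0], -0.5)  # fires if OR but not AND

def nation(x, y):
    h1 = p_hidden1([x, y])      # first round of phone calls
    h2 = p_hidden2([x, y])      # second round of phone calls
    return p_output([h1, h2])   # final call: the "system's" answer

print([nation(x, y) for x, y in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# → [0.0, 1.0, 1.0, 0.0]
```

Each person's rule is trivial, yet the collective computes something none of them individually represents, which is exactly the intuition pump the functionalist has to bite the bullet on.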
Another issue is that the contents of a computer's mind (if it has one) are immune from discovery using scientific methods. The only access to knowledge of computer mental states would be through first-person computer accounts, the reliability of which would be impossible to verify. Whether machines are conscious will forever be a mystery. This suggests that consciousness is unlike all other physical properties. — RogueAI
There are other variants of the thought experiment that are an even better fit for this, such as Ned Block's Chinese Nation thought experiment, where a large group of people performs a neural network computation simply by calling a list of phone numbers. The counterintuitive result here is that a functionalist would have to say that the entire system thinks, understands language, feels pain, etc. - whatever it is that it is functionally simulating - even though it is very hard to conceive of e.g. the Chinese nation as a single conscious entity. — SophistiCat
The real contention here is whether something that is not a person - a computer, for example - can have a functional equivalent of consciousness. — SophistiCat
How is this issue different from not having a first-person experience of another person's consciousness? Unless your real issue is that it's a computer rather than a person - but that is the same issue that Chinese Room-type thought experiments try to capitalize on (confusingly, in my opinion). — SophistiCat
Thanks, this is what I was trying to articulate. — debd
The program enables the person in the room to pass the Turing Test for understanding Chinese but he does not understand a word of Chinese.

So are we just gonna ignore the fact that the person in the room passed by following the program's instructions, and not through an understanding of the Chinese language?
Now consider the room to be our brain, with the person replaced by a chain of neurons. — debd
I think that consciousness or understanding or perception at a particular point of time is the function of the structural and physiological state of the neuronal network at that point in time. — debd

What makes a neuronal network conscious but not a silicon network? Sounds like biological bias to me.
This suggests that consciousness is unlike all other physical properties. — RogueAI
That does not, however, change my point about the internal mental states of computers forever being a mystery. — RogueAI

Only thinking of mind and body in conflicting dualistic terms creates the problem in the first place.
I think Searle's thought experiment was rather a reaction to reductive takes on consciousness, particularly computational, functionalist ones: — SophistiCat
My response to the systems theory is quite simple: let the individual internalize all of these elements of the system. He memorizes the rules in the ledger and the data banks of Chinese symbols, and he does all the calculations in his head. The individual then incorporates the entire system. There isn't anything at all to the system that he does not encompass. We can even get rid of the room and suppose he works outdoors. All the same, he understands nothing of the Chinese, and a fortiori neither does the system, because there isn't anything in the system that isn't in him. If he doesn't understand, then there is no way the system could understand because the system is just a part of him.
Is the problem that the sentence is actually nonsense, or that there are no instructions for translating such an arrangement of scribbles? — Harry Hindu
So are we just gonna ignore the fact that the person in the room passed by following the program's instructions, and not through an understanding of the Chinese language? — Caldwell
This version has replies to critics.
My response to the systems theory is quite simple: let the individual internalize all of these elements of the system. He memorizes the rules in the ledger and the data banks of Chinese symbols, and he does all the calculations in his head. The individual then incorporates the entire system. There isn't anything at all to the system that he does not encompass. We can even get rid of the room and suppose he works outdoors. All the same, he understands nothing of the Chinese, and a fortiori neither does the system, because there isn't anything in the system that isn't in him. If he doesn't understand, then there is no way the system could understand because the system is just a part of him. — Banno