The problem I see with the Chinese Room and your above example is that if you buy into the computational theory of mind you can see how each respectively fits into the theory. Alternatively, if you think there's something missing, you see how each respectively demonstrates that position as well. The analogies seem only to illustrate confirmation biases in intuition rather than insight into what is really going on. — Soylent
So I'm still waiting for the evidence that shows that people are doing more than manipulating symbols and that computers aren't. — Michael
This cuts both ways, though: do humans/animals do anything more than produce a programmed/hard-wired output in the appropriate situations? — Soylent
Computers and robots have shown creativity and novelty within a specific domain. — Soylent
Marchesk, stop avoiding. You said that your claim that humans can understand but that computers can't isn't dogma. You said that you have evidence. Tell me what that evidence is. — Michael
It seems clear to me that you don't have any evidence that humans can feel but that computers can't. It seems clear to me that this is just a dogmatic assertion. I'm not sure why you're so unwilling to admit this. — Michael
And being sexual means what? Feeling sexual arousal? You're just begging the question. And needing to reproduce means what? Having the desire to reproduce? You're just begging the question. — Michael
They could. — Michael
And what evidence shows that only animals experience sexual arousal? — Michael
How do you know what they can't? — Michael
Then as I keep asking, what evidence shows that humans can genuinely feel emotions but that computers/robots can't? Clearly it can't be empirical evidence because you're saying that outward behaviour can be "fake". So you have non-empirical evidence? — Michael
Perhaps the input to which "grief" is the output? And if we go with something like the James-Lange theory then the input is physiological arousal. — Michael
So the presence of emotions is determined by public behaviour. Then if a robot behaves the same way a person does, e.g. saying "I'm sorry for your loss" when you tell them that your father has died, accompanied with the appropriate facial expressions and body language, then the robot has demonstrated its capacity for emotions. — Michael
Second, simply because we do not relate to a computer as well as we do to other humans doesn't mean a computer doesn't feel. The recent movie Ex Machina explores this. — darthbarracuda
Which means what? — Michael
And what evidence shows that humans can provide meanings to the symbols but computers can't? — Michael
The correct question is "what's the difference between a computer taking in, manipulating, and outputting symbols and a human taking in, manipulating, and outputting symbols?" It's the one I've asked you, and it's the one I'm still waiting for an answer to. — Michael
Correct, abstract people (and fictional people in stories) don't actually grieve. — Michael
You're just reasserting the claim that humans can understand and computers can't. I want to know what evidence supports this claim. — Michael
You give it something, it does something with it, and then it gives you something back. — Michael
Do you actually have anything meaningful to say about the difference between humans and computers? — Michael
Use gives meaning to symbols. — Michael
Yes. Just as a human is made different from an abstraction by being made of matter. — Michael
You're shifting the terms of understanding. If understanding is granted to the system for the accurate manipulation of the symbols, then human understanding is likewise granted for accurate manipulation of the symbols. It's not enough to have the symbols; one has to have the rules to manipulate the symbols. — Soylent
Searle, and perhaps you, seems to want to isolate the understanding of the Chinese Room participant from the entire system, which includes the set(s) of rules. Martha doesn't need to know the meaning of the output, because the meaning is supplied by the entire system and not a single part of it. — Soylent
Nobody is saying that abstract things can have real emotions. — Michael
If we're just going to accept that the humans experience emotions then why not just accept that the Turing machine does? — Michael
Sure. And you asked how it's come to mean this thing. I pointed out that we're provided with some input (of which there may be many that resemble one another in some empirical way), e.g. "•" or "••", and are told what to output, e.g. "1" or "2". — Michael
What evidence shows that humans can form emotional bonds and grieve but that computers can't? You can't use science because science can only ever use observable behaviour as evidence, and the premise of the thought experiment is that the computer has the same observable behaviour as a human. — Michael
Then what does reference mean? — Michael
If it's not dogma then there's evidence. What evidence shows that the computer who says "I'm sorry" doesn't understand and that the human who says "I'm sorry" does? — Michael
When the input is "•" the output is "1". When the input is "••" the output is "2", etc. — Michael
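As a minimal sketch of the rule being described here (my own illustration, not anything from the thread, assuming nothing beyond a lookup table): the program produces the prescribed output for each input purely by rule, with no meaning attached to either symbol.

```python
# Hypothetical rule table: each input symbol is mapped to an output symbol.
# The program follows the rules; it attaches no meaning to "•" or "1".
rules = {
    "•": "1",
    "••": "2",
    "•••": "3",
}

def manipulate(symbol: str) -> str:
    """Return the output symbol dictated by the rule table."""
    return rules.get(symbol, "?")  # "?" for inputs the rules don't cover

print(manipulate("•"))   # -> "1"
print(manipulate("••"))  # -> "2"
```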
Perhaps that we dogmatically believe that people understand but computers don't? — Michael
I might or I might not. — Michael
The problem with Searle's argument is that if a human were put under the same conditions as a computer then the human wouldn't understand (in the same way as a human in a traditional situation). — Michael
But a human is still conscious. So the fact that a computer wouldn't understand (in the same way we would) under those same conditions doesn't show that it isn't conscious. He needs to put the computer under the same conditions that a human would be under to understand the sentences. — Michael
The machine detects water falling from the clouds and so outputs "it is raining". This would be a proper way to consider computer understanding. — Michael
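As an illustration of the contrast being drawn (again a sketch of my own, assuming a simple hypothetical boolean detector): the output is now conditioned on a reading of the world rather than on another symbol.

```python
# Hypothetical detector-driven output: the sentence is tied to what the
# machine registers about the world, not to a symbol-to-symbol rule.
def report_weather(water_detected: bool) -> str:
    """Output a sentence grounded in what the detector registers."""
    return "it is raining" if water_detected else "it is not raining"

print(report_weather(True))   # -> "it is raining"
print(report_weather(False))  # -> "it is not raining"
```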
