Can a 'thinking machine', according to this definition(?), 'understand'? I suspect, if so, it can only understand to the degree it can recursively map itself — 180 Proof
When we turn to understanding, by contrast, some have claimed that a new suite of cognitive abilities comes onto the scene, abilities that we did not find in ordinary cases of propositional knowledge. In particular, some philosophers claim that the kind of mental action verbs that naturally come to the fore when we think about understanding—“grasping” and “seeing”, for example—evoke mental abilities “beyond belief”, i.e., beyond simple assent or taking-to-be-true (for an overview, see Baumberger, Beisbart, & Brun 2017). — SEP on Psychology of Understanding
4.4 The Other Minds Reply
Related to the preceding is The Other Minds Reply: “How do you know that other people understand Chinese or anything else? Only by their behavior. Now the computer can pass the behavioral tests as well as they can (in principle), so if you are going to attribute cognition to other people you must in principle also attribute it to computers.”
Searle’s (1980) reply to this is very short:
The problem in this discussion is not about how I know that other people have cognitive states, but rather what it is that I am attributing to them when I attribute cognitive states to them. The thrust of the argument is that it couldn’t be just computational processes and their output because the computational processes and their output can exist without the cognitive state. It is no answer to this argument to feign anesthesia. In ‘cognitive sciences’ one presupposes the reality and knowability of the mental in the same way that in physical sciences one has to presuppose the reality and knowability of physical objects.
Critics hold that if the evidence we have that humans understand is the same as the evidence we might have that a visiting extra-terrestrial alien understands, which is the same as the evidence that a robot understands, the presuppositions we may make in the case of our own species are not relevant, for presuppositions are sometimes false. For similar reasons, Turing, in proposing the Turing Test, is specifically worried about our presuppositions and chauvinism. If the reasons for the presuppositions regarding humans are pragmatic, in that they enable us to predict the behavior of humans and to interact effectively with them, perhaps the presupposition could apply equally to computers (similar considerations are pressed by Dennett, in his discussions of what he calls the Intentional Stance).
I think it is easy enough to say that understanding cannot be discrete, i.e. that a system that can only do one thing (or a variety of things) well lacks agency for this purpose. However, at some point, a thing can do enough things well that it feels a bit like bad faith to say that it isn't an agent because you understand how it was constructed and how it behaves (indeed, if determinism obtains, the same could be said of people). — Ennui Elucidator
There's no significant difference. — Daemon

There absolutely is a significant difference. How are you going to teach anything, artificial or biological, what a banana is if all you give it are squiggly scratches on paper? It doesn't matter how many times your CAT tool translates "banana", it will never encounter a banana. The robot at least could encounter a banana.
A robot does not "encounter" things any more than a PC does. ... — Daemon

The question isn't about experiencing; it's about understanding. If I ask a person, "Can you go to the store and pick me up some bananas?", I am not by asking the question asking the person to experience anything. I am not asking them to be consciously aware of a car, to have percepts of bananas, to feel the edges of their wallet when they fish for it, etc. I am asking for certain implied things... it's a request, it's deniable, they should purchase the bananas, and they should actually deliver them to me. That they experience things is nice and all, but all I'm asking for is some bananas.
When we encounter something, we experience it, we see it, feel it, hear it. — Daemon

I disagree with the premise, "'When humans do X, it involves Y' implies X involves Y". What you're asking me to believe is in my mind the equivalent of the claim that asking "Can you go to the store and pick me up some bananas?" is asking someone to experience something; or phrased slightly more precisely, that my expectations that they understand this equate to my expectations that they (consciously?) experience things. And I don't think that's true. I think I'm just asking for some bananas.
The question isn't about experiencing; it's about understanding. — InPitzotl
If I ask a person, "Can you go to the store and pick me up some bananas?", I am not by asking the question asking the person to experience anything. — InPitzotl
IOW, your point was that robots aren't doing something that humans do, but that's kind of backwards from the point being made that you're replying to. It's not required here that robots are doing what humans do to call this significant; it suffices to say that humans can't understand without doing something that robots do that your CAT tool doesn't do. — InPitzotl
As I emphasised in the OP, experience is the crucial element the computer lacks; that's the reason it can't understand. — Daemon

Nonsense. There are people who have this "crucial element", and yet have no clue what that question means. If experience is "the crucial" element, what is it those people lack?
The same applies to robots. — Daemon
But in order to understand your question, the person must have experienced stores, picking things up, bananas and a multitude of other things. — Daemon

Your CAT tool would be incapable of bringing me bananas if we just affixed wheels and a camera to it. By contrast, a robot might pull it off. The robot would have to do more than just translate words and look up definitions like your CAT tool does to pull it off... getting the bananas is a little bit more involved than translating questions into Dutch.
Neither my CAT tool nor a robot does what I do, which is to understand through experience. — Daemon

Neither your CAT tool nor a person who doesn't understand the question can do what a robot that brings me bananas and a person who brings me bananas do, which is to bring me bananas.
I'm not arguing that robots experience things here. I'm arguing that it's a superfluous requirement. But even if you do add this superfluous requirement, it's certainly not the critical element. To explain what brings me the bananas when I ask for them, you have to explain how those words make something bring me bananas. You can glue experience to the problem if you want to, but experience doesn't bring me the bananas that I asked for. — InPitzotl
We're not trying to explain how you get bananas; we're trying to explain understanding. — Daemon
"Can you go to the store and pick me up some bananas?" — InPitzotl
My suggestion is that understanding something means relating it correctly to the world, which you know and can know only through experience. I don't mean that you need to have experienced a particular thing before you can understand it, but you do need to know how it fits into the world which you have experienced. — Daemon

A correct understanding of the question consists in relating it to a request for bananas. How this fits into the world is how one goes about going to the store, purchasing bananas, coming to me and delivering bananas. You've added experiencing in there. You seem too busy comparing a CAT tool's failure to understand with an English speaker's understanding to relate understanding to the real test of it: the difference between a non-English speaker just looking at me funny and an English speaker bringing me bananas.
You can redefine "understanding" in such a way that it is something a robot or a computer can do, but the "understanding" I am talking about is still there. — Daemon

The concept of understanding you talked about in this thread doesn't even apply to humans. If "the reason" the robot doesn't understand is that the robot doesn't experience, then the non-English speaker who looked at me funny understood the question. Certainly that's broken.