Comments

  • Martha the Symbol Transformer
    The problem I see with the Chinese Room and your above example is that if you buy into the computational theory of mind you can see how each respectively fits into the theory. Alternatively, if you think there's something missing, you see how each respectively demonstrates that position as well. The analogies seem only to illustrate confirmation biases in intuition rather than insight into what is really going on.Soylent

    Fair enough. I think I've made the mistake of accepting Searle's setup. If I don't buy into the computational theory of mind, why would I expect the Chinese Room to work? Why would I expect a symbol manipulating system to pass the Turing Test (in a strong way)?
  • Martha the Symbol Transformer
    So I'm still waiting for the evidence that shows that people are doing more than manipulating symbols and that computers aren't.Michael

    Before language, there were animals who experienced and felt. That's what's fundamental. Language is late in the game. Symbols are parasitic.

    You ask how I know that a computer can't feel. That's missing the point. Symbols can't feel. To the extent that a computer only manipulates symbols, it isn't feeling or knowing anything, because there is no knowledge or feeling in symbols themselves, only what they stand in for.
  • Martha the Symbol Transformer
    This cuts both ways though, do humans/animals do something more than produce a programmed/hard-wired output in the proper situations?Soylent

    Yes, since they don't always produce the same output. Animals, and particularly humans, display a great deal of flexibility and variability. There is also the question of what determines the proper situation. What is proper in a given situation? Often, human culture defines that.

    An example from the wild is a nest where a video camera was set up and streamed online. The mother bird, for unknown reasons, started attacking her chicks and failed to feed them properly. That doesn't make much sense from an evolutionary point of view, but life is messy.

    Computers and robots have shown creativity and novelty within a specific domain.Soylent

    True.
  • Martha the Symbol Transformer
    So I'm still waiting for the evidence that shows that people are doing more than manipulating symbols and that computers aren't.Michael

    The reason is that symbol manipulation alone undermines itself. In order for there to be symbols to compute, the symbols have to be defined. Chinese symbols without Chinese speakers aren't actually symbols. They're random markings.

    The word or emoticon for grief isn't a word or emoticon if there is no grief. It either means something else, or nothing at all. You have to have the grief first before there can be a symbol invented to represent it.

    The argument here is that symbols can't be primary or fundamental. They are derived, invented, created to aid in communication or thinking.
  • Martha the Symbol Transformer
    Marchesk, stop avoiding. You said that your claim that humans can understand but that computers can't isn't dogma. You said that you have evidence. Tell me what that evidence is.Michael

    Actually, my contention was that symbol manipulation alone doesn't result in understanding. If a computer can be arranged to do more than symbol manipulation, then I'm not claiming it can't understand, because I don't know at that point.

    Searle's contention was that computers only manipulate symbols, however sophisticated.
  • Martha the Symbol Transformer
    It seems clear to me that you don't have any evidence that humans can feel but that computers can't. It seems clear to me that this is just a dogmatic assertion. I'm not sure why you're so unwilling to admit this.Michael

    First off, you agree that there is something more to feeling than producing a symbolic representation of feeling in the proper context, correct?
  • Martha the Symbol Transformer
    And being sexual means what? Feeling sexual arousal? You're just begging the question. And needing to reproduce means what? Having the desire to reproduce? You're just begging the question.Michael

    You can't be serious.
  • Martha the Symbol Transformer
    They could.Michael

    Are they, though? Have computers formed a linguistic community? Have they told us what the symbols of that community mean (or how they are used, to use your definition of meaning)?
  • Martha the Symbol Transformer
    And what evidence shows that only animals experience sexual arousal?Michael

    I don't know. I guess hurricanes might be aroused when they hit the shore of a major city.

    It probably has to do with animals being sexual, and needing to reproduce.
  • Martha the Symbol Transformer
    The linguistic community.Michael

    And computers form a linguistic community?

    Maybe. What evidence allows us to justify an answer either way?Michael

    Something about machines not being animals, probably.
  • Martha the Symbol Transformer
    How do you know what they can't?Michael

    I don't know. How do you know a rock can't be sexually stimulated?
  • Martha the Symbol Transformer
    Then as I keep asking, what evidence shows that humans can genuinely feel emotions but that computers/robots can't? Clearly it can't be empirical evidence because you're saying that outward behaviour can be "fake". So you have non-empirical evidence?Michael

    This is like asking how I know that computers/robots can't be sexually stimulated just because arousal can be faked.
  • Martha the Symbol Transformer
    Perhaps the input to which "grief" is the output? And if we go with something like the James-Lange theory then the input is physiological arousal.Michael

    Two questions here:

    1. Who or what determines what the proper output is?

    2. Do computers have physiological arousal?
  • Martha the Symbol Transformer
    So the presence of emotions is determined by public behaviour. Then if a robot behaves the same way a person does, e.g. saying "I'm sorry for your loss" when you tell them that your father has died, accompanied with the appropriate facial expressions and body language, then the robot has demonstrated his capacity for emotions.Michael

    No, it's not. A person can fake emotions, after all. I might be convinced that you're sorry (or the robot is), but maybe it's just mimicry. Maybe you don't actually feel sorry. Maybe you didn't like the person who died, or me, or just aren't close to the situation. Maybe you just aren't feeling empathetic but want to maintain a polite appearance.
  • Martha the Symbol Transformer
    Second, simply because we do not relate to a computer as well as we do to other humans doesn't mean a computer doesn't feel. The recent movie Ex Machina explores this.darthbarracuda

    Interesting that you mentioned that movie, since the machine in the movie manipulated the feelings of the protagonist in order to accomplish some other goal. The protagonist felt empathy for the machine and wanted to help it, not realizing that he was being fooled.

    There is another movie along these lines, where a journalist with a background in robotics is invited to do a piece on a successful roboticist. It turns out this person has managed to create a very human-like android, and he wants the journalist to examine it. She ends up falling in love with the roboticist, and a bit disturbed by the android, because it's awkward in conversation and begins to exhibit signs of jealousy and sexual interest.

    Turns out, the roboticist is actually the android, and the android is the roboticist. She's been fooled to see whether the mimicry could be carried out convincingly, which it has been, since she's fallen in love with a machine programmed to mimic a self-confident genius. The real person is a less convincing, awkward, but brilliant human.
  • Martha the Symbol Transformer
    Grief isn't a symbol, it's an experience. It can be communicated with symbols, but the symbols aren't grieving. As such, outputting grieving symbols in the right situation is not at all the same as experiencing grief.
  • Martha the Symbol Transformer
    Which means what?Michael

    We use symbols to communicate meaning.

    And what evidence shows that humans can provide meanings to the symbols but computers can't?Michael

    Searle's argument, as I understand it, is that computers (or any system) are unable to do this if all they're doing is manipulating symbols. Humans are doing something in addition when we produce symbols. The fundamental reason is that symbols aren't meaningful in themselves; rather, they connote meaning. They're symbols for a reason.
  • Martha the Symbol Transformer
    The correct question is "what's the difference between a computer taking in, manipulating, and outputting symbols and a human taking in, manipulating, and outputting symbols?" It's the one I've asked you, and it's the one I'm still waiting an answer for.Michael

    Humans provide meanings to the symbols in the first place, which is what you're ignoring.
  • Martha the Symbol Transformer
    Correct, abstract (and fictional people in stories) people don't actually grieve.Michael

    But here's the thing. The computer is taking in symbols, manipulating those symbols, and outputting symbols, correct? So what's the difference between that and a human writing out the algorithm for computing grief?

    A human could take the symbols for a funeral, write down the computations a Turing Machine would make, and output the symbols for grief, or whatever. In theory. Maybe a billion Chinese could do it. Would that system grieve?
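    The kind of rule-following at issue can be made concrete with a toy sketch (every symbol and rule here is invented for illustration, not taken from Searle): a lookup table that maps input symbols for a situation to output symbols for a response, with nothing in it that could count as acquaintance with what the symbols stand for.

    ```python
    # A toy rule-following "room": the rules map input symbols to output
    # symbols with no reference to what any symbol stands for.
    # All rule entries here are invented for illustration.

    RULES = {
        ("funeral", "condolence-request"): "I'm sorry for your loss.",
        ("weather", "temperature-request"): "It is cold outside.",
    }

    def transform(context, prompt):
        """Look up the output string for a pair of input symbols.

        The function never consults anything beyond the rule table, so
        whatever meaning the outputs have is supplied by us, not by it.
        """
        return RULES.get((context, prompt), "###")  # "###" = no rule matches

    print(transform("funeral", "condolence-request"))  # "I'm sorry for your loss."
    ```

    Whether executed by a CPU, a person with pencil and paper, or a billion people passing notes, the same table yields the same outputs, which is the force of the question above.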
  • Martha the Symbol Transformer
    You're just reasserting the claim that humans can understand and computers can't. I want to know what evidence supports this claim.Michael

    Computers are instantiations of Turing machines (limited by physics), correct? You agreed that an abstract Turing machine can't compute grief. What makes an instantiated Turing machine different?

    You might retort that abstract machines don't compute, but that's not quite right, because we can write out the algorithm for whatever computation, if we wanted to take the time and effort (within the limitations of our resources).

    So if there exists an algorithm for grief, why wouldn't the algorithm itself feel grief, or a written-out version of a Turing machine computing that algorithm? Is there something that an instantiated computer does with symbols that an abstraction doesn't?

    Is it the electricity flow through the gates? Does electricity give meaning to symbols?
  • Martha the Symbol Transformer
    You give it something, it does something with it, and then it gives you something back.Michael

    And what makes that meaningful?

    Do you actually have anything meaningful to say about the difference between humans and computers?Michael

    Humans give meaning to symbols, not the other way around. What a computer computes is only meaningful to the degree it's meaningful to us. We built them, after all, to compute things for us.

    1 + 1 = 2 is only meaningful to the extent that we give the symbols meaning. Otherwise, it means nothing.
  • Martha the Symbol Transformer
    What does it mean for matter to be using symbols? What is it about a computer which results in use of symbols such that there is meaning?
  • Martha the Symbol Transformer
    Use gives meaning to symbols.Michael

    And an abstract Turing Machine can't be said to be using symbols, even if we wrote out the entire computation for being in grief, but a computer can, because it has electricity flowing through it between different parts?
  • Martha the Symbol Transformer
    Yes. Just as a human is made different from an abstraction by being made of matter.Michael

    So it is matter that gives meaning to symbols?
  • Martha the Symbol Transformer
    You're shifting the terms of understanding. If understanding is granted to the system for the accurate manipulation of the symbols, then human understanding is likewise granted for accurate manipulation of the symbols. It's not enough to have the symbols, one has to have the rules to manipulate the symbols.Soylent

    Right, and I'll accept that this is one notion of understanding, being that words can have multiple meanings. Siri knows how to tell me what the temperature outside is. "She" understands how to compute that result.

    Searle, and perhaps you, seems to want to isolate the understanding of the Chinese Room participant from the entire system, which includes the set(s) of rules. Martha doesn't need to know the meaning of the output, because the meaning is supplied by the entire system and not a single part of it.Soylent

    But Searle's point is that it doesn't matter, because it's still just a form of symbol manipulation. He thinks we do something fundamentally different than following rules to manipulate symbols when we speak English or Chinese, although of course we are capable of computing symbols, albeit not usually as well as a computer.
  • Martha the Symbol Transformer
    Nobody is saying that abstract things can have real emotions.Michael

    Right, so what makes a computer different than an abstraction, like a Turing Machine (of which a computer is a finite realization)? Is it that the computer is made of matter instead of symbols?
  • Martha the Symbol Transformer
    I already told you that 1 means any individual thing in the context of counting or sets. I'm sure someone else can provide a better mathematical definition. This can run kind of deep, because we could start debating the exact meaning behind the symbol, which might lead to a universals vs. nominalism debate.
  • Martha the Symbol Transformer
    How so? You just asked why a bunch of symbols can't have emotions if humans have emotions. I just told you that symbols are stand ins for something else, in this case emotion. A happy face isn't happy. It means happy.
  • Martha the Symbol Transformer
    If we're just going to accept that the humans experience emotions then why not just accept that the Turing machine does?Michael

    Because symbols are abstractions from experience. They stand in for something else. An emoticon isn't happy or sad or mad. It just means that to us, because we can be happy, or mad or sad.
  • Martha the Symbol Transformer
    Sure. And you asked how it's come to mean this thing. I pointed out that we're provided with some input (of which there may be many that resemble one another in some empirical way), e.g. "•" or "••", and are told what to output, e.g. "1" or "2".Michael

    Again, that's not what "1" or "+" or "2" means, at all.
  • Martha the Symbol Transformer
    What evidence shows that humans can form emotional bonds and grieve but that computers can't? You can't use science because science can only ever use observable behaviour as evidence, and the premise of the thought experiment is that the computer has the same observable behaviour as a human.Michael

    Okay, let's set aside empirical matters and just accept that humans do experience emotion. What about Turing machines? Can a Turing machine, in just its abstract form, experience grief? Does that make any sense?

    What I mean is, say some brilliant mathematician/programmer defined the algorithm that some theoretical computer could use to compute being in grief, and wrote it down. Would that algorithm then experience grief? Let's say they pay someone to illustrate a Turing machine manipulating the symbols needed to compute the algorithm. Whole forests are cut down to print this thing out, but there it is. Are the symbols sad?
  • Martha the Symbol Transformer
    Then what does reference mean?Michael

    The mathematical symbol "1" means any item or unit ever, in the context of counting or sets. You can use it to denote any one thing.

    If I made up some word, say "bluxargy", and then defined it with some other made-up words, what does it reference? It references nothing, so reference can't be symbol manipulation.
  • Martha the Symbol Transformer
    If it's not dogma then there's evidence. What evidence shows that the computer who says "I'm sorry" doesn't understand and that the human who says "I'm sorry" does?Michael

    Humans form emotional bonds and machines don't. Do you need some scientific literature to back this up? Humans also grieve when those bonds are broken and machines don't.
  • Martha the Symbol Transformer
    When the input is "•" the output is "1". When the input is "••" the output is "2", etc.Michael

    That's not what reference means, at all.
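    To see why, it helps to write Michael's rule out as an actual program (a sketch, with the mapping chosen to match his example): it reproduces the "•" → "1", "••" → "2" pattern perfectly, yet nothing in it does anything beyond measuring the length of a string.

    ```python
    # Michael's rule, written out: map a string of dots to a numeral symbol.
    # The program matches the input/output pattern exactly, but it never
    # refers to quantities; it only counts characters in a string.

    def dots_to_numeral(dots):
        if not dots or set(dots) != {"•"}:
            raise ValueError("input must be a non-empty string of '•'")
        return str(len(dots))

    print(dots_to_numeral("•"))   # "1"
    print(dots_to_numeral("••"))  # "2"
    ```

    The input/output mapping is all the program has; that "1" stands for any single thing is nowhere in the code.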
  • Martha the Symbol Transformer
    No, not dogma. It's really absurd to maintain otherwise, unless you're invoking some altered version of the other minds problem in which I'm the only one not doing symbol manipulation.

    But anyway, I'll try another approach, which isn't about consciousness. Once we humans have an understanding of 1 + 1 (to use a trivial example), we can universalize it to any domain. A computer can't do that. It has to be programmed, for each different scenario, with how to apply 1 + 1 to achieve whatever result.

    Sure, the computer always knows how to compute 2, but it doesn't know how to apply addition in various real world situations without being programmed to do so.
  • Martha the Symbol Transformer
    Perhaps that we dogmatically believe that people understand but computers don't?Michael

    We understand that people are doing something more than manipulating symbols. When I say that I understand your loss, you take it to mean I can relate to having lost someone, not that I can produce those symbols in the right situation, in which case I'm just formally being polite. If a machine says it, it's understood that someone programmed a machine to say it in those circumstances, which might come off as incredibly cold and insensitive, or downright creepy (if it hits the uncanny valley). What we don't do is think that the machine feels our pain or empathizes.

    It's the same with Siri telling me it's cold outside. It's cute and all, but nobody takes it seriously.
  • Martha the Symbol Transformer
    I might or I might not.Michael

    How would you not? Are you supposing that I have some condition where I can't experience pain or fatigue? (I'm not aware of any humans immune to fatigue.)
  • Martha the Symbol Transformer
    The problem with Searle's argument is that if a human was put under the same conditions as a computer then the human wouldn't understand (in the same way as a human in a traditional situation).Michael

    Right, because he was attacking symbol manipulation as a form of understanding.

    But a human is still conscious. So the fact that a computer wouldn't understand (in the same way we would) under those same conditions doesn't mean that it is not conscious. He needs to put the computer under the same conditions that a human would be under to understand the sentences.Michael

    Okay, so there's consciousness-based understanding, where the words "I'm sorry for your loss" don't imply understanding unless the symbol producer has experienced loss or can empathize with losing someone.

    And then there's the question of intentionality. How do symbols refer? How is it that 1 stands for any single individual item?
  • Martha the Symbol Transformer
    Perhaps I have yet to lose someone close and therefore am just being polite.

    So let's say you stub your toe. I say that looks painful. Are you going to doubt that I understand what being in pain is? Or if you tell me about a strange dream. Do you doubt that I will understand having a strange dream?
  • Martha the Symbol Transformer
    The machine detects water falling from the clouds and so outputs "it is raining". This would be a proper way to consider computer understanding.Michael

    The computer understands it in a propositional sense. Let's make this more complex. Let's say the computer has been programmed to read faces and emotion at a funeral. It then tells a grieving person that it's very sorry for their loss.

    Does the computer understand what it means to lose someone?