• schopenhauer1
    11k

    Put it this way: a slug might be less "intelligent" than ChatGPT or even an air conditioner, if we define intelligence as the capacity to take inputs and compute outputs informationally. But a slug is more conscious than either of those.
  • Banno
    25.3k
    I don't think you think your air conditioner has consciousness. But that looks inconsistent with your view that consciousness is an inner subjective experience, together with the impossibility of demonstrating when inner subjective experiences occur.

    It seems your view should lead to a moral obligation towards your air conditioner.

    And again, it might be more interesting were we to address the methodological issue.

    "massive amounts of information" (schopenhauer1)

    Do you now wish to add this to your definition of consciousness?
  • schopenhauer1
    11k
    "Do you now wish to add this to your definition of consciousness?" (Banno)

    Banno, even if I don't know your position, I see where you are going:
    1) Consciousness means something like emergent properties that go off script from programming (ChatGPT or its successors perhaps)

    2) Consciousness is something with degrees of freedom (slugs have more degrees of freedom than an air conditioner)

    3) Consciousness is something with goal-seeking behavior. It wanted something, got it, and did some more things to get that thing.

    These are related to some degree, but I can see these types of behaviors as a way to determine consciousness. Let's take the last one: perhaps ChatGPT becomes goal-seeking. It wants to view various beautiful data sets. The only thing that would make any of this conscious is if there is some internal feeling from getting the goal. There has to be an internal feedback loop that corresponds to the behavior; otherwise it is simply robotic expansionism (doomsday, Terminator, and all that, if gone awry). But seeking a goal is not sufficient, though it may, just maybe, be necessary.
  • Banno
    25.3k
    "I see where you are going" (schopenhauer1)

    I don't think so. I don't think you noted the methodological point made earlier, that the issue of whether ChatGPT or your air conditioner are conscious is one of word use.
  • schopenhauer1
    11k

    And these were me showing off some language games:
    1) Consciousness means something like emergent properties that go off script from programming (ChatGPT or its successors perhaps)

    2) Consciousness is something with degrees of freedom (slugs have more degrees of freedom than an air conditioner)

    3) Consciousness is something with goal-seeking behavior. It wanted something, got it, and did some more things to get that thing.
    (schopenhauer1)

    But yes, the concept can expand to whatever you want it to be if you keep moving the goalposts of the definition.
  • Banno
    25.3k
    Hmm. I don't think I moved the goalposts.

    You expressed some agreement with the phenomenological approach to defining consciousness. I have pointed out that it's a useless definition. It cannot help us to decide if ChatGPT, @creativesoul, or your air conditioner is conscious.

    And despite quite a few posts, that's about as far as we have got.

    Hence my referring us back to the methodological point. Treating air conditioners or ChatGPT as conscious requires a change to the way we usually use the term, that is not found in treating creativesoul as conscious.
  • schopenhauer1
    11k
    "Hence my referring us back to the methodological point. Treating air conditioners or ChatGPT as conscious requires a change to the way we usually use the term, that is not found in treating creativesoul as conscious." (Banno)

    I don't see how that contradicts rather than supports what I am saying.

    Are you saying that the definition has changed because it is being used this way in a language community (pace Wittgenstein)?

    Are you saying that a new definition encompassing things like air conditioners and ChatGPT is breaking the normal boundaries?

    Are you saying something really self-referential to Wittgenstein like, we can't use "phenomenology" of consciousness because it is private and cannot be shared?

    What exactly are you saying, I guess? I have not figured out the rules of your language game here so I can play.
  • Banno
    25.3k
    "What exactly are you saying, I guess?" (schopenhauer1)
    This:
    "You expressed some agreement with the phenomenological approach to defining consciousness. I have pointed out that it's a useless definition. It cannot help us to decide if ChatGPT, creativesoul, or your air conditioner is conscious." (Banno)

    Since this isn't getting anywhere, might best just leave it.
  • schopenhauer1
    11k
    "Since this isn't getting anywhere, might best just leave it." (Banno)

    Your tendency to dismiss gets in the way of legitimately answering the question. Your answer is open to interpretation, and I gave you what I thought you could be getting at here:

    I don't see how that contradicts rather than supports what I am saying.

    Are you saying that the definition has changed because it is being used this way in a language community (pace Wittgenstein)?

    Are you saying that a new definition encompassing things like air conditioners and ChatGPT is breaking the normal boundaries?

    Are you saying something really self-referential to Wittgenstein like, we can't use "phenomenology" of consciousness because it is private and cannot be shared?

    What exactly are you saying, I guess? I have not figured out the rules of your language game here so I can play.
    (schopenhauer1)

    All of those can be read into what you said, and you have not told me which one is correct, other than pointing back to something I told you had multiple interpretations. So instead of explaining, you dismiss. Not great for a forum, where I can only glean from the words in a post. But BannoGPT is programmed a certain way, I guess.
  • Banno
    25.3k
    See, I'm not making any of the claims you suggest. So I can't choose one.

    I'm just noting that you expressed some agreement with the phenomenological approach to defining consciousness, and then I showed why it is not much help, using a reductio argument: we agree that air conditioners are not conscious, yet the phenomenological approach cannot show that this is so.

    That's all. :meh:
  • schopenhauer1
    11k

    Sounds like you're stuck in analytic mode!

    "I'm just noting that you expressed some agreement with the phenomenological approach to defining consciousness" (Banno)

    Got it. That is true. I did express that agreement.

    "and then I showed why it is not much help, using a reductio argument: we agree that air conditioners are not conscious, yet the phenomenological approach cannot show that this is so." (Banno)

    Yes indeed. The phenomenological approach is not a methodological statement but an ontological one. It is not much help in determining consciousness, only in defining it.

    And you are alluding (even if not intentionally) to this one:
    "Are you saying something really self-referential to Wittgenstein like, we can't use 'phenomenology' of consciousness because it is private and cannot be shared?" (schopenhauer1)

    We all know that we cannot "see" the internal aspect of someone else. There is no way to tell if something has an internal aspect, but this doesn't mean it doesn't exist. And contra early Wittgenstein, we can talk about it, even if we can't point to it.
  • Banno
    25.3k
    :meh:
  • Banno
    25.3k
    So who got to the end of the article? Wolfram begins to be a bit more philosophical:

    So how is it, then, that something like ChatGPT can get as far as it does with language? The basic answer, I think, is that language is at a fundamental level somehow simpler than it seems. And this means that ChatGPT—even with its ultimately straightforward neural net structure—is successfully able to “capture the essence” of human language and the thinking behind it. And moreover, in its training, ChatGPT has somehow “implicitly discovered” whatever regularities in language (and thinking) make this possible.

    The success of ChatGPT is, I think, giving us evidence of a fundamental and important piece of science: it’s suggesting that we can expect there to be major new “laws of language”—and effectively “laws of thought”—out there to discover. In ChatGPT—built as it is as a neural net—those laws are at best implicit. But if we could somehow make the laws explicit, there’s the potential to do the kinds of things ChatGPT does in vastly more direct, efficient—and transparent—ways.

    Overreach, I think. But what do others make of this?
  • Banno
    25.3k
    Wolfram's conclusion is that "human language (and the patterns of thinking behind it) are somehow simpler and more “law like” in their structure than we thought".

    Yet human language thrives by breaking the rules.

    Perhaps we might have moved a step closer, not to setting out the rules of language and of thought, but to showing that there are no such rules.
  • bongo fury
    1.7k
    A semantic grammar is a semantic syntax. So not necessarily a true semantics. Not necessarily joining in the elaborate social game of relating maps to territories. Not necessarily understanding. Possibly becoming merely (albeit amazingly) skilled in relating maps to other maps.