• Banno
    25.1k
    Oh, you mean it's not objective! So that's it. No wonder, then.Wayfarer

    No, I mean that the objective-subjective distinction does not help.
  • Deletedmemberzc
    2.5k
    I have Being and Time and pick it up from time to time. It doesn't disappoint. No time to read it cover to cover. Maybe if I find myself in a nursing home some day, with endless time on my hands.
  • Banno
    25.1k
    Again, your point remains obscure.

    Are you claiming that LaMDA does not have a subjective life, but that you do, and yet that this mooted subjective life is not observable by anyone but the subject?

    Again, that does not give us a basis for supposing that LaMDA does not have a subjective life.

    And again, this line of reasoning takes us nowhere.
  • Wayfarer
    22.6k
    No, I mean that the objective-subjective distinction does not help.Banno

    I think if you frame it properly, it's very important. I found a current analytical philosophy book that talks about this; I'll try to remember it.

    Are you claiming that LaMDA does not have a subjective life, but that you do, and yet that this mooted subjective life is not observable by anyone but the subject?Banno

    I know you asked that to someone else, but I'd like to offer a response.

    Empirically speaking, the only instances of conscious life that can be observed are living organisms, which exhibit conscious activity in various degrees, with simple animals at the lower end of the scale and higher animals and H. sapiens at the higher end.

    It's still an open problem what makes a living being alive and what the nature of mind or of life really is. But I think it's perfectly reasonable to assert that computer systems don't possess those attributes at all. They don't display functional autonomy and homeostasis, for example.

    I don't think it's a leap to claim that the only subjects of experience that we know of in natural terms are organisms, and that computers are not organisms. We don't know exactly what makes a living being alive, but whatever that is, computers do not possess it. So the insistence that this is something that has to be proved is a fatuous claim, because there's no reason to believe that there is anything to prove. That's why I said the burden of proof is on those who claim that computers are actual subjects of experience.

    I also note, in reference to the subject of this OP, that experts in AI are unanimous in dismissing Blake Lemoine's claim, that his employer has repeatedly suggested he undergo a psychiatric examination and has suspended his employment, and that the only place his purported evidence can be viewed is on his own blog.

    So enough arm-waving already.
  • Banno
    25.1k
    Empirically speaking, the only instances of conscious life that can be observed are living organisms,Wayfarer

    But that is exactly what is at issue: is LaMDA an instance of non-biological consciousness?

    Further, it is not clear that functional autonomy and homeostasis are prerequisites for consciousness.

    And so on throughout that argument. None of the things listed is decisive in deciding whether @Wayfarer or LaMDA is conscious.
  • Deletedmemberzc
    2.5k
    :up:


    If Wayfarer is what I am - a human being - if Wayfarer is a human being - Wayfarer has subjective experiences, Wayfarer is sentient.

    Wayfarer is a human being.


    If someone says a machine might be sentient - might have subjective experiences - the burden is on him to support that claim.


    The case with animals - with anything different from what one is - a human being - is similar to the case of a machine.

    If someone says a dog might be sentient, the burden is on him to support that claim. This shouldn't be difficult in light of the obvious similarities between dogs and human beings. But some people will insist dogs - animals - are insentient. That's inevitable: the banality of evil.

    If someone says a virus, an amoeba, might be sentient, the burden is on him to support that claim.

    If someone says a flower, a rock, a machine might be sentient - might have subjective experiences - the burden is on him to support that claim.


    The array of proofs presented and conclusions accepted will be - will likely appear to some set of others as - to some extent arbitrary or idiosyncratic.
  • Deletedmemberzc
    2.5k
    @Wayfarer
    @Isaac

    The moral issue in the above approach is clear: one may choose to exclude other human beings who seem in some sense dissimilar from oneself. History, in a word.

    On the other hand: there is no other approach to the subjective short of assuming all things - viruses, amoebae, flowers, rocks, machines, sofas, tables - are sentient and demanding each case be disproven. The result will inevitably be in some sense arbitrary or idiosyncratic.

    In short, if a machine, then why not a virus, an amoeba, a flower, a rock, a sofa, a table, and so on ad infinitum? No one can live this way: no one does live this way: paralysis is the result.

    Hence:

    The chief danger in life is that you will take too many precautions. — Adler

    Too many precautions: a recipe for a neurotic way of life.
  • Isaac
    10.3k
    which exhibit conscious activityWayfarer

    And what is 'conscious activity'?

    there's no reason to believe that there is anything to proveWayfarer

    Lemoine is giving us a reason. You dismiss it as most likely a hoax, then say that there are no reasons to consider whether AI is conscious. Of course there are no reasons: you dismiss them as they arise.

    I recall a thread of yours some time back where you were imploring us to take people at their word when they were talking about past lives - "I've spoken to him at length and he seems to know things he couldn't possibly know..." - and asking us to take it as possible evidence for past lives, ie to not dismiss it out of hand just because it doesn't fit our physicalist world-view. Yet here you are doing exactly that, refusing to take "She seems conscious to me..." as a reason to consider that position.

    On the other hand: there is no other approach to the subjective short of assuming all things - viruses, amoebae, flowers, rocks, machines, sofas, tables - are sentientZzzoneiroCosm

    Does your sofa seem sentient? Has anyone interacting with it come away with the impression that it's sentient?

    Of course there's another approach: assume anything which seems sentient is sentient. Since we cannot come up with any objective measure of what is sentient, the only reason we have anything to describe at all is that some things seem sentient. It's you and @Wayfarer who want to add some religious belief that there's a new category of thing which seems sentient but 'really' isn't. Up until now we've been getting along perfectly well just taking it that anything which seems sentient probably is. We haven't previously gone around checking for 'homeostasis' or 'functional autonomy'; the word 'sentient' found its use entirely based on things which seemed to possess that characteristic.

    You're now trying to 'reverse engineer' that definition and make it describe something which excludes AI, but the criteria you're introducing were never the criteria applied to the use of the word 'sentient' in the first place; it was invoked entirely to describe things which seemed a certain way.
  • Wayfarer
    22.6k
    It's you and Wayfarer who want to add some religious beliefIsaac

    What religious belief? Haven't said anything about religion in this entire thread.
  • Isaac
    10.3k
    What religious belief? Haven't said anything about religion in this entire thread.Wayfarer

    It was a descriptive term, not an ascriptive one. Belief in the exceptionalism of humans originates from religion, ie it is a religious belief. That doesn't mean you have to subscribe to that religion in order to believe it; it's just a description of who 'owns copyright' on that type of belief, so to speak. Pre-religious tribes (pre-modern religion) are almost universally animistic.
  • 180 Proof
    15.4k
    If Wayfarer is what I am - a human being - if Wayfarer is a human being - Wayfarer has subjective experiences, Wayfarer is sentient.ZzzoneiroCosm
    A definition, not a fact.

    If someone says a machine might be sentient - might have subjective experiences - the burden is on him to support that claim.
    In the same way, I suppose, you also bear the burden to support the claim – assumption – that you are sentient.

    The case with animals - with anything different from what one is - a human being - is similar to the case of a machine
    "Different from what one is" in what way?

    "What one is" by definition? ... or by description? ... or by hypothetico-deduction?

    It seems the burden is on you, Zzz, to support the claim that "animals" are sufficiently "different from" humans with respect to 'subjectivity (sentience)'. However, if 'subjectivity' is only accessible to a subject – by definition – then there is no public truth-maker corresponding to the claim 'one has subjective experiences (one is sentient)', and therefore humans being "different from" animals on that basis is illusory (like a cognitive bias). :chin:

    So when a "machine" expresses I am sentient, yet cannot fulfill its "burden to support that claim", we haven't anymore grounds to doubt it's claim to "sentience", ceteris paribus, as we do to doubt a human who also necessarily fails to meet her burden, no? :monkey:
  • Deletedmemberzc
    2.5k
    Does your sofa seem sentient?Isaac

    No. But neither does LaMDA.

    Incidentally, a schizophrenic can experience a kind of pan-sentience: "The objects are watching me." The mind is capable of experiencing or conceiving of the world as pan-sentient.

    Has anyone interacting with it come away with the impression that it's sentient?Isaac

    Possibly. Possibly one person. I don't have access to his psychological history so I don't know what conclusion to draw from this fact.

    Since he's a priest, I can suppose he believes 'god' is sentient. That doesn't help his case: it suggests the possibility that he assigns sentience in a less than rational way.
  • Deletedmemberzc
    2.5k
    In the same way, I suppose, you also bear the burden to support the claim – assumption – that you are sentient.180 Proof

    I don't think so. There is no universal assumption of solipsism that I bear the burden of refuting. No matter what a philosopher playing at solipsism might say.

    Refer to the famous quote from Russell on solipsism, above.

    "Different from what one is" in what way?180 Proof

    A different species of creature. Unless you want to deny the significance of a specie-al distinction. That doesn't have the ring of a strong position to me.

    It seems the burden is on you, Zzz, to support the claim the "animals" are sufficiently "different from" humans with respect to subjectivity (sentience).180 Proof

    No, because if they're not seen as sufficiently different then we can suppose they're sentient like me. Nothing to prove so no burden.

    So when a "machine" expresses I am sentient, yet cannot fulfill its burden to prove that claim, we haven't anymore grounds to doubt it's claim to "sentience", ceteris paribus, as we do to doubt a human who fails to meet her burden, no? :monkey:180 Proof

    Yes, we always have grounds to doubt a machine is sentient by the very fact that it's a machine. No other machine is thought to be sentient; every other machine is thought to be insentient. In such a case of special pleading, the burden must be on the person making the odd-duck claim.
  • Deletedmemberzc
    2.5k



    It's important to keep in mind that there's likely some set of individuals who want machines to be sentient. For example, a priest-engineer immersed in a cutting-edge AI project.


    There is a potential to derive emotional fulfillment - to fill Frankl's existential vacuum - from the consciousness of a sentient machine. In this age of existential angst and emptiness, the power of the existential vacuum should never be underestimated. A possible escape from the Void can take hold of a person like a religious fervor.
  • Baden
    16.3k


    Yes, requests to disprove LaMDA is sentient, disprove my phone has feelings because it talks to me, disprove the flying spaghetti monster, disprove carrots feel pain etc. are time-wasters. There is zero evidence of any of the above.
  • 180 Proof
    15.4k
    :ok: :sweat:
    There is no universal assumption of solipsism that I bear the burden of refuting.ZzzoneiroCosm
    Non sequitur.

    Unless you want to deny the significance of a specie-al distinction.
    Strawman & burden-shifting.

    No, because if they're not seen as sufficiently different then we can suppose they're sentient like me.
    Appeal to ignorance.

    Yes, we always have grounds to doubt a machine is sentient by the very fact that it's a machine. 
    Circular reasoning.
  • Deletedmemberzc
    2.5k
    Yes, we always have grounds to doubt a machine is sentient by the very fact that it's a machine. 
    Circular reasoning
    180 Proof

    Nah. Just a reasonable assumption based on what we have come to know about machines. Anyone claiming a machine might be sentient, to my view, very obviously bears the burden of proof.
  • Real Gone Cat
    346


    Just curious - a ridiculous hypothetical. If a spaceship landed on the White House lawn tomorrow, and slimy, tentacled (clearly organic) entities emerged demanding trade goods (and ice cream), would you insist it was their burden to prove their sentience?

    It might sound laughable, but it goes to the core of the matter - is sentience judged by appearance or behavior? My only knowledge of you is words on a screen. Why should I accept your claims of sentience, but not LaMDA's?
  • Deletedmemberzc
    2.5k
    Just curious - a ridiculous hypothetical. If a spaceship landed on the White House lawn tomorrow, and slimy, tentacled (clearly organic) entities emerged demanding trade goods (and ice cream), would you insist it was their burden to prove their sentience?Real Gone Cat

    I would treat them as I would any other seemingly intelligent creature. I don't take issue, as some of the others do, with drawing a line between creatures and machines.

    If later it was discovered that this creature was a machine, the question of sentience would be cast into doubt.

    Note that there is no denial of sentience in this attitude. Just a reasonable assumption that machines are insentient coupled with a burden to prove otherwise based on what we have come to know about machines.
  • Deletedmemberzc
    2.5k
    My only knowledge of you is words on a screen. Why should I accept your claims of sentience, but not LaMDA's?Real Gone Cat

    Here's why:

    Do you have an unshakable conviction - a sense of certainty - that a human being is typing these words?

    Do you have an unshakable conviction - a sense of certainty - that this human being is sentient?

    If you're going to be honest - if you're not playing a philosophical parlor game - if you're not schizophrenic or in some other way mentally (let's say) different - the answer to both of these questions is - yes.

    Solipsism can never be disproven, only dismissed.
  • Deletedmemberzc
    2.5k
    Is sentience judged by appearance or behavior?Real Gone Cat

    Sentience is an assumption based on the interactions of a community of same-species creatures.


    Solipsism can never be disproven, only dismissed.
  • Real Gone Cat
    346


    Where I think the situation gets interesting is in regard to ethics. Does one act toward intelligent but possibly non-sentient beings as one does toward sentient beings? If so, then one must treat LaMDA's claims of sentience with respect and act as if true. If one judges LaMDA to be intelligent, that is.
  • Deletedmemberzc
    2.5k
    If one judges LaMDA to be intelligent, that is.Real Gone Cat

    The word 'intelligent' is malleable.

    Some folks would say intelligence requires sentience. Some folks would call a smartphone intelligent. So it depends on the definition and the context of usage.
  • 180 Proof
    15.4k
    Anyone making an extraordinary claim about anything bears the burden of proof, so your truism is an irrelevant non sequitur in this context.
  • Deletedmemberzc
    2.5k
    Anyone making an extraordinary claim about anything bears the burden of proof180 Proof

    Exactly.

    Anyone claiming a machine might be sentient - an extraordinary claim - bears the burden of proof.
  • Real Gone Cat
    346
    I would treat them as I would any other seemingly intelligent creature.ZzzoneiroCosm

    Ah, there's the rub.
  • 180 Proof
    15.4k
    I guess that's all you've got. :ok: :smirk:
  • Deletedmemberzc
    2.5k


    Do you not agree that this claim - X machine might be sentient - is extraordinary?
  • Baden
    16.3k
    There is an issue of frameworks here. What's the justificatory framework for connecting the production of language with feelings and awareness, i.e. sentience? Mine is one of evolutionary biology. We expect beings who have been built like us over millions of years of evolution to be like us. So for those who posit a connection between the production of a facsimile of human language and the presence of feelings, you also need a framework. If you don't have that, you are not even at step one of justifying how the former can be indicative of the latter.

    Again, sentience is the state of having feelings/awareness. It is not the outputting of linguistically coherent responses to some input. It's more about the competitive navigation of the constraints of physical environments, resulting in systems that need to adapt to such navigation developing reflexive mental processes beneficial to the propagation of their reproductive potentialities as instantiated in RNA/DNA.

    If your framework for sentience is the outputting of a facsimile of human language, it's a very impoverished and perverse one. Apply Occam's Razor and it's gone. Sentience necessitates feelings, not words. I mean, let's realize how low a bar it is to consider appropriate outputs in mostly grammatically correct forms of language to some linguistic inputs (except challenging ones) to be evidence of feelings. And let's note that the Turing Test is a hangover from a behaviourist era when linguistics and evolutionary biology were nascent disciplines and it was fashionable to consider people to be like machines/computers.

    My understanding of the term 'sentience' in itself logically imposes a belief that I am sentient, and reasoning by analogy justifies considering those like me in fundamental biological ways that are scientifically verifiable through anatomical, evolutionary, and neuroscientific testing to also be sentient. I do not believe I am sentient because I produce words, and I do not have any justification for believing other beings or things are sentient simply because they produce words. Again, sentience is defined by feelings and awareness, which in human beings over evolutionary time happened to lead to the production of language. You can't run that causal chain backwards. The ability to produce (a facsimile of) language is neither a necessary nor a sufficient condition of sentience, nor, without some justificatory framework, is it even any evidence thereof.