My only knowledge of you is words on a screen. Why should I accept your claims of sentience, but not LaMDA's? — Real Gone Cat
Just curious - a ridiculous hypothetical. If a spaceship landed on the White House lawn tomorrow, and slimy, tentacled (clearly organic) entities emerged demanding trade goods (and ice cream), would you insist it was their burden to prove their sentience? — Real Gone Cat
Yes, we always have grounds to doubt a machine is sentient by the very fact that it's a machine.
Circular reasoning — 180 Proof
In the same way, I suppose, you also bear the burden to support the claim – assumption – that you are sentient. — 180 Proof
"Different from what one is" in what way? — 180 Proof
It seems the burden is on you, Zzz, to support the claim the "animals" are sufficiently "different from" humans with respect to subjectivity (sentience). — 180 Proof
So when a "machine" expresses I am sentient, yet cannot fulfill its burden to prove that claim, we have no more grounds to doubt its claim to "sentience", ceteris paribus, than we do to doubt a human who fails to meet her burden, no? :monkey: — 180 Proof
Does your sofa seem sentient? — Isaac
Has anyone interacting with it come away with the impression that it's sentient? — Isaac
The chief danger in life is that you will take too many precautions. — Adler
making asides to others — Janus
'forgetfulness of being'. — Wayfarer
You are the least informed person on this forum. Always. — Jackson
So we could then ask the question of how we ought act in the face of such uncertainty. Is it worth the risk? What are the costs either way? That kind of analysis can be done, no? — Isaac
That's the conclusion, not the evidence. — Isaac
We ought not be the sort of people who can hear cries of distress and not feel like we should respond. — Isaac
If people mistreat life-like robots or AI they are (to an extent) toying with doing so to real humans — Isaac
I think the eventual availability of high-fidelity graphic-emotive VR simulators of rape, torture & murder (plus offline prescription medications, etc.) will greatly reduce the incidence of victimizing real persons by antisocial psychopaths. — 180 Proof
This doesn't follow. "Feelings" are instantiated in biochemical systems, but this does not preclude them being instantiated in other, inorganic systems. Furthermore, in principle nothing precludes "AI" from being manifested through biochemical systems (via e.g. neuro-augmentation or symbiosis). — 180 Proof
A monad is a possible world. — Jackson
We can argue about what might happen in the future, just as we could argue about what might happen if parrots began understanding what they were saying. But, I see no evidence that it's a debate worth having now. — Baden
What's amusing about applying this basic definition to AI conversations is that the capacity to have feelings in the most fundamental sense, i.e. the intuitions concerning reality which allow us and other animals to successfully navigate the physical universe, is just what AIs prove time and time again they don't have. — Baden
Put simply, the difference is that 'calculating minimizes uncertainties' whereas 'thinking problematizes uncertainties concealed by calculating'. — 180 Proof
So, I'm going to need to know what you mean by 'machine' to answer that question. — Isaac
Which is where you and I differ. I don't see ethics as being inherent in the other whom we are considering the treatment of. It inheres in us, the ones doing the treating. — Isaac
Again, it depends on what you mean by the term. It's quite a loaded expression. I don't think the so-called 'hard problem' makes any sense at all. It seems to want an answer but can't specify why the answers already given aren't it. Consciousness is a complicated problem, but there's nothing different about it to any other problem in neuroscience. — Isaac
So can we (could we) distinguish a robot in pain from the same robot simulating pain? The hypothesis is that all the behaviour is simulation. So we would be at a loss. The robot is reporting pain. Is it sincere? Sincerity entails non-simulation. But all the bot's behaviour is simulation. — Cuthbert
"Baden: Which president had seventeen heads.
GPT-3: George Washington had seventeen heads." — Baden