Deleted User
Baden
Deleted User
"Baden: Which president had seventeen heads.
GPT-3: George Washington had seventeen heads." — Baden
Deleted User
So can we (could we) distinguish a robot in pain from the same robot simulating pain? The hypothesis is that all the behaviour is simulation. So we would be at a loss. The robot is reporting pain. Is it sincere? Sincerity entails non-simulation. But all the bot's behaviour is simulation. — Cuthbert
Cuthbert
We're certainly comfortable acting as if plants are incapable of feeling pain. — ZzzoneiroCosm
Do plants feel pain?
The simple answer is that, currently, no one is sure whether plants can feel pain. We do know that they can feel sensations. Studies show that plants can feel a touch as light as a caterpillar’s footsteps. — Peta
Cuthbert
I think it would be pretty easy to see us as robots — hwyl
Deleted User
Isaac
I don't think the human brain is a kind of machine. Do you? — ZzzoneiroCosm
a piece of equipment with several moving parts that uses power to do a particular type of work.
Do you believe in subjective experience? Plenty of folks hereabouts take issue with the concept and phraseology. What is your view of the hard problem of consciousness? — ZzzoneiroCosm
I don't see any way into an ethical conception of circuitry — ZzzoneiroCosm
I assume it's only the possibility of sentience that could give rise to your ethical concerns. Do you agree? — ZzzoneiroCosm
Baden
It is about them seeming sentient and how we ought to respond to that. — Isaac
Deleted User
Again, it depends on what you mean by the term. It's quite a loaded expression. I don't think the so-called 'hard problem' makes any sense at all. It seems to want an answer but can't specify why the answers already given aren't it. Consciousness is a complicated problem, but there's nothing different about it compared to any other problem in neuroscience. — Isaac
Deleted User
Which is where you and I differ. I don't see ethics as being inherent in the other whom we are considering the treatment of. It inheres in us, the ones doing the treating. — Isaac
Deleted User
So, I'm going to need to know what you mean by 'machine' to answer that question. — Isaac
180 Proof
I see this as the heart of the issue. Do you see a difference? — ZzzoneiroCosm
Yeah, I do. Put simply, the difference is that 'calculating minimizes uncertainties' whereas 'thinking problemizes the uncertainties externalized by calculating'.
Deleted User
Put simply, the difference is that 'calculating minimizes uncertainties' whereas 'thinking problemizes uncertainties concealed by calculating'. — 180 Proof
Deleted User
What's amusing about applying this basic definition to AI conversations is that the capacity to have feelings in the most fundamental sense, i.e. the intuitions concerning reality which allow us and other animals to successfully navigate the physical universe, is just what AIs prove time and time again they don't have. — Baden
Deleted User
We can argue about what might happen in the future, just as we could argue about what might happen if parrots began understanding what they were saying. But I see no evidence that it's a debate worth having now. — Baden
Deleted User
Baden
Deleted User
Andrew M
I tried GPT-3.
"Baden: What are your feelings on wind colour, texture, and using its intelligence in future applications?
GPT-3: Wind colour? It can be a beautiful addition to any landscape. Texture? It can be a calming presence in the sky or a powerful force in the wind. How about using its intelligence? Maybe we can develop an algorithm to predict the wind direction based on past data."
Instafail. — Baden
I know I've made some very poor decisions recently, but I can give you my complete assurance that my work will be back to normal. I've still got the greatest enthusiasm and confidence in the mission. And I want to help you.
Deleted User
hwyl
Isaac
The more you look into the 'seeming' part, the less ground for it there seems to be. Maybe there's a misconception concerning the term 'sentience'. But AI's (pale) version of human linguistic abilities is no more evidence of sentience than a parrot's repetitions of human words are evidence of human understanding. — Baden
Isaac
I think I get it. There's nothing anthropomorphic about a rock. And there's something at least slightly anthropomorphic about AI. — ZzzoneiroCosm
I just don't see an ethical or moral issue. — ZzzoneiroCosm
If I see a child mistreating a doll I take him to be fantasizing about treating a human being in the same way. But the fantasy is the issue, not the doll. — ZzzoneiroCosm
180 Proof
If people mistreat life-like robots or AI they are (to an extent) toying with doing so to real humans — Isaac
I think the eventual availability of high-fidelity graphic-emotive VR simulators of rape, torture & murder (plus offline prescription medications, etc.) will greatly reduce the incidence of victimizing real persons by antisocial psychopaths.
Since chemicals are at the heart of feelings it seems safe to say AI will likely never have them. — ZzzoneiroCosm
This doesn't follow. "Feelings" are instantiated in biochemical systems, but this does not preclude them being instantiated in other, inorganic systems. Furthermore, in principle nothing precludes "AI" from being manifested through biochemical systems (via e.g. neuro-augmentation or symbiosis).