"Program B", that it is not what can be called consciousness and that it cannot exercise free will. I am interested to hear those arguments. What concepts will you lean on, and how exactly do you disagree? — Zelebg
I think that the qualia sense of experience is crucial. Without it there can be no consciousness nor free will.
You need some externally derived driver for pain and pleasure. Without their stimuli the concept of 'will' is impossible to actualise.
So-called free will has to choose between criteria for a decision. Ultimately, the way it decides is by weighing the pain/pleasure the different choices will entail.
How does your AI learn language?
How would you achieve that though? If your program is run on a common computer, it will boil down to a deterministic set of instructions.
Its audio output will be entirely determined by its initial code and its visual input history. The person who codes and interacts with it can have 100% control over its output. Could you call that free will?
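The determinism claim above can be illustrated with a toy sketch (all names here are hypothetical, not anything from the actual program under discussion): a program whose output is a fixed function of its input history will always produce the same output given the same history.

```python
def respond(visual_input_history):
    """A stand-in for a deterministic program: the output depends
    only on the input history, via fixed rules."""
    # Toy rule: combine the lengths of all input frames.
    return sum(len(frame) for frame in visual_input_history) % 256

history = ["frame-1", "frame-2", "frame-3"]

# Identical input history always yields an identical output,
# so whoever controls the inputs controls the output.
assert respond(history) == respond(history)
```

The point is not this particular rule but the general property: with no external source of variation, replaying the same inputs replays the same behaviour exactly.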
However you want to define consciousness, I'm asking how you would know. The reason I'm asking is that it would be very difficult to do, given how little we actually know about consciousness. How would you know you had replicated it in this computer when you have no way of accounting for missing aspects or bases (because you do not even know what they are)?
Yet how could you know that the system had such an experience?
We can code it into the program and so we can be certain it has it. — Zelebg
Again, how would you ever know that a program incorporated qualia?
Is that what you have in mind?