Artificial intelligence (AI) is in the news again, and it got me wondering what the most common-sense way of seeing these machines is. Animals have consciousness but not reasoning like ours. Artificial intelligence does, or may someday, have the reasoning we have, but does this mean it is conscious? I mean, we can imagine consciousness without reason, so why not reasoning without consciousness? I haven't seen this considered before, so I thought I'd throw it out there — Gregory
Right now, so-called AI can perform specific tasks based on extensive programming. At the height of its complexity, these tasks can be generalized into what may be called "abilities": carrying on a conversation, for example. So the question is, if we think of AI as being conscious, is this a specific ability that we confer on it? That only raises the question of what consciousness is. In that case, if we think of AI as attaining consciousness, it must be in the context of our conferring more and more task-specific capabilities such that, in cumulative fashion, new generalized abilities emerge, at the apex of which arises consciousness, the ultimate general ability. And if consciousness is an emergent property in this sense, then we would no more have created it than we created the matter out of which the computer was formed.
As to reason without consciousness: in the abilities-centric characterization just offered, I think reason and consciousness must be synonymous. That is, a computer that displays the general ability of "carrying on a conversation" (in the context of the Turing test, say) is not really reasoning, just executing a great many algorithms very quickly. You could not call that reasoning unless it were at the same time conscious.