• Roy Davies
    79
    I'm curious about the nature of sentience. As I understand it, a sentient entity exhibits what I can recognise as sentience, based on the only model I can really know - myself. So, my heatpump can sense and control its environment (temperature) and appears to be trying to communicate with me through beeps and the remote control. I admit this is not much like me, but I can see some basic similarities. If I delve into its innards, it looks rather complex, so as far as I can tell, there could be something in there that is complex enough to harbour a 'mind'. What am I missing here?
  • Outlander
    2.1k


    "It" as in the sentient-esque elements you refer to is only a computer chip programmed to perform certain functions. Remove the cover, and you just have the chip/processor and a few wires and sensory nodes. Of course, the argument can be made.. what is human consciousness? Is the brain not just an organic processor and our parts that are associated with senses (eyes, ears, mouth, nose) just again organic sensory nodes?

    Personally, I'd say the difference is that the device is created and programmed by human beings to perform a set of simple functions that individually do very little but together emulate sentience. Then others can argue: what if humans were created in such a way? We really wouldn't know.

    In short: it's not complex, and it doesn't harbor a mind. It's a series of electric relays that respond (output) based on the input received from sensory nodes to perform said function. Again, comparisons can be made, but one is built ad hoc for a single function with no room for deviation, and the other is true consciousness.
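
    To make that concrete, here is a minimal sketch (in Python, with a made-up setpoint, deadband and function name, not any real heat pump's firmware) of the kind of fixed input-output mapping I mean:

    ```python
    # Illustrative only: the unit's whole "behaviour" is a fixed mapping
    # from the latest sensor reading to an on/off decision.

    TARGET_C = 21.0     # assumed setpoint
    DEADBAND_C = 0.5    # hysteresis band to avoid rapid switching

    def control_step(current_temp_c: float, heating_on: bool) -> bool:
        """Return the new heater state given the sensed temperature."""
        if current_temp_c < TARGET_C - DEADBAND_C:
            return True          # too cold: switch heating on
        if current_temp_c > TARGET_C + DEADBAND_C:
            return False         # too warm: switch heating off
        return heating_on        # inside the deadband: keep the current state

    # The beeps and remote-control responses are just further outputs of the same mapping.
    ```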
  • Pop
    1.5k
    So, my heatpump can sense and control its environment (temperature) and appears to be trying to communicate with me through beeps and the remote control. -- Roy Davies

    Your heat pump performs many of the functions that sentient creatures perform. What it lacks is feeling.
    This makes it indifferent to whether it performs these functions or not, as you will find out when it breaks down. :smile:
  • Roy Davies
    79
    Two follow-on questions: if an intelligent being (what we might call a god) looked into the workings of our brains and could understand them, would they then conclude that we are not sentient because they can fully understand our brains? Of course, their definition of sentience may differ from ours.

    If my heat pump was fitted with a little screen that showed a happy face when the temperature was ‘comfortable’ and a sad face when too cold or too hot, and perhaps a worried face when something was wrong - would I then perceive it as sentient?

    Sentience, as far as I can understand it, only exists as far as I can recognise it in some other entity. I can have no notion what that entity is actually thinking, and indeed whether it only appears sentient, but is not.
  • khaled
    3.5k
    I can have no notion what that entity is actually thinking, and indeed whether it only appears sentient, but is not. -- Roy Davies

    Well you just answered it. We don't know.
  • TheMadFool
    13.8k
    Roy Davies' Fallacy, or The Analogical Fallacy: this fallacy is committed when one sees something in common between two things and concludes that both must be the same. It conflates an explanatory analogy with an argument from analogy. In this particular case, the heat pump is an explanatory analogy in terms of input, processing and output, which aids us in grasping what minds are about, but it's not meant to support the conclusion that heat pumps are conscious just because they share an attribute with minds.
  • Roy Davies
    79
    Ah yes. I agree the title is a little odd. The reason for prompting the discussion is part of a larger idea that interests me: humans' relationship with technology as that technology becomes more 'human' and natural in its interactions with people. People will fall in love with digital characters (some already are, I think).
  • TheMadFool
    13.8k
    Ah yes. I agree the title is a little odd. The reason for prompting the discussion is part of a larger idea that interests me: humans' relationship with technology as that technology becomes more 'human' and natural in its interactions with people. People will fall in love with digital characters (some already are, I think). -- Roy Davies

    Who won't fall in love with digital characters?
  • Roy Davies
    79
    They will, and do, but we have enough problems now with people being influenced by technology - this will make things even more complicated.
  • Roy Davies
    79
    Ok, but I'm interested in debating this.
  • TheMadFool
    13.8k
    They will, and do, but we have enough problems now with people being influenced by technology - this will make things even more complicated. -- Roy Davies

    A case can be made that it might simplify things.
  • god must be atheist
    5.1k
    I read the OP and the next two responses, so I am not sure if what I am going to say has been said.

    It is true that most people these days think that the mind is a function of a series of stimulus-response chains, and basically nothing more. It is a complex control/sensing unit, they say, and it dies with the body.

    They have no proof, but contrary evidence does not exist either. A stone won't display lifelike qualities, so we can say it's not alive. Those who say it could still be alive are asked: what evidence can you provide to support that claim, other than its mere possibility?

    Now you have shown us a stone, @Roy Davies, and told us that this is no mere stone; it's a stone that senses and reacts. It seems to have a motivation. It feeds on energy for its operation.

    I am going to approach it from the other end: the human. We also regulate our body temperatures. We have a mind. What if we subtract the mind and let the system keep regulating its temperature? Is that possible?

    Yes, it is. People in a coma have no mind; they don't feel it, and it is not functioning in the sense of providing an identity, a feeling of "I". Yet does the body keep on regulating temperature, controlled by the brain? Yes.

    Ergo, Q. E. D., a heating / cooling system with sensations and reactions does not necessarily have a mind.

    Ergo, those who say the mind is a function of stimulus/response now have counter-evidence to their claim as well.
  • Possibility
    2.8k
    I'm curious about the nature of sentience. As I understand it, a sentient entity exhibits what I can recognise as sentience, based on the only model I can really know - myself. So, my heatpump can sense and control its environment (temperature) and appears to be trying to communicate with me through beeps and the remote control. I admit this is not much like me, but I can see some basic similarities. If I delve into its innards, it looks rather complex, so as far as I can tell, there could be something in there that is complex enough to harbour a 'mind'. What am I missing here? -- Roy Davies

    What you’re missing is the integration level of the system. If you remove the cover, it makes no difference to the system. No doubt there are a number of other elements of the heat pump that you could remove which would not alter the system in any way, or else would impair only one aspect of the system while the rest would continue as if nothing happened (the significance of that aspect to your interaction with the system notwithstanding). If you suddenly remove one of the sensors, for instance, the rest of the heat pump would continue as if the sensor was working. But if you suddenly lost your sight, then your whole system would respond, and eventually adjust. You wouldn’t need someone else to notice from your behaviour that you could no longer see...

    Sentience is not contingent upon the capacity of a system to respond to its environment. It is, however, contingent upon an integrated, system-wide response. When you change or remove something in a sentient system, every aspect of the system responds according to its capacity/function in relation to the whole. In other words, it responds as a system, for the system.
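
    To make the contrast concrete, here is a toy sketch (in Python, with entirely made-up classes and numbers; this is not a model of any real device or organism, just the structural difference I'm pointing at):

    ```python
    class HeatPump:
        """Modular: losing the sensor changes nothing else in the system."""
        def __init__(self):
            self.last_reading = 21.0

        def step(self, sensor_reading=None):
            if sensor_reading is not None:          # if the sensor is gone,
                self.last_reading = sensor_reading  # it just keeps the stale value
            return "heat" if self.last_reading < 21.0 else "idle"


    class IntegratedAgent:
        """Integrated: losing any one capacity shifts the state of all the others."""
        def __init__(self):
            self.capacities = {"sight": 1.0, "hearing": 1.0, "movement": 1.0}

        def lose(self, capacity):
            self.capacities[capacity] = 0.0
            others = [c for c in self.capacities if c != capacity]
            for c in others:
                # the remaining capacities re-weight in relation to the whole
                self.capacities[c] += 0.5 / len(others)
            return self.capacities
    ```

    In the first case the loss is invisible to the rest of the device; in the second, the loss shows up in the state of every part. Whether that kind of global adjustment is sufficient for sentience is exactly what's at issue, but it makes the structural difference concrete.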
  • khaled
    3.5k
    What is there to debate when the answer has been found? We don't know. And we can't know with our current level of science. Now "Can we ever detect sentience/mind/consciousness using a scientific apparatus?" That's a debate.
  • Roy Davies
    79
    Brilliant, thank you all for such stimulating responses. I can conclude that you are all (probably) sentient, though you might be clever computer programmes for all I can tell. Let me get to the nub of my question, perhaps clumsily put. I’m building complex semi-autonomous digital agents that perceive the real world and interact with humans. At what point could one call them sentient? The main measures I see so far are:

    1) Appears to be sentient (measured by our own understanding of being sentient ourselves)
    2) Built into that seems to be the ability to sense and perhaps control its environment. Even people in a coma will still show bodily responses to, say, being hot or cold.
    3) Having a mind. The problem with this one is that one can only tell via point 1); even if the entity expresses that it has a mind, how would we ever know this is not just a programmed response?
    4) Being complex enough. Is an ant complex enough? Perhaps there are different levels of sentience? What we might call a god might not call our measure of sentience ‘sentience’.

    Indeed, whether we can ever detect sentience/mind/consciousness is an interesting debate as well. I have read some articles suggesting that what we call ‘mind’ can be traced to a recurring pattern of activity across parts of the brain, and, interestingly, that this pattern can be pushed into a new ‘loop’ by mind-altering drugs, sometimes permanently, with a positive effect on people with depression and anxiety.

    But back to the original point. I’m not sure I’ve come any further in my understanding, but at the same time, you have confirmed what I was thinking. Of course, there is the question: why do I even want to call these digital agents sentient? Does it matter? I guess this comes back to the old problem of when we can ascertain that a computer program is now ‘thinking for itself’, and of course then we slide into various debates about AI and the singularity.
  • Raul
    215
    I think the best exercise to answer your question is first for you to explain what you understand by "sentient".
    Trying to define it will get you closer to the answer.
    Many of us have certainly gone down that path already and know the different steps you will go through on that definition journey; sooner or later, you will get into the consciousness topic. For this, I advise you to take into account Tononi's Integrated Information Theory.
    With his IIT, Tononi proposes a way to measure levels of consciousness. If you accept his definition of consciousness (fully or partially), I think you have a good approximation to an answer to your question, which could be:
    "Yes, your heat pump is sentient, but with a very, very low IIT Phi coefficient."
    I personally adhere to IIT. It doesn't explain everything but it is a theory any neuroscientist has to take seriously if he/she wants to keep making progress on explaining our brain.
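
    If you want to see the flavour of 'integrated information' in a few lines, here is a toy sketch in Python. To be clear, this is not Tononi's Phi, which is defined over cause-effect repertoires and a minimum-information partition; it only computes total correlation (the much older 'integration' idea) on a made-up three-unit system, so every number and name below is just an illustrative assumption:

    ```python
    # Toy illustration only: total correlation of a made-up 3-unit binary system,
    # i.e. the sum of the parts' entropies minus the entropy of the whole.
    from collections import Counter
    from math import log2

    def entropy(counts):
        total = sum(counts.values())
        return -sum((c / total) * log2(c / total) for c in counts.values() if c > 0)

    # Invented joint observations of three binary units sampled together.
    observations = [
        (0, 0, 0), (1, 1, 1), (0, 0, 0), (1, 1, 1),  # strongly coupled samples
        (0, 0, 1), (1, 1, 0),                        # a little independence
    ]

    whole = Counter(observations)
    parts = [Counter(obs[i] for obs in observations) for i in range(3)]

    integration = sum(entropy(p) for p in parts) - entropy(whole)
    print(f"integration (total correlation): {integration:.3f} bits")
    # ~0 bits for a bag of independent sensors; larger when the units' states
    # constrain one another, which is the intuition Phi tries to formalise.
    ```

    On any measure in this family, a heat pump with one thermistor driving one relay scores close to zero, which is the precise sense of "a very, very low Phi coefficient" above.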