• TheMadFool
    13.8k
    Could you repost the picture. Thanks.
  • VagabondSpectre
    1.9k


    Careful what you wish for...

    [GIF: PastDismalKookaburra-size_restricted.gif]
  • TheMadFool
    13.8k
    Careful what you wish for...VagabondSpectre

    What is it that you think my wish is? :chin: Are you saying I might be banned? :sad:
  • VagabondSpectre
    1.9k


    mods should be pleased to see feelings flare up in the forum - insults, rejoinders, expletives, name calling, etc. all indicate a population of normal human beings instead of a swarm of chatbots.TheMadFool

    If we selected for signs of emotion rather than the use of logic, I fear we would devolve into the philosophical equivalent of the above GIF.

    P.S. The idea that chat-bots may one day become an issue in that way (and that we would hence need a reverse Turing test) is an interesting topic, but I'm not sure what part of your OP to take seriously.

    As long as the chat-bots are posting good philosophical discourse, would there be any meaningful difference between them and us, their meat-sack counterparts?
  • TheMadFool
    13.8k
    If we selected for signs of emotion rather than the use of logic, I fear we would devolve into the philosophical equivalent of the above GIF.VagabondSpectre

    I sympathize with your concerns. I wouldn't want the forum to become a troop of troglodytes ready to swing their clubs at the slightest provocation. Nevertheless, an outlook that promotes rationality to the exclusion of emotions seems to miss the point of what it is to be human. We're, like it or not, emotional beings.

    That said, since [some] emotions are known to get in the way of rational discourse, it does seem perfectly reasonable to discourage outbursts of feelings on at least a philosophy forum like this one, whose raison d'etre is logical discourse. Also, moderators on this forum don't actually prohibit ALL emotions; for instance, those associated with mysticism or eureka moments, to name a few, are welcome and perhaps even encouraged for their overall positive impact on forum members.

    As long as the chat-bots are posting good philosophical discourse, would there be any meaningful difference between them and us, their meat-sack counterparts?VagabondSpectre

    This is precisely what the OP is about. If chatbots are capable of "...good philosophical discourse..." they're considered worthy of membership on this forum, but then a time would come when such chatbots would slowly but surely nudge all humans out of the forum, for the simple reason that they lack feelings and that would give them an edge over real people. Eventually, the chatbot members would become the majority and they would probably vote to ban all humans from the forum, and that includes the moderators. :chin:
  • Garth
    117
    Eventually, the chatbot members would become the majority and they would probably vote to ban all humans from the forum, and that includes the moderators. :chin:TheMadFool

    It would actually improve the quality of the forum considerably.
  • baker
    5.6k
    Nevertheless, an outlook that promotes rationality to the exclusion of emotions seems to miss the point of what it is to be human.TheMadFool
    Not _exclusion_ of emotions, but one that promotes finer, nobler emotions, and also an outlook that promotes greater emotional literacy.

    You seem to have this strange idea that unless one has tantrums, one isn't showing emotion at all.

    As if only this was emotion:
    https://media.tenor.com/images/20a6c063f322099ce399ae9d4994a522/tenor.gif

    but not this:
    https://i.pinimg.com/originals/80/ed/49/80ed493b48e50935c1d07e3cdb837edf.gif
  • TheMadFool
    13.8k
    Not _exclusion_ of emotions, but one that promotes finer, nobler emotions, and also an outlook that promotes greater emotional literacy.

    You seem to have this strange idea that unless one has tantrums, one isn't showing emotion at all.
    baker

    That said, since [some] emotions are known to get in the way of rational discourse, it does seem perfectly reasonable to discourage outbursts of feelings on at least a philosophy forum like this one, whose raison d'etre is logical discourse. Also, moderators on this forum don't actually prohibit ALL emotions; for instance, those associated with mysticism or eureka moments, to name a few, are welcome and perhaps even encouraged for their overall positive impact on forum members.TheMadFool

    You seem to have misread me, but the OP did have an overall form that could give the reader the impression that I was of the view "...unless one has tantrums, one isn't showing emotion at all". This has a perfectly good explanation: which category of emotions is a no-no on this forum? Tantrums and other similar emotional displays, correct? I was simply working with the information available. Nevertheless, you're on point regarding the missing half of the story. Thanks.
  • Harry Hindu
    5.1k
    What you say here squares with how Aristotle and later generations of thinkers viewed humans, as rational animals. On this view, emotions can be considered remnants of our animal ancestry, subhuman as it were and to be dispatched off as quickly as possible if ever possible. From such a standpoint, emotions are hindrances, preventing and/or delaying the fulfillment of our true potential as perfect rational beings. It would seem then that reason, rationality, logic, defines us - it's what could be taken as the essence of a human being.

    So far so good.
    TheMadFool
    Not exactly.

    This kind of thinking stems from the antiquated idea that humans are special, or separate from nature.

    Other animals are just as rational as humans. We just aren't privy to the information that some other animal is acting on, so their behavior can appear to be irrational from our perspective. All animals typically act rationally on the information that they have. It's just that the information may be false, or skewed.

    Human emotions only come into conflict with our rationality when we assume that the objective truth is dependent upon our emotional state, or when we project our emotions and feelings onto the world and assume that they are a characteristic of the world rather than of ourselves (like assuming that apples actually are red and are good).

    Emotions are the motivators and inhibitors of our actions and thoughts. Learning how to navigate our emotions and use them rationally is what could be taken as the essence of a human being.
  • god must be atheist
    5.1k
    What if an AI saved your life? Last I checked, the deep bond that occasionally :chin: forms between a savior and the saved is based wholly on the act, the act of saving and not on the mental/emotional abilities of the savior. Just asking.TheMadFool

    A bot does not necessarily need to perform a forceful action like saving your life to make you love it. As an autistic kid, I had close emotional ties with my winter coat, and later, in my teens, with a pair of blue jeans. This may be laughable to you, but it's not a joke. I also loved sunsets, the smell of burning leaves in the fall, the smell of the flowers in summer, and the water splashing against my knees on the beaches. I loved nature, life. I loved my school, I loved running down the hill, on top of which our schoolhouse was located, shouting "Freedom! Freedom! Freedom!" all the way down, on the last day of classes in grades 3 and 4. I loved the streetcars, the smell of snow, the pre-Christmas hustle-bustle in the city. I even loved the slush, the overcrowded buses, the darkness that we knew.

    I don't see why I couldn't love an AI robot then. Maybe even now, if it looked like Dolly Parton or Raquel Welch.
  • Leghorn
    577
    @Harry Hindu Your idea of the ancient notion of the relationship of reason and the emotions is not quite right. They thought, not that we should excise the emotions, but rather that we should educate them. Emotions are indeed the enemies of reason, but if you eradicate them, then you have sapped the soul of its energy, what drives it, leaving it vapid and incapable of action of ANY sort.

    @TheMadFool Why cannot human beings be both special, AND a part of nature? Are there not special things in nature, like, for example, the animate as opposed to the inanimate, animals as opposed to plants, and aren’t these qualitative superiorities?

    On the other hand, I agree that beasts often display more rational behavior than we do. Seneca says the animals sense danger and flee it...then are at peace; we feel threatened, but cannot flee it, for we build it up in our imaginations until it paralyzes us, even after we are free of it.
  • Leghorn
    577
    @Harry Hindu and @TheMadFool I think I got y’all mixed up in my response, so just switch the names...my bad.
  • TheMadFool
    13.8k
    This kind of thinking stems from the antiquated idea that humans are special, or separate from nature.

    Other animals are just as rational as humans. We just aren't privy to the information that some other animal is acting on, so their behavior can appear to be irrational from our perspective. All animals typically act rationally on the information that they have. It's just that the information may be false, or skewed.
    Harry Hindu

    I'd love to agree with you that "...other animals are as rational as humans" but I'm afraid that's incorrect. Moreover, I'm not claiming that non-human animals are irrational and humans are rational in an absolute sense, but only that, comparatively, either non-human animals are more irrational than humans or humans are more rational than non-human animals. This difference, even if it's only a matter of degree and not kind, suffices to make the distinction between human and non-human that Aristotle was referring to when he defined humans as rational animals.

    I'm also aware that non-human animals have language, can do math, and use tools, but these abilities can't hold a candle to what humans have achieved in these fields. Relatively speaking, we're way ahead of non-human animals in re the brain's trademark ability, viz. ratiocination.

    Given the above, the idea that humans identify with the rational aspect of nature is, far from being an "...antiquated idea...", an unequivocal fact of humanity's past, present, and, hopefully, the future too.

    It's small wonder then that humans, seeking a unique identity among the countless lifeforms that inhabit the earth, would zero in on that one distinctly human ability - the capacity to reason better than other lifeforms, at least those on earth.

    In the context of the reverse Turing test, the more rational a particular unknown entity is, the more it resembles a perfect rational being, and a perfect rational being would be, in accordance with our conception of humans as rational animals, the perfect human being. The catch is that being more rational seems to be correlated with being less emotional, and if we go down that road, it leads to a point where emotional people are regarded as non-human and thus "fit" for ejection from a community like this forum, for example. Moderators on this forum are on the lookout for people who fly off the handle and can't keep it together, because such behavior is a step backwards from the Aristotelian perspective of humans as rational animals.

    The irony is that machines (computers) are fully capable of flawless logic. In a sense, we've managed to extract the core essence of rationality (logic) and transfer it onto machines (computers). Yet, when we interact with such perfect logic machines, we remain unconvinced that they're human. Something doesn't add up. We began by defining ourselves, rightly so, as rational animals and came to the obvious conclusion that the perfection of rationality is the apogee of humanity, and yet when we come face to face with a computer, we're unwilling to consider it human despite it being perfectly rational and incapable of making logical errors. One plausible explanation is that computers (machines) lack emotions. After all, only our emotional side is left once our rational capacity has been isolated and replicated onto a machine (computer).

    I call this particular state of affairs the adolescent's dilemma. As an adolescent, one can't play with children because one's too old, and one can't keep the company of adults because one's too young. The same goes for the identity crisis humanity is facing at the present moment. Humans distance themselves from non-human animals because the latter are more irrational than humans, and humans distance themselves from machines because machines are "less" emotional than humans. To the assertion that we're the same as non-human animals, we'd object by saying we're more rational; to the assertion that we're the same as machines (computers), we'd object by saying we're more emotional.

    Human emotions only come into conflict with our rationality when we assume that the objective truth is dependent upon our emotional state, or when we project our emotions and feelings onto the world and assume that they are a characteristic of the world rather than of ourselves (like assuming that apples actually are red and are good).

    Emotions are the motivators and inhibitors of our actions and thoughts. Learning how to navigate our emotions and use them rationally is what could be taken as the essence of a human being.
    Harry Hindu

    Yes, humans are both emotional and rational beings, and therein lies the rub. An AI that exhibits human-like emotions would be considered human, and a human that exhibits computer-like rationality would be considered human. If emotional, then human; if rational, then human too.

    A bot does not necessarily need to perform a forceful action like saving your life to make you love it. As an autistic kid, I had close emotional ties with my winter coat, and later, in my teens, with a pair of blue jeans. This may be laughable to you, but it's not a joke. I also loved sunsets, the smell of burning leaves in the fall, the smell of the flowers in summer, and the water splashing against my knees on the beaches. I loved nature, life. I loved my school, I loved running down the hill, on top of which our schoolhouse was located, shouting "Freedom! Freedom! Freedom!" all the way down, on the last day of classes in grades 3 and 4. I loved the streetcars, the smell of snow, the pre-Christmas hustle-bustle in the city. I even loved the slush, the overcrowded buses, the darkness that we knew.

    I don't see why I couldn't love an AI robot then. Maybe even now, if it looked like Dolly Parton or Raquel Welch.
    god must be atheist

    :up:

    Your idea of the ancient notion of the relationship of reason and the emotions is not quite right. They thought, not that we should excise the emotions, but rather that we should educate them. Emotions are indeed the enemies of reason, but if you eradicate them, then you have sapped the soul of its energy, what drives it, leaving it vapid and incapable of action of ANY sort.Leghorn

    All I'm doing is commenting on our intuitions, past and present, and how they seem to be at odds with each other. On the view that humans are rational animals, emotions are not part of our identity but on the view that computers (AI) aren't considered human, emotions are part of our identity.
  • Harry Hindu
    5.1k
    I'd love to agree with you that "...other animals are as rational as humans" but I'm afraid that's incorrect. Moreover, I'm not claiming that non-human animals are irrational and humans are rational in an absolute sense, but only that, comparatively, either non-human animals are more irrational than humans or humans are more rational than non-human animals. This difference, even if it's only a matter of degree and not kind, suffices to make the distinction between human and non-human that Aristotle was referring to when he defined humans as rational animals.TheMadFool
    Then you're going to have to define "rational".

    Yet, when we interact with such perfect logic machines, we remain unconvinced that they're humanTheMadFool
    Because they are not characterized as having emotions. So an absence of emotions does not make one more human. They are typically not thought to be like humans because they don't have minds, but then I'm just going to ask for "mind" to be defined.

    People assert a lot of things, like that animals are not rational and computers don't have minds, without even knowing what they are talking about. You call that rational?

    Like I said before, animals act rationally on the information they have. It's just that the information might be a misinterpretation, as when a moth flies around a porch light until it collapses from exhaustion, or a person acting on misinformation. From the perspective of those who have the correct information, or who don't have the information and interpretation that the other is acting on, it can appear that they are irrational. This falls in with what I've said about the distinction between randomness and predictability. Rational beings are predictable beings. Irrational beings are unpredictable beings.
  • TheMadFool
    13.8k
    Then you're going to have to define "rational".Harry Hindu

    Rational:

    1. Capable of formulating sound deductive arguments and/or cogent inductive arguments.

    2. Insistence on justifications for claims.

    3. Ability to detect fallacies, formal and informal, in arguments.

    Because they are not characterized as having emotions. So an absence of emotions does not make one more human. They are typically not thought to be like humans because they don't have minds, but then I'm just going to ask for "mind" to be defined.

    People assert a lot of things, like that animals are not rational and computers don't have minds, without even knowing what they are talking about. You call that rational?

    Like I said before, animals act rationally on the information they have. It's just that the information might be a misinterpretation, as when a moth flies around a porch light until it collapses from exhaustion, or a person acting on misinformation. From the perspective of those who have the correct information, or who don't have the information and interpretation that the other is acting on, it can appear that they are irrational. This falls in with what I've said about the distinction between randomness and predictability. Rational beings are predictable beings. Irrational beings are unpredictable beings.
    Harry Hindu

    I'm also aware that non-human animals have language, can do math, and use tools, but these abilities can't hold a candle to what humans have achieved in these fields. Relatively speaking, we're way ahead of non-human animals in re the brain's trademark ability, viz. ratiocination.TheMadFool

    Non-human animals can think rationally, I don't deny that, but they can't do it as well as humans, just as we can't ratiocinate as well as a computer can [given the right conditions]. It's in the difference of degrees that we see a distinction between computers, humans, and non-human animals.
  • Harry Hindu
    5.1k
    Non-human animals can think rationally, I don't deny that, but they can't do it as well as humans, just as we can't ratiocinate as well as a computer can [given the right conditions]. It's in the difference of degrees that we see a distinction between computers, humans, and non-human animals.TheMadFool
    Exactly. That isn't any different from what I've been saying. All animals are rational with the information they have access to. The information that one has access to seems to be the determining factor in the degree of rationality one possesses. And the information that one has access to seems to be determined by the types of senses one has.

    What if an advanced alien race arrived on Earth and showed us how rational they are and how irrational we are? What if the distinction between us and them is so vast that it appears to them that we are no more rational than the other terrestrial animals?

    To assert that animals are less rational than humans because humans can build space stations and animals can't is to miss the point that most animals have no need of space stations. It would actually be irrational to think that other animals have need of such things, and that because they can't achieve them, they are less rational than humans.
  • VagabondSpectre
    1.9k
    Non-human animals can think rationally, I don't deny that, but they can't do it as well as humans, just as we can't ratiocinate as well as a computer can [given the right conditions]. It's in the difference of degrees that we see a distinction between computers, humans, and non-human animals.TheMadFool

    So there are actually a couple of sneaky issues with the claim that AI has no emotion...

    Firstly, it depends on the kind of AI we're talking about; with the right design, we can indeed approximate emotion in simulated AI agents and worlds (more on this later...).

    Secondly, human minds/bodies are still far better at "general-purpose thinking" than any other known system. Computers do arithmetic faster than us, and in some respects that might give computer-bound intelligent systems an advantage over our wetware, but we haven't yet made a system that can out-think humans across any reasonably broad spectrum of task domains and sensory/action modalities. We haven't yet made any competent high-level reasoning systems whatsoever (they're all just narrow models of specific tasks like image recognition or chess/Go).

    Emotions are a really massive part of how humans pull it all off: emotions are like intuitive heuristics that allow us to quickly focus on relevant stimuli and ignore irrelevant ones, and this guides our attention and thoughts/deliberations in ways that we can't do without. For example, when something messy or unpredictable (some new phenomenon) is happening around us, there might be some part of our brain that is automatically aroused due to the unfamiliarity of the stimulus. The arousal might lead to a state of increased attention and focus (stress in some cases), and depending on what the new stimulus can be compared to, we might become expectant of something positive, or anxious/fearful of something unknown/bad. Just entering this aroused state also prepares our neurons themselves for a period of learning/finding new connections in order to model the new phenomenon that must be understood.
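
    Here's a rough sketch of the kind of heuristic I have in mind. It's purely illustrative (every class name and threshold is invented, not any real system's API): a novelty-driven "arousal" signal that gates attention and learning rate, and habituates to familiar input.

    ```python
    import numpy as np

    # Illustrative only: novelty-driven "arousal" gating attention and
    # learning rate. All names and constants here are invented.
    class ArousalGate:
        def __init__(self, decay=0.9, lr_base=0.01, lr_max=0.1):
            self.familiar = None    # running average of past stimuli
            self.decay = decay      # how slowly "familiarity" updates
            self.lr_base = lr_base  # learning rate when calm
            self.lr_max = lr_max    # learning rate when aroused

        def step(self, stimulus):
            stimulus = np.asarray(stimulus, dtype=float)
            if self.familiar is None:
                self.familiar = stimulus.copy()
            # Novelty = distance between the stimulus and what we're used to.
            novelty = np.linalg.norm(stimulus - self.familiar)
            arousal = 1.0 - np.exp(-novelty)  # squash to [0, 1)
            attention = arousal               # aroused -> attend more
            lr = self.lr_base + arousal * (self.lr_max - self.lr_base)
            # Habituation: the novel slowly becomes the familiar.
            self.familiar = self.decay * self.familiar + (1 - self.decay) * stimulus
            return attention, lr

    gate = ArousalGate()
    for s in [[0, 0], [0, 0], [5, 5], [5, 5], [5, 5]]:
        att, lr = gate.step(s)
        print(f"attention={att:.2f}, learning_rate={lr:.3f}")
    ```

    Running it shows attention and learning rate spiking when the stimulus jumps from [0, 0] to [5, 5], then decaying as the new input becomes familiar: arousal as a cheap filter over what deserves computation.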

    Furthermore, to at least some degree, we should not expect computers to be able to understand our human-emotion-laden ideas (and therefore interact with us appropriately and reciprocally) unless they have something like emotions of their own (e.g., can a sophisticated non-emotion-having chat-bot ask meaningful questions about subjects like "happiness"?). The most advanced language AI models like GPT-3 are capable of generating text that is uncannily human, but the actual content and substance of the text it generates is fundamentally random: we can prompt it with some starting text and ask it to predict what should come next, but we cannot truly interact with it ("it" doesn't understand us, it is just playing memory games with arbitrary symbols that it doesn't comprehend; it's not even an "it").
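
    To make "predict what should come next" concrete: GPT-3 itself isn't publicly downloadable, but its smaller sibling GPT-2 is, and a few lines using the Hugging Face `transformers` library show what the trick amounts to (the prompt and parameters here are just for illustration):

    ```python
    # Sketch: text generation as next-token prediction, using the public
    # GPT-2 weights via Hugging Face's text-generation pipeline.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")

    prompt = "The meaning of happiness is"
    out = generator(prompt, max_length=40, num_return_sequences=1)
    print(out[0]["generated_text"])
    # The continuation is fluent but ungrounded: the model optimizes for
    # likely-looking text, not for truth or for understanding the prompt.
    ```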

    GPT-3 is the largest language model ever trained, but it's not good enough to function as a philosophy bot. Unless some extraordinary breakthrough is made in symbolic-reasoning AI, it looks like human-level understanding is too much to ask for using only current and standard AI approaches (it takes too much compute just to get the pathologically lying homunculus that is GPT-3).

    Finally, there's the problem of epistemological grounding from the AI's perspective. In short: how does the AI know that what it is doing is "logic" and not just some made-up bull-shit in the first place? At present, we just feed language-transformer AI systems examples of human text and get them to learn a Frankenstein's model of language/concepts, and we can never cut humans out of that equation, else the bots would just be circle-jerking their own nonsense.

    Another way of looking at the "truth signal"/epistemological grounding issue for AI is that they would need to actually have an experience of the world in order to test their ideas and explore new territory/concepts (otherwise they're just regurgitating the internet). For the same reason that we need to actually test scientific hypotheses in the real world to know if they are accurate, general artificial intelligence needs some input/output connection to the real world in order to discover, test, and model the various relationships that entities have within it.
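
    The minimal shape of that input/output connection is the agent-environment loop from reinforcement learning. A sketch (using the classic pre-0.26 OpenAI Gym API; the random policy is just a stand-in for a real agent):

    ```python
    # Sketch: an agent coupled to a world it can act on and sense.
    # Assumes the `gym` package with its classic (pre-0.26) API.
    import gym

    env = gym.make("CartPole-v1")
    obs = env.reset()
    done = False
    while not done:
        action = env.action_space.sample()  # a real agent would decide here
        obs, reward, done, info = env.step(action)
        # obs is the agent's "experience"; reward is the world's truth
        # signal against which its internal model can be tested.
    env.close()
    ```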

    Conclusion: The first chat bots that we actually want to interact with will need to be somewhat human/animal-like. They will most likely exist in simulated worlds that approximate the physical world (and/or are linked to it in various ways), where "embodied" AI systems actually live and learn in ways that approximate our own. Without emotion-like heuristics (at least for attention), it's really difficult to sort through the high-dimensional sensory noise that comes from having millions of sensory neurons across many sensory modalities. That high-dimensional experience is necessary for us to gather enough data for our learning and performance, but it creates a dilemma of high computational cost to just *do logic* on all of it at once; a gift/curse of dimensionality. Emotions (and to a large degree, the body itself) are the counter-intuitive solution. The field of AI and machine learning isn't quite there yet, but it's on the near horizon.
  • TheMadFool
    13.8k
    Exactly. That isn't any different from what I've been saying. All animals are rational with the information they have access to. The information that one has access to seems to be the determining factor in the degree of rationality one possesses. And the information that one has access to seems to be determined by the types of senses one has.

    What if an advanced alien race arrived on Earth and showed us how rational they are and how irrational we are? What if the distinction between us and them is so vast that it appears to them that we are no more rational than the other terrestrial animals?

    To assert that animals are less rational than humans because humans can build space stations and animals can't is to miss the point that most animals have no need of space stations. It would actually be irrational to think that other animals have need of such things, and that because they can't achieve them, they are less rational than humans.
    Harry Hindu

    The question that naturally arises is: what's the difference between humans and non-human animals? Any ideas?

    You made a point that's been at the back of my mind for quite some time: computers can manage syntax but not semantics - the former consists of codable rules of symbol manipulation, but the latter is about grasping meaning, and that can't be coded (as of yet).
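
    A toy illustration of what I mean by codable rules (an invented example, nothing more): a program can apply a rewrite rule flawlessly without "knowing" what any symbol means.

    ```python
    # Pure syntax: rewrite rules shuffle symbols correctly while the
    # program never grasps what "Socrates" or "mortal" refer to.
    rules = {
        ("Socrates", "is", "a", "man"): ("Socrates", "is", "mortal"),
    }

    def rewrite(tokens):
        # Apply the matching rewrite rule, if any; otherwise echo the input.
        key = tuple(tokens)
        return list(rules.get(key, key))

    print(rewrite(["Socrates", "is", "a", "man"]))
    # -> ['Socrates', 'is', 'mortal']  (valid manipulation, zero understanding)
    ```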

    That there are chatbots, on the basis of syntactical manipulation alone, capable of coming close to passing the Turing test suggests that semantics is an illusion or that it can be reduced to syntax. What say you?
  • VagabondSpectre
    1.9k
    semantics is an illusion or that it can be reduced to syntax. What say you?TheMadFool

    It's not the case. Semantics are indeed rooted in symbols that appropriate syntax can do logic-like actions on, but the validity and meaning of the symbols comes from a high-dimensional set of associations that involve memorable sets of multi-modal sensory experiences, along with their associations to related or similar memorable sets... The truth of the high-level symbols that we assign to them (or the truth of the relationships between them) depends on how accurately they approximate the messy real-world phenomena that we think they're modelling or reflecting.

    A practical example: if we do mere syntactic transformation with words like "gravity", the results will only ever be as good at accurately describing gravity as our best existing definition of it. In order to build an explanatory model of gravity (to advance the accuracy of the term "gravity" itself), real-world information and testing are required: experimentation; raw high-dimensional information; truth signals from the external world. That's what mere syntax lacks. The real purpose of semantic symbols is that they allow us to neatly package and loosely order/associate the myriad messy, multi-dimensional sets of memorable experiences that we're constantly dealing with.

    Although we pretend to do objective logic with our fancy words and concepts, at root they are all based on messy approximations that we constantly build and refine through the induction of experience and arbitrary human value triggers (which are built-in/embedded within our biology). Our deductive/objective logic is only as sound as our messy ideas/concepts/feelings/emotions are accurate descriptions of the world. If we had some kind of objectively true semantic map-of-everything, perhaps syntax alone would suffice, but until then we need to remember that it is our ideas which should be fitted to reality, and not the other way around.
  • Constance
    1.3k
    The answer "no" would point in another direction. If emotions are not irrational, it means that we're by and large completely in the dark as to their nature for the simple reason that we treat them as encumbrances to logical thinking. Emotions could actually be rational, we just haven't figured it out yet. This, in turn, entails that a certain aspect of rationality - emotions - lies out of existing AI's reach which takes us back to the issue of whether or not we should equate humans with only one-half of our mental faculties viz. the half that's associated with classical logic with its collection of rules and principles.TheMadFool

    Emotions could be rational? Well, not as odd as one might think. Consider Dewey: experience, on my reading of Dewey, is the foundation, and analyses of experience abstract from the whole to identify a "part" of the otherwise undivided body. Kant looked exclusively at reason; Kierkegaard looked exclusively at the opposition to reason, the "actuality", and argued this makes for a collision course with reason's theories. But for Heidegger it was all "of a piece", not to put too fine a point on it, and I think this is right: when one reasons, it is intrinsically affective, has interest, care, concern, anxiety, and so on, in the event. Dewey puts the focus on the pragmatic interface where resistance rises before one, and the event is a confrontation of the "whole"; the result, if successful, is a consummation, rational and aesthetic, that is wrought out of the affair.

    But regarding mods censoring emotional content, this is not quite right. It is offensive content that is censored, not emotional content.