I described a trolley problem where 5 lives can be saved by taking action that kills one. What does your consistent moral code say about this? Why isn't it done today? Why is it more moral to let the 5 die, and should this standard be changed? — noAxioms
All else being equal then. In the organ thing, everybody is around 40 years old, part of a family and loved. The 5 will die within 3 months without the procedure. They would be expected to live full productive lives with the surgery, but of course at the cost of the one, also loved, etc.
We may well apply morality as a pure numbers game, when there is no other information available. — universeness
So it is a numbers game, but only when it's a game and only if you're not personally involved.
In the absence of such detail, the morally consistent approach for me is that if we are talking about a train track lever that switches the trolley from one track to another, then I would probably pull the lever and let 1 die rather than 5, if I know nothing about the people involved.
Ah, children are worth more than adults. Interesting. The old Titanic thing. I wonder where they cut off the age limit for 'women and children'.
If the 1 was my child/wife etc, it's probably then going to be bye bye 5, unless it was 5 children.
Sure, but what if the making of the choice was done by another or was automated? Remember, I'm asking what's right, not what you would do, although knowing what you would do is certainly also interesting. We discussed automating unpleasant tasks above. This certainly qualifies as one.
In any such situation of choosing between what you consider horrific outcome 1 and horrific outcome 2, if you do have some personal moral notion of a lesser evil between the two, then you make your choice, but you will probably never recover from the experience.
This is in direct contradiction to your comment above where you perhaps suggest saving the five outweighs the one. How is this not exactly the trolley problem? I assure you it comes up in real life, and human morality actually says kill the 5, not the 1. Why is this?
I would never advocate for harvesting the organs of 1 to save many, like you suggested, no.
War has always been about sacrifice of people here and there for a greater goal. It is unavoidable. If you could not have lived with yourself after making the decisions you mention, then you (and I both) are not fit for leadership.
There are many obvious examples of these moral dilemmas from history. Here are two. — universeness
How do you envision that these automated systems would have chosen better? No matter what, they still have to throw lives against the lives of the enemy and it is partly a numbers game. Would they have chosen differently?
An automated system at the level of an AGI or ASI would hopefully prevent such scenarios from happening in the first place, or be better able to create alternatives to binary choices between horrific choice 1 and horrific choice 2.
Such words are easily typed, but such a horrific situation might mean you have to sacrifice your own family, as well as many other innocents, to stop a horror like fascism from taking over.
War has always been about sacrifice of people here and there for a greater goal. It is unavoidable. — noAxioms
Thank goodness that we stopped him then. He was a butcher and a man who would be King, if he could.
Churchill did, but he didn't have the support needed, including from you apparently. — noAxioms
I think they would reject all notions of war and would not allow such, as they would not be infected with the same primal fears/paranoia/territoriality/tribalism that humans have to combat.
How do you envision that these automated systems would have chosen better? No matter what, they still have to throw lives against the lives of the enemy and it is partly a numbers game. Would they have chosen differently? — noAxioms
Anesthetic can remove all feeling from your body while you remain awake. How is this possible if any aspect of consciousness or mind exists outside of the brain? My brother-in-law had a triple bypass operation and was awake all the way through it; he asked to see his opened body and exposed heart during the operation, and this request was fulfilled. Why did Stephen Hawking continue with his life considering the lack of function/feeling he had in his body? Do you think he was less conscious or had less access to 'mind' due to the reduced state of his body? Why do people paralysed from the neck down still want to live? Christopher Reeve of Superman fame, for example? — universeness
Emotions are how individuals deal with matters or situations they find personally significant. Emotional experiences have three components: a subjective experience, a physiological response and a behavioral or expressive response.
https://online.uwa.edu/news/emotional-psychology/#:~:text=Physiological%20Responses,-We%20all%20know&text=This%20physiological%20response%20is%20the,fight%2Dor%2Dflight%20response. — Psychology and Counseling News
Not sure what you are referring to here, Athena. A particular sci-fi movie, perhaps? — universeness
If your brother-in-law was awake during surgery he had a regional anesthetic, not the general anesthesia that makes a person unconscious. His brain was still working, right? — Athena
AI cannot have an emotionally feeling body. — Athena
Stephen Hawking had ALS and so did my mother. — Athena
My comment was just reaching for real-life non-war scenarios that demonstrated the trolley paradox. I've come across many. I hope you personally never have to face one, either being the one or being part of the five, but it happens.
Such words are easily typed but might mean you have to sacrifice your own family, as well as many other innocents, to stop a horror like fascism from taking over. I hope you never personally face such horrors in your life. — universeness
Hitler is taking over Europe, including GB in short order. The automated system would reject that and just let it happen rather than resist? That route was encouraged by several notable figures at the time, I admit.
How do you envision that these automated systems would have chosen better?
— noAxioms
I think they would reject all notions of war and would not allow such
I too give my sympathies, for your mother and for any caregivers, a heroic task similar to caring for an Alzheimer's patient. I have a cousin-in-law that is in final stages of ALS, in hospice now.
Stephen Hawking had ALS and so did my mother. — Athena
Hitler is taking over Europe, including GB in short order. The automated system would reject that and just let it happen rather than resist? That route was encouraged by several notable figures at the time, I admit. — noAxioms
It is called a local anesthetic here, not a regional anesthetic. — universeness
Local anesthesia. This is the type of anesthesia least likely to cause side effects, and any side effects that do occur are usually minor. Also called local anesthetic, this is usually a one-time injection of a medication that numbs just a small part of your body where you’re having a procedure such as a skin biopsy.
Regional anesthesia is a type of pain management for surgery that numbs a large part of the body, such as from the waist down. The medication is delivered through an injection or small tube called a catheter and is used when a simple injection of local anesthetic is not enough, and when it’s better for the patient to be awake. — American Society of Anesthesia
All physiological responses are controlled, enacted and terminated via the brain, imo. — universeness
If you want us to believe you know it all, you should read the links before making your arguments.
Local anesthesia. This is the type of anesthesia least likely to cause side effects, and any side effects that do occur are usually minor. Also called local anesthetic, this is usually a one-time injection of a medication that numbs just a small part of your body where you’re having a procedure such as a skin biopsy.
Regional anesthesia is a type of pain management for surgery that numbs a large part of the body, such as from the waist down. The medication is delivered through an injection or small tube called a catheter and is used when a simple injection of local anesthetic is not enough, and when it’s better for the patient to be awake.
— American Society of Anesthesia — Athena
Put the gun down, Athena! Remove yourself from the room or suggest the person leaves until you both calm down, or is this situation not as bad as I suggest?
Music is good for producing desired feelings and calming us down when we are in fight or flight mode, as I am now because of a communication problem with someone in the room with me. — Athena
A relative? A politician on the TV? @Jamal?
My intense anger may be the result of hormones started in my head, but I assure you they are in my body, not my head, and I should probably go for a walk to metabolize these fight or flight hormones faster. Yipes, he is not shutting up; I am going for a walk. — Athena
Put the gun down, Athena! Remove yourself from the room or suggest the person leaves until you both calm down, or is this situation not as bad as I suggest? — universeness
OK, I thought you were suggesting that AI would have avoided war with Hitler given the same circumstances. You are instead proposing that the entire world has already been conquered and the AGI would keep it that way. So more or less the same question: how would the AGI prevent a rise to power of a rival better than if a human was the main power of the entire Earth? I accept that choices motivated by personal gain (corruption) are more likely than with the AGI, since it isn't entirely clear what it would consider to be personal gain other than the assured continuation of its hold on power.
I was suggesting that if an ASI was the main power on Earth, then the rise to power of a character like Hitler or even Trump would not be allowed to occur. — universeness
I too give my sympathies, for your mother and for any caregivers, a heroic task similar to caring for an Alzheimer's patient. I have a cousin-in-law that is in final stages of ALS, in hospice now. — noAxioms
Emulating the human brain processes that cause emotions/sensations/feelings in the human body is POSSIBLE in my opinion, but I fully accept that we are still far away from being able to replace your pinky with a replicant which can equal its functionality and its actions as a touch sensor. — universeness
Not current AI, no. Do you reject the idea of a merging of the human brain with a future cybernetic body (cyborgs) or a cloned body or some combination of tech/mecha and orga?
AI will not have this problem because it does not have a body and hormones and therefore the ability to experience life — Athena
OK, I thought you were suggesting that AI would have avoided war with Hitler given the same circumstances. You are instead proposing that the entire world has already been conquered and the AGI would keep it that way. — noAxioms
Not current AI, no. Do you reject the idea of a merging of the human brain with a future cybernetic body (cyborgs) or a cloned body or some combination of tech/mecha and orga?
I don't understand why you think any process/sensation/feeling that you have ever experienced in your body and interpreted in your mind, CANNOT EVER be reproduced by scientific efforts. — universeness
I am talking with a man who has right frontal brain damage, and getting angry with him because he does not understand what I am saying. That is pretty stupid. For several years I worked with a mildly retarded guy who never got upset when someone didn't understand something as simple as sweeping the floor. He could relate to not understanding and would help the person understand, while I am instantly screaming at someone for being an idiot. Who is the idiot? It is not easy being human and really, I don't understand why it is so hard, but my emotions make me behave like an idiot even when I know better. So what is up with these emotions? — Athena
Yes, it is the same guy. I want to talk about that in the thread for that subject but not this thread. — Athena
We have some serious problems and need to stop here for a while and contemplate what we are doing and where we want to go with this. — Athena
But I also have a spiritual concern as well. It goes with wanting to preserve the organic earth and valuing humans. I think valuing AI more than we value humans, and nature, can be a path into the darkness. I want to be very clear about this. I am concerned about how much we value humans. — Athena
Do you reject the idea of a merging of the human brain with a future cybernetic body (cyborgs) or a cloned body or some combination of tech/mecha and orga?
I don't understand why you think any process/sensation/feeling that you have ever experienced in your body and interpreted in your mind, CANNOT EVER be reproduced by scientific efforts. — universeness
I don't understand why you think any process/sensation/feeling that you have ever experienced in your body and interpreted by your mind, CANNOT EVER be reproduced by scientific efforts. — universeness
Do you reject the idea of a merging of the human brain with a future cybernetic body (cyborgs) or a cloned body or some combination of tech/mecha and orga?
I don't understand why you think any process/sensation/feeling that you have ever experienced in your body and interpreted in your mind, CANNOT EVER be reproduced by scientific efforts. — universeness
Are you aware of this lecture by Rupert Sheldrake (released to YouTube 2 months ago) regarding his theory of morphic resonance and morphic fields? It's 2.5 hours long but worth the watch. I knew about his work, but I found this lecture, on how an aspect of 'mind' might reach beyond the restriction of brain and body, quite interesting. — universeness
Seriously? — Athena
Yes, I do not think merging the human brain with a future cybernetic body is a good idea. Our brains are limited and I think we need to understand the limits and stay within them. There are concerns about what could happen to our brains and also what could happen to AI. — Athena
Well, the alternative seems to be every world leader voluntarily ceding power to a non-human entity. I'm sure none of them will have a problem with that. Imagine an AGI (seems totally benevolent!) created by the Russians and the UK is required to yield all power to it. Will they?
Conquest is not the only way to achieve unison! — universeness
This is a human conclusion. The AGI might well decide that, being superior, it is the better thing to give the universe purpose. I of course don't buy that because I don't think the universe can have a purpose, but assuming it can, how would the AGI not be the better thing to preserve, or at least to create its successor, and so on?
I envisage an AGI/ASI ... would protect sentient life against threats to its continued existence, as it would have a very real and deep understanding of how purposeless the universe is without such lifeforms.
To suggest a purpose to the universe is to suggest it was designed. I cannot think of a purposeful thing that isn't designed, even if not intelligently designed.
That is either a very arrogant assumption on my part, or it's a truth about our existence in the universe.
Things usually work out in fiction because they have writers who make sure the good guys prevail.
Do you remember this Star Trek episode? Perhaps the 'Organians' are like a future ASI.
The Organians, or a future ASI, would have many ways to stop pathological narcissistic sociopaths like Hitler, or even relative failures like Trump. Perhaps they could even treat their illness.
Not at all! The 'purpose' I am suggesting only exists as an emergence of all the activity of that which is alive and can demonstrate intent and purpose, taken as a totality.
To suggest a purpose to the universe is to suggest it was designed. I cannot think of a purposeful thing that isn't designed, even if not intelligently designed. — noAxioms
All production would be controlled by the ASI in a future where all production is automated.
You never answered how an AGI might have prevented war with Hitler. I admit that intervention long before they started their expansion would have prevented the whole world war, but what kind of intervention if something like war is off the table? Preventing them from building up a military in the first place seems like a good idea in hindsight, but it certainly didn't seem the course of action at the time. The AGI says, hey, don't do that, and Hitler doesn't even bother to respond. Sanctions, etc., ensue, but that didn't work with Russia either. — noAxioms
Imagine an AGI (seems totally benevolent!) created by the Russians and the UK is required to yield all power to it. Will they? — noAxioms