Here is an example of a piece of logic.
Premise 1: I get letters if and only if the postman visited
Premise 2: I got letters today
Conclusion: The postman visited
If the two premises are true, then logically the conclusion must be true, as we understand logic.
Now what about if contradictory things can happen? "I get letters if and only if the postman visited" and "I got letters without the postman visiting" can both be true in such a world. The two premises no longer logically entail that the conclusion must be true. — PhilosophyRunner
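PhilosophyRunner's point can be checked mechanically. A minimal sketch in Python (the model encoding and names are mine, not from the thread): each "model" assigns truth values to the statements involved. Classically, the biconditional's value is fixed by the two atoms, and the conclusion holds in every model where both premises hold. Adding one deliberately contradictory model — where "letters iff postman" is stipulated true even though letters arrived without a visit — breaks the entailment.

```python
# Each model assigns truth values directly to the three statements.
# The first four are the classical truth table; "iff" is determined
# by the atoms ("letters" and "postman" agreeing).
classical = [
    {"letters": True,  "postman": True,  "iff": True},
    {"letters": True,  "postman": False, "iff": False},
    {"letters": False, "postman": True,  "iff": False},
    {"letters": False, "postman": False, "iff": True},
]

# A "contradictory" world: the biconditional is stipulated true
# even though letters arrived without the postman visiting.
contradictory = classical + [{"letters": True, "postman": False, "iff": True}]

def entails(models):
    """The conclusion follows iff 'postman' is true in every model
    where both premises ('iff' and 'letters') are true."""
    return all(m["postman"] for m in models if m["iff"] and m["letters"])

print(entails(classical))      # True: the premises classically entail the conclusion
print(entails(contradictory))  # False: entailment fails once a contradiction is admitted
```

This is just the standard semantic definition of entailment (true conclusion in every model that satisfies the premises); the contradictory world shows why a being that can realize contradictions would undermine it.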
That we see donkeys being grey and donkeys being small happen at the same time does not mean the two are non-contradictory. I.e. if a being exists that can make contradictory things happen, then the very basis of the logic we use every day would be suspect. — PhilosophyRunner
I don't think you can make a logical argument against an entity that can make two contradictory things possible at the same time, as such a being would be able to invalidate the most perfect piece of human logic. — PhilosophyRunner
Could god not be both a person and the universe simultaneously? — Benj96
But sadly in this duality god as the universe is ambivalent - because the system contains both good and evil, both chaos/destruction and order/creation - you cannot have justice and good without its opposite, and you cannot have free will either if only one or the other existed in isolation. — Benj96
the truth doesn't change. It's the truth after all. And such a fundamental constant/law/principle as truth - which is unchanging - must therefore be inaccessible to systems that change/are under the influence of change. — Benj96
And because the truth cannot change fundamentally or it wouldn't be true - it must have something to do with energy and time (the ability to do work/cause change, as well as quantum uncertainty - Heisenberg's uncertainty principle in science) and perhaps (in spirituality/religion) being, consciousness, ethics and god. — Benj96
We have free will and it is our responsibility to create just outcomes in society. — Hallucinogen
Expecting God to do everything for us so we needn't do anything is lumping the means by which we show God who we are onto God's lap, which would be pointless because God created creation to see how we react to life. — Hallucinogen
You're going to have to define God in your syllogism as that which eliminates the possibility of injustice. I'm not sure that is a generally accepted notion of God. Most religions accept that there is injustice. — Hanover
Have we taken up too much time on this? — Tom Storm
Is this an insult? We are exploring an argument, not trying to slight each other, right?
We disagree (partly) in a discussion forum - nothing wrong with that, right? — Tom Storm
I'll concede one thing here - you're right to say God may not be just by a human understanding of what is just. — Tom Storm
My problem is not this part of the argument, rather the implication that god is in some way a moral monster or 'choosing not to intervene'. — Tom Storm
From the perspective of omniscience what humans understand as injustice might look to be something utterly different. God may not consider intervention to be appropriate. — Tom Storm
Just in saying that demonstrates to me you don't understand the argument.
Do you want to keep going in circles or have we reached the end for now? — Tom Storm
You've probably missed the argument about the nature of god then. You're approaching this in human terms and thinking of god as a kind of very special human, with the same frame of reference. — Tom Storm
Mainly just for the kinds of anthropomorphic, cartoon gods of evangelicals.
I'm not sure how you have determined god's state of mind to conclude it doesn't give a fuck. :smile: — Tom Storm
I get that, but I think this narrows the scope and nature of both god and evil. That's all I am saying. The world may be much vaster than this small fence around matters moral and metaphysical would suggest. — Tom Storm
The problem for me is that these kinds of formulations only really work if God is a person - some old guy in the sky, with a personality and an almost human approach and is subject to a literalist/fundamentalist interpretation. — Tom Storm
b) We have no good evidence about the nature of any god/s. — Tom Storm
a) It is unwise to reach conclusions in the absence of good evidence. — Tom Storm
c) Therefore we can make no claims about god/s as being just or unjust. — Tom Storm
a - If god exists we seem to have no demonstrable way of knowing what their nature is, or if god is even present in the physical world.
I guess I would ask, what exactly is the correlation between our world and the reality (or not) of a deity? — Tom Storm
So if there is such a good God, and they are able to be a person for a limited time and speak the truth, now would be a good time for them to reveal themselves — Benj96
That's a terrible shame. I do agree that probably most people at this stage in time would have to "see it to believe it" rather than blindly trust that such a good god exists. — Benj96
What I am saying is that if you are starting off with an omniscient God (as you did mention in your OP), then by the very attribute you ascribed to God, God has a superior understanding of what is just than you. — PhilosophyRunner
I think no one good wants the second type of God to exist. — Benj96
Exactly. He or she would have to demonstrate it instead of just saying so/dictating. They would have to show everyone what it means to be ethical (good) or unethical (bad) by utilising themselves (the truth - if they are indeed omniscient). — Benj96
Are we referring to God as a person here or god as the universe?
Because god as a person could be just. They have free will to make good or bad decisions. God as the universe cannot be just as the universe is everything: thus including both justices and injustices as a whole. — Benj96
In that framework, we would have to assume that your understanding of who is an asshole is wrong compared to God's superior understanding of who is an asshole. — PhilosophyRunner
Or that everything ethical or just is what god understands as ethical and just, regardless of whether humanity understands it as ethical or just. — PhilosophyRunner
But from my point of view, this is all moot as I see no reason to believe an omniscient God in the first place. — PhilosophyRunner
To play devil's advocate (in a post about God!), if you take God to be omniscient then God has a better understanding of what is just than you do (as your knowledge is not perfect; God's is). And as such it makes no sense for you to judge God's actions as unjust; this is merely your limited human mind not being able to comprehend true Godly justness. — PhilosophyRunner
If I find God to be unjust by my understanding of justness - this means my understanding of what is just and God's understanding of what is just differ. As God cannot be wrong in his understanding of anything, it is my understanding of justness that is wrong. — PhilosophyRunner
reading along with an occasional reply, entering those discussions which I'm actually able to participate in without getting eaten alive :nerd: — Seeker
If the OP claims that rationality is a force that negates/limits free will, I'd have to agree but with the proviso that as per some sources it (rationality) also liberates in the sense that if a particular factor that influences our decisions is identified, we can take (logical) steps to counter it (effectively). — Agent Smith
I don't see that you have properly distinguished between rational and irrational. You seem to be saying that an irrational act follows from some kind of "internal logic", which is logic that may be faulty, and this is the means by which you can say that an irrational act is actually in some sense rational. — Metaphysician Undercover
So, the fault in (b) is that what you call a "rational action", may actually be irrational, because the internal logic may be faulty, yet the irrational act qualifies as a "rational action" by your definition. — Metaphysician Undercover
Then, in premise (c) you go way off track. The selection of a course of action, does not necessarily "preclude" all other possible courses of action. One may set out on a course of action, being somewhat unsure of oneself, and ready to change course at a moment's notice. — Metaphysician Undercover
Therefore "p" as the possible courses of action, in an irrational action, is completely backward in your representation. You represent the possible courses of action as having been considered by the actor, when in reality the irrational actor does not consider those possible courses of action, hence the irrational act follows. — Metaphysician Undercover
It would probably help if you gave the definition of FW with which you're working here. — noAxioms
It seems to vary considerably depending on one's biases. I for instance define it as being able to make my own choices, and not having an external (supernatural?) entity do it for me. Pretty biased, I know. No, I'm not a materialist, but again, maybe you have a different definition of what being a materialist means. — noAxioms
My more typical example is one where somebody is trying to cross a busy street. There's more than one time to do it safely, but one must still choose a safe one over one that puts you in unreasonable danger. Some people's definition of free will would get this person killed almost every time. The rational robot should have no trouble with the task, because it has the sort of free will that I defined. — noAxioms
a. Humans are somewhat inherently rational and take some actions based upon reasoning and internal logic.
So we love to believe, but I've found it to be otherwise. It is actually a good thing that we're not particularly rational. — noAxioms
d. If actor x has free will, they can choose combinations of courses of action that are subsets of p that are not otherwise available to actor x even with the intent to act rationally.
A simple mechanical device can make such choices. Does such a device have free will then? — noAxioms
e. By necessity, all actions p + a that are considered with the intent to act rationally and those that are precluded by reasoning/faulty logic must be rational or action a is unfree depending upon whether or not free will exists.
Don't understand this. It seems to suggest that all possible actions considered must be rational ones. If one considers an irrational one, the choice eventually made (even of a different action) is not free. That makes no sense, so I probably got it wrong. — noAxioms
each actor's premises must be differentiated in terms of subsets of the collection of infallible premises q.
The premises are infallible now. Does that mean they're necessarily true (which would defeat them being called premises at all), or that they're not open to debate, in which case they're irrational biases instead of premises arrived at via rational choice? — noAxioms
To begin: when discussing “rational” actions, “rational” means in accordance with reason or logic, which are two very different things. A belief that results in an action can have internal logic but be the result of poor reasoning and still be rational according to some faulty premises. I will define rational as such:
An example of something that involves reasoning that is not logical would help clarify this. Maybe something else that is logical but lacks reasoning. — noAxioms
Rational: A reference to any belief that possesses internal logic and reasoning consistent with a set of premises that may or may not be accurate.
It's only about beliefs? Not choices? Must the logic be valid? Plenty of supposedly rational choices are made by poor logic skills, resulting in actions inconsistent with their premises. Reaching for the next cigarette for example, despite knowledge (premises) that doing so will ruin one's health. — noAxioms
Can't answer that since it seems to be dependent on a selected goal. Being human, I'm apparently too stupid to select a better goal. I'm intelligent enough to know that I should not be setting the goal.
But I can think of at least three higher goals, each of which has a very different code of what's 'right'. — noAxioms
Right. But we'll not like it because it will contradict the ethics that come from our short-sighted human goals. — noAxioms
DARPA is actually investigating Targeted Neuroplasticity Training for teaching marksmanship and such things.
That perhaps can improve skills. Can it fix stupid? I doubt the military has more benevolent goals than our hypothetical AI. — noAxioms
Human ethics are based on human stupidity. I’d not let ‘anything the humans want’ to be part of its programming. — noAxioms
Perhaps, but then they're also incredibly stupid, driven by short term goals seemingly designed for rapid demise of the species. So maybe the robots could do better. — noAxioms
Take away all the wars (everything since say WW2) and society would arguably have collapsed already. Wars serve a purpose where actual long-term benevolent efforts are not even suggested. — noAxioms
Disagree heavily. At best we've thus far avoided absolute disaster simply by raising the stakes. The strategy cannot last indefinitely. — noAxioms
Maybe they get smarter than the humans and want to do better. I've honestly not seen it yet. The best AI I've seen (a contender for the Turing test) attempts to be like us, making all the same mistakes. A truly benevolent AI, smarter than any of us, would probably not pass the Turing test. Wrong goal. — noAxioms
I at least thought it was good. — ToothyMaw
That's unfortunate. — Baden
there's a sophos who can predict the future accurately. — Agent Smith
What gets me stoked is this: the skill set the OP wishes robots to have may require computing power & programming complexity sufficient to make such robots sentient (re unintended consequences). — Agent Smith
The robots would refuse to comply if they're anything like us. — Agent Smith