In other words, we reward and penalize certain behaviours as a form of conditioning, like training dogs to behave a certain way. This objection is addressed in the video. This explains our rules on a societal level but it does not explain why we praise or blame people on a personal level. There's a difference between judging a dog and judging an (adult) human for acting badly. We judge the dog as poorly trained, but we judge the human as being morally wrong or evil.
It is appropriate to hold oneself (or another person) accountable for a bad act because we know he could have chosen not to do it. Here's how he could have: if he had a stronger disposition to do good, [...] he would not have done it — Relativist
Conceiving valid thought experiments is not impossible. For one thing, we don't need to simulate every factor; we can just use the mental factors we currently have (current mood, beliefs, etc.) and only need to imagine a simple situation.
It's PRECISELY why we perceive that we could have made a different choice. To deny this, you would have to assume that fantasizing about a past choice entails a perfect duplication of the mental conditions that led to the decision. If it is NOT perfect, then it is not a valid basis for claiming this is a good reason to believe a different choice was actually possible. — Relativist
I can only grant you that LFW came from something other than deterministic laws. After that, to claim that this something must be an omnipotent, omniscient, omnibenevolent God is jumping to conclusions.
If there's no God, then human life came to exist as a product of deterministic laws of nature. A deterministic law cannot be a source of ontological contingency. — Relativist
Can you further explain what you mean by "initiate"? For example, when a computer boots up because a person pressed the power button, who initiated the event: the person, the computer, or something else?
It sounds ludicrous to claim I do not initiate the raising of my arm. — Relativist
Because of our moral sensibilities - the emotions we feel when considering the acts.
This explains our rules on a societal level but it does not explain why we praise or blame people on a personal level. — A Christian Philosophy
Do dogs have moral sensibilities? Do they have empathy? Do they have vicarious experiences? Do they have moral beliefs? I don't think so, and this means it's extremely different.
In other words, we reward and penalize certain behaviours as a form of conditioning, like training dogs to behave a certain way. — A Christian Philosophy
You're making excuses for treating the thought experiments as evidence for ontological contingency. "It seems like we could have chosen differently, therefore we could have chosen differently."
Conceiving valid thought experiments is not impossible. For one thing, we don't need to simulate every factor... — A Christian Philosophy
Your scenario is contrived, ridiculously simplistic, and it ASSUMES what you're trying to prove: LFW. You erroneously assume moral "motives" can't exist under compatibilism, and you ignore the many complex factors involved with developing our various tastes, wants, and even our beliefs about morality. I described some of the details in my last post, and you simply ignored it. Did you even read it?
Additionally, as described in the video, we perceive freedom differently between cases with only one type of motive (e.g. ice cream vs ice cream) and cases with multiple types of motives (e.g. ice cream vs charity). In the latter, we perceive ourselves to be free, whereas in the former, we do not. — A Christian Philosophy
This is problematic, because there's no evidence of any causally efficacious factors in the world that are NOT deterministic, except for quantum indeterminacy (which you rejected). But if QI is involved with mental processes, it only introduces randomness. So there's no basis to support the claim that we are somehow a source of ontological contingency. This is exactly the reason compatibilism was developed, to show that the perception of free will was compatible with determinism.
I can only grant you that LFW came from something other than deterministic laws. — A Christian Philosophy
Of course not. There's no reason to think an OG has the capacity for intentional behavior and to make decisions.
As a side note, would you not agree that an OG would necessarily have LFW? — A Christian Philosophy
I don't know what you're looking for, because it seems self-evident. So it would be best if you describe the process as you perceive it during the act. Needless to say, don't assume LFW in your description, because that's a post-hoc interpretation. IOW, describe what you are thinking, and the relation between your conscious thoughts and your brain stimulating the nerves in your arm that makes it perform the action.
Can you further explain what you mean by "initiate"? — A Christian Philosophy
We can feel emotions about a dog behaving badly as well, and judgement does not follow from mere emotions. The fact is that we consistently judge humans to be morally evil when misbehaving (not dogs), and moral evil is not compatible with determinism.
Because of our moral sensibilities - the emotions we feel when considering the acts. — Relativist
I think you misunderstand my argument, as I do believe that moral motives can exist under compatibilism, and this is not my point. But I don't want to spend more time trying to clarify it, so let's drop it.
You erroneously assume moral "motives" can't exist under compatibilism, and you ignore the many complex factors involved with developing our various tastes, wants, and even our beliefs about morality. I described some of the details in my last post, and you simply ignored it. Did you even read it? — Relativist
You forgot your original point of this topic. Pasted below. I'm just responding to the objection that God must exist if we have LFW. After that, yes, LFW is not compatible with determinism and I reject determinism.
This is problematic, because there's no evidence of any causally efficacious factors in the world that are NOT deterministic — Relativist
OK, let's not assume God. Early in the discussion, you agreed that ontological contingency requires a source of contingency. If there's no God, then human life came to exist as a product of deterministic laws of nature. A deterministic law cannot be a source of ontological contingency. Case closed. — Relativist
The OG's actions cannot be determined from prior causes, being the first cause. So if its actions are also not free, then what are they?
Of course not. There's no reason to think an OG has the capacity for intentional behavior and to make decisions. — Relativist
Okay, I will assume determinism and not LFW. When a person raises their arm, all the mental processes originate from factors outside the person, just like the computer booting up originates from a person pressing the power button. In both cases, the event is not initiated by the object performing the event. Under determinism, the only initiator is the OG, and every subsequent object is just a cog.
I don't know what you're looking for, because it seems self-evident. So it would be best if you describe the process as you perceive it during the act. Needless to say, don't assume LFW in your description — Relativist
You must be making some unstated assumption about the nature of morals. The presence of moral intuitions is perfectly consistent with determinism (and materialism).
moral evil is not compatible with determinism. — A Christian Philosophy
Yes, I read all your posts. I don't comment on every line because that would take too long, but in general, my view is that adding more determined factors to the explanation does not resolve the issue. — A Christian Philosophy
My original point was that ontological contingency needs to be accounted for ontologically:
You forgot your original point of this topic. — A Christian Philosophy
Best guess is that it would be a quantum system, so the actions that ensue would be the product of quantum indeterminacy. What that implies is dependent on the actual nature of QM - i.e. which interpretation is correct.
The OG's actions cannot be determined from prior causes, being the first cause. So if its actions are also not free, then what are they? — A Christian Philosophy
You are missing the point! Make no assumption at all, and just explain what seems to be going on in your mind. We ought to be able to agree on what seems to be going on. The question then becomes: how do we explain this sequence of events with LFW vs compatibilism?
I will assume determinism and not LFW. — A Christian Philosophy
I genuinely don't know what you are asking, if I did not do it correctly last time. Can you do it first? Then I will do the same.
You are missing the point! Make no assumption at all, and just explain what seems to be going on in your mind. — Relativist
The arm lifting is caused by the firing of neurons. Why do you call it "initiated"? It is not the start of a causal chain, or the start of a branch of a previous causal chain.
We know the arm-lifting action is initiated by the firing of neurons — Relativist
Sure. And in a system of cogs A, B, C, cog B is also important for cog C to move; but it is not an initial step.
The fact that I made the decision is important, because without that - I wouldn't have lifted my arm. — Relativist
So...even if LFW is true, there was no initial step?
I still see no distinction in any of the steps to make one of them the initial step. — A Christian Philosophy
I thought we were setting aside any mentions of LFW/compatibilism. :wink:
So...even if LFW is true, there was no initial step? — Relativist
Not under compatibilism. Since your decision was determined, we could say the factors demanded (better yet, compelled) that you raise your arm.
There were external influences, such as the discussion we're having, but no one else demanded, encouraged, or even suggested I raise my arm at that time. — Relativist
As previously stated, this does not imply agency. Cog B, and only cog B, is the direct cause of the movement of cog C, yet cog B is not an agent.
My decision (and only my decision) was the direct cause of the arm lifting. My thought processes (and only my thought processes) were the direct cause of the decision. — Relativist
I mentioned it only to remind you that we're establishing a scenario that does not presuppose either LFW or compatibilism. You had said, "I still see no distinction in any of the steps to make one of them the initial step."
I thought we were setting aside any mentions of LFW/compatibilism — A Christian Philosophy
LFW or compatibilism are not presupposed. As quoted below, you said that intent implies agency. I responded that it does not if there is no initial step.
I mentioned it only to remind you that we're establishing a scenario that does not presuppose either LFW or compatibilism. You had said, "I still see no distinction in any of the steps to make one of them the initial step." — Relativist
Acting with intent implies agency. — Relativist
LFW or compatibilism are not presupposed — A Christian Philosophy
You had asked me:
Which step in the process is the initial step? — A Christian Philosophy