When I say rational, I mean that they make sense according to some sort of ethical reasoning, not that they are purely derived from reason, and are thus indisputable truths. — ToothyMaw
I'm going to second T Clark here. This is twisting the idea of rationality into something it's not. People are often not inclined to be rational at all. They'll smoke, they'll drink, etc. People rationalize, but that's not being rational. Many people don't even go that far. — Philosophim
Reason also does not mean an indisputable truth. Reason simply means we have derived a conclusion from a set of premises that is certain or highly probable. It does not mean the premises used are true, and consequently, does not mean our conclusion results in an indisputable truth either. — Philosophim
I didn't expect people to attack the assumption that people often try to justify the laws they want with some form of reasoning. — ToothyMaw
If I were to believe you and T Clark, everyone is just a directionless hippie and/or irresponsible pleasure-seeker with absolutely no designs on being ethical in any substantial way. — ToothyMaw
With philosophy your argument starts with the very first premise you put forth. Your entire thesis statement starts with this assumption. The only thing which should be assumed is that most people are not going to let assumptions pass by without asking you to prove them. You may want to see if this assumption is unnecessary for the rest of your OP and remove it if possible. If not, I would re-evaluate your entire OP. — Philosophim
It's important to me that I treat people honorably. Sometimes I don't live up to that aspiration. The source of that isn't some formal, codified, "rational" ethical code, it's empathy and fellow-feeling. How does that make me directionless or irresponsible? — T Clark
Also: if the two of you would just read the formal argument, you would realize I stipulate in (1) that only the subset of those laws that are formed by reasoning about consequences is relevant. — ToothyMaw
If one acts according to what is rational, even if what is rational to a given agent is not rational from other perspectives — ToothyMaw
The main point of your paper is that rule-consequentialism becomes more like act-utilitarianism, no? — Philosophim
If one acts according to what is rational, even if what is rational to a given agent is not rational from other perspectives, does one truly have free will in a meaningful sense, given people are inclined to act according to supposedly rational rules and laws? — ToothyMaw
It seems to me that the act-utilitarian, for instance, always acts rationally when bringing about the best outcome - something I argued must always be attempted if one is to have good intentions - as the best outcome, which has the best consequences, is the only good outcome if all other outcomes have deficits of good consequences. So, the act-utilitarian must also relinquish their free will if they are to be a “good” consequentialist. — ToothyMaw
Alternatively, you might list deontology or rule-consequentialism as examples in which one can be rational by following rational, impartially defensible laws. But did you make those laws? — ToothyMaw
Given this argument holds, it appears that rule-consequentialism does indeed become more and more like act-utilitarianism as the laws get more specific, as premises (1) and (2) are granted by probably every rule consequentialist and some deontologists, too. So, if you want to make consequences matter, you have to grant that it is rational to only act in one very specific way - maximizing utility - in certain circumstances, and if you don’t like this, you have to deny premise (1), (2), or (1) and (2). — ToothyMaw
It seems to me that the act-utilitarian, for instance, always acts rationally when bringing about the best outcome - something I argued must always be attempted if one is to have good intentions - as the best outcome, which has the best consequences, is the only good outcome if all other outcomes have deficits of good consequences. So, the act-utilitarian must also relinquish their free will if they are to be a “good” consequentialist. — ToothyMaw
This seems to be implying that free will must somehow involve you doing things at random, or for emotional, short-sighted reasons, which doesn't seem like an obvious premise. — Echarmion
If one acts according to what is rational, even if what is rational to a given agent is not rational from other perspectives, does one truly have free will in a meaningful sense, given people are inclined to act according to supposedly rational rules and laws? — ToothyMaw
The obvious counter-question to this is what "free will in a meaningful sense" is supposed to be. There are people that argue that the essence of free will is the capacity to act rationally. — Echarmion
Alternatively, you might list deontology or rule-consequentialism as examples in which one can be rational by following rational, impartially defensible laws. But did you make those laws? — ToothyMaw
I don't think anyone else but me can make the laws that are in my mind. — Echarmion
Given this argument holds, it appears that rule-consequentialism does indeed become more and more like act-utilitarianism as the laws get more specific, as premises (1) and (2) are granted by probably every rule consequentialist and some deontologists, too. So, if you want to make consequences matter, you have to grant that it is rational to only act in one very specific way - maximizing utility - in certain circumstances, and if you don’t like this, you have to deny premise (1), (2), or (1) and (2). — ToothyMaw
The problem is that utility isn't defined here, so while this kind of reasoning is useful if you have a given value you want to maximise for, it doesn't give you that value by itself. — Echarmion
If free will is acting rationally, and the act-utilitarian has only one rational choice, would you still think that that is meaningful free will? I certainly don't. — ToothyMaw
I intentionally didn't define exactly what utility is, as the thing being maximized might vary according to each relevant law. But for the rule consequentialist utility would probably equate to welfare most of the time, and that is most relevant for that portion of the OP. — ToothyMaw
Well, I take issue with saying that laws can only be based on consequences, because no one knows the full consequences of their actions. — I like sushi
It seems like I have to assume there is some hypothetical law that can be seen as unquestionably ‘the best’ law. If so, then why would anyone question it? The point being, rules are questioned, and the kind of ‘laws’ I believe you have in mind are never brought into examination; they are just accepted. — I like sushi
For comparison's sake, we do not question whether or not a ball will drop if released; we bring this into question only when experience shows otherwise (i.e. in outer space). Obviously we are talking about ethics here, so there is far more to question when it comes to human biases and subjective opinions soaked with human emotions. — I like sushi
We can only have such a debate with ourselves in the first place via the impression of free will. — Echarmion
I intentionally didn't define exactly what utility is, as the thing being maximized might vary according to each relevant law. But for the rule consequentialist utility would probably equate to welfare most of the time, and that is most relevant for that portion of the OP. — ToothyMaw
And so my question would be, are you "choosing" to use welfare here or is this value somehow pressed on you by externalities? — Echarmion
I hate it when people say this. Perhaps a paranoid person has the impression that she is being watched. Does this impression grant any weight to her assumption that there is a conspiracy against her? Certainly not, and that goes for free will too - even if this impression is almost universal. — ToothyMaw
This is pressed on me by the assumptions most consequentialists make — ToothyMaw
It could be easily dismissed if it wasn't for the fact that you constantly assume that you have free will whenever you act - even when you're just thinking and deciding. — Echarmion
Your argument is that making a decision with a definite outcome doesn't involve free will. — Echarmion
Assumptions are not external though. Fundamentally they're only in your head. — Echarmion
I actually don't, really. I make decisions, but I don't think that I truly have free choice. That I act like I'm making decisions and setting goals freely does not necessarily mean that I have free will.
You might argue that it is absurd to believe that I don't have free will because it looks like I do in every regard. — ToothyMaw
No, it is that if you want to be a rational rule-consequentialist, or perhaps even a deontologist, you must abdicate your ability to choose because of the very nature of some of the laws in place, along with premise (2). Sometimes you don't have any meaningful choices if you fall into the same trap as the good-intentioned act-utilitarian. — ToothyMaw
These particular assumptions did not originate in my head. — ToothyMaw
I would argue that sounds like a performative contradiction. You say you believe one thing, but you act like you believe another. — Echarmion
The problem is that you can't abdicate that ability. — Echarmion
Furthermore, there's the problem of setting and evaluating goals. Even if you have a perfect utilitarian algorithm, it can't set and evaluate goals. — Echarmion
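As a minimal sketch of the point about a utilitarian algorithm (assuming, purely for illustration, that "utility" has already been reduced to a numeric welfare score): a maximizing procedure can only rank actions against a utility function handed to it; the goal itself has to be set from outside the procedure. All names and values below are hypothetical.

```python
# Minimal sketch, not anyone's actual proposal: a utility maximizer can rank
# actions against a supplied utility function, but it cannot choose that
# function, i.e. it cannot set its own goals. All values are hypothetical.

def best_action(actions, utility):
    """Return the action with the highest utility under the supplied measure."""
    return max(actions, key=utility)

# The goal - here, a made-up welfare score - is fixed outside the algorithm.
welfare = {"donate": 10, "keep": 3, "waste": 0}

print(best_action(welfare, welfare.get))  # "donate", only because welfare was chosen as the measure
```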
Technically they did. Of course they may be inspired by what someone said or wrote, but taking an argument from someone else still involves understanding and interpretation, and what results is always your take. — Echarmion