By your reasoning, our willful actions can never be wrong. If you do something in fulfillment of your desires, that moves you closer to a state in which you will no longer have those desires and thus no motive to perform any further action - a stable state. — SophistiCat
And since we know right and wrong, we know that your theory has to be wrong just for that reason alone. — SophistiCat
So, let's explain it from a completely different perspective. Let's start with a subjective value system which does not have a logically necessary subjective goal:
1. person A has a goal
2. therefore some things are desirable to person A (subjective normative statement)
The only problem with that system is that there is no way to choose one goal over another, since the goals themselves define what counts as better. The desirability of things is based on the person's arbitrary choices.
The only new thing this system adds to that is a logically necessary goal.
1. person A has a logically necessary goal
2. therefore some things are necessarily desirable to person A (subjective normative statement)
In this system the desirability of things is not based on the person's choices, and therefore the desirability of those things for him does not need to be justified. — Qmeri
a system with which one can make unarbitrary value judgements — Qmeri
2. therefore some things are desirable to person A (subjective normative statement) — Qmeri
We still have a functional need for an unarbitrary system to make value judgements. This system provides that. — Qmeri
"Act such that you ensure you consume the largest amount of cheese possible" is another system that does that. I don't think that would pass as a moral system though — khaled
No, it isn't. This isn't a normative statement. Check this: http://www.philosophy-index.com/terms/normative.php . This is a statement of fact. Some things are indeed desirable to person A... So what? An answer to the "So what?" is a normative statement, e.g.: thus A should seek those desirable things. "Some things are desirable to A" is akin to "The sky is blue"; it is a statement about a property of an object — khaled
No, that doesn't make unarbitrary value judgements, since the whole premise is arbitrary — Qmeri
If you agree that we have a logically necessary goal, then you should also agree that it does not need to be justified like other goals. No matter how obvious and trivial you say it is, the fact that it does not need to be justified as a choice is not obvious to most people — Qmeri
And the fact that you can derive all your other goals and desires by choosing them, insofar as they are choosable, to serve it and its optimal achievement is also not obvious to most people — Qmeri
But this "derivation" will be different from person to person and from circumstance to circumstance. Without some guidance or rules (which you can't justify) this system will end up with unjustifiable conclusions as well. — khaled
Sure, it has the "functional equivalence" of objective moral systems in that it tells you what to do, but it's so vague it doesn't actually help. It's like trying to extract some morality out of cogito ergo sum, for example. — khaled
And since we know right and wrong, we know that your theory has to be wrong just for that reason alone. — SophistiCat
Except your willful actions can still be wrong. If you take an action that makes you temporarily more stable but decreases your stability in the long run, you have objectively made an error according to this system. — Qmeri
There are no free lunches in philosophy any more than in real life, and I believe the cheap ones aren’t worth the money. We might as well go for the real stuff and pay the price. — Jaegwon Kim
khaled is absolutely right: your "system" doesn't help us make decisions, it just claims to make an objective statement about decision-making in general. It is not a system of ethics, because it cannot prescribe any course of action. — SophistiCat
But this "derivation" will be different from person to person and from circumstance to circumstance...that people seek stability will not tell you anything beyond that. It won't tell you whether or not the best way to achieve stability is through self centered behavior, charity, communism, capitalism or what — khaled
Well, the derivations can be justified from circumstance to circumstance. It's just complicated, not impossible — Qmeri
Especially since history has shown that moral rules which are very simple but not vague do not work very well — Qmeri
The details of what this goal system prescribes for any person are an empirical, scientific question — Qmeri
Just because a system is very complicated doesn't mean that the system is unhelpful. — Qmeri
like utilitarianism, which is almost as complicated and vague as my goal system, but you are not complaining about that, are you? — Qmeri
1. it gives a personal goal for everyone, not universal goals — Qmeri
2. it avoids the problem of justifying the choice of this goal by showing that it is unchoosable and therefore doesn't need to be justified as a choice. — Qmeri
But it seems that you will not accept that people have this unchoosable, logically necessary goal. — Qmeri
The same is true of your system. I understand that you begin from a premise that's true by definition, but the problem with moral systems is rarely that the system is unjustified; more often it's that it's hard to go from ANY vague premise to concrete reality. — khaled
justifying your choice of goals — Qmeri
"what actions would make me the most satisfied in the long run?". Since people have much more information about themselves than the world as a whole, such question is much easier to turn into concrete actions than something like utilitarianism. — Qmeri
What would actually increase capitalism is a question that requires expertise. — Qmeri
"what actions would make me the most satisfied in the long run?" — Qmeri
This doesn't do so either. It says it's a necessity. That's all it does. That's different from "justifying". — khaled
That question is LITERALLY what a utilitarian would ask though — khaled
Not any more than — khaled
In the very same response, khaled says that my system prescribes a course of action for every circumstance - just that it does not give simple universal courses of action like "be charitable" regardless of circumstance. — Qmeri
My system is about personal satisfaction - similar, but still majorly different. — Qmeri
Are you really saying that a non-expert would be able to make decisions about capitalism as well as he can about what makes him satisfied? — Qmeri
Therefore this system is very easy to turn into practical solutions. — Qmeri
Yes, he would botch both — khaled
What if my personal satisfaction requires shooting people? — khaled
This system is not about straight-up solving the is-ought gap, since I think that it is unsolvable. This system is about bypassing it by offering a functional equivalent of an objective moral system: a system that gives everyone a necessary personal goal, the choice of which doesn't need to be justified, since it's not a choice. It all comes down to whether this goal of "stability" is choosable. — Qmeri