I have no idea how any of these words relate, at all, to anything I said — flannel jesus
I have no idea what you're talking about at this point. — flannel jesus
If I may, I think he was referencing your position that we may be permitted stupidity if…, not you personally. — ENOAH
'Instilling faith', if achieved, is the (temporary and temporal) settlement of that dialectic, commonly called belief and confused for not being knowledge. — ENOAH
The word [earnest] has an evolved (in both each individual and History) function of triggering the movements/arrangements of other words, which eventually trigger conditioned Feelings, which eventually trigger actions (more mental and/or physical). — ENOAH
All of this process seems to contain "intent", "deliberation", a "self". Hence these discussions, etc. But there is no "trusting your own mind" directed by that "you". It is all just the movements of that mind. — ENOAH
Antony Nickles, yeah that just sounds like nonsense to me. — flannel jesus
I have no idea how a guy saying he doesn't trust his own reasoning could be interpreted as "political" — flannel jesus
If he doubts his own ability to reason, and his own ability to reason leads him to think he should trust science, then OF COURSE he's going to doubt if he should trust science. Just read his words. He spells it out, I'm not speculating. He literally says he doesn't trust his own reasoning abilities. — flannel jesus
This thread is about trusting your own mind, trusting your own judgment, trusting your own ability to reason - the thread I linked is about a guy who says he can't trust his own ability to reason. It's entirely on point. — flannel jesus
Ok, then is [earnestness], not in the speaker, but the receiver? The receiver interprets the committed "action" as earnest? Hence, speaker's intention is irrelevant? Where I'm currently settled is that (notwithstanding my previous "flippancy") "earnestness" is neither in the speaker (intent) nor in the receiver (interpretation) and (perhaps frustratingly to our conventional logic) it's in both. Why? Because it is imbued in the "word." — ENOAH
this guy's post is also an epistemological problem. — flannel jesus
The guy who made the thread, somehow, came to distrust his own ability to reason and discern fact from fiction. — flannel jesus
You should NOT trust your mind, but you can gain trust in certain beliefs by applying critical thinking: seek out contrary opinions, test your beliefs through discussion with others (like on this forum), attempt to mitigate confirmation bias by trying to identify objective reasons to support or deny some presumption you may have. Learn at least some basics of epistemology (including the limits of each technique). — Relativist
I can't disagree — ENOAH
I am discussing my thoughts approached at different "layers" and am poor at articulating that. — ENOAH
I still stand behind the "essence" of my thought… I don't abandon my general thinking… to show you… [I am] earnest. — ENOAH
If the speaker is speaking in earnest*, who am I to judge? Why would I deny myself the opportunity to "play ball" with anyone who truly just wants to play ball? — ENOAH
I would "argue" there's a false bar for most, if not all words, not just earnest etc. — ENOAH
If the speaker is speaking in earnest, who am I to judge? [on a litmus test]… That is, "earnest" is related to "intention" — ENOAH
I'm not convinced that the desire for a universal principle is simply the result of us wanting to shirk our responsibility or culpability. — Benj96
Everyone can be rash, everyone can be stupid, misinformed or otherwise malpracticing adequate reason. My question is how does one know when that is the case - i.e. they're chatting sh*t - and, to the contrary, when they really do know what they're talking about. What is the litmus test in the realm of discourse with others, who may be either just as misinformed or very much astute and correct? Is there a universal logic/reason? Or only a circumstantial one? — Benj96
…[an animal’s] inability to question its existence or purpose does not alleviate guilt on my part, then I should be grateful for the food put on my table… At what point does a human being rationalize its consumption? — Deus
But I think people's inclinations can be affected by arguments… on average, more truthful arguments receive some advantage from their truthfulness. — xorn
"it is not me making a judgment about people; I am just describing how disclaiming belief works in the world. And I'll consider a competing claim, but dismissing the entire project as impossible, claiming that I'm in no position, is to remove any rationality from philosophical discourse. If someone is claiming they don't believe in God, in a certain sense they are saying there is no mystery in the world and nothing outside of (above) our power. Now, they might not want that to be the implication of it, but those are some of the things which are believed, and so some of the things which are refused in the denial."

That's one hell of a big inference about a whole hell of a lot of people you know nothing about. — Vera Mont
You were drawing out the inference you made of what I said. Your interpretation. — Vera Mont
Before every such statement [“I think there is a god." or "I believe there is a god." or "I believe in God."] there is an expressed or implied question. — Vera Mont
the statement points back to a requirement for making it. — Vera Mont
you might want to consider if there's a charitable interpretation of the original post that could resolve this apparent inconsistency. - GPT-4 — Pierre-Normand
I believe this is one of those misconstructions through the substitution of similar but not interchangeable words. The words 'slippery', 'amorphous' and 'ever-changing' do not mean 'irrational'; nor does 'difficult' mean 'unable to be clarified'. — Vera Mont
…subject to imprecise applications and interpretations. — Vera Mont
[“I believe in God”, “I think there is a god”] … are … separate uses … in the same context: answering the question: "How do you regard God?" — Vera Mont
Language is slippery; difficult to handle effectively. I doubt any hard rule can apply to all the words in one language — Vera Mont
Now, why did you change the example? — Vera Mont
If God comes into it, it should be by way of an example such as: "I think there is a god" - uncertainty leaning toward belief - "I believe there is a god" - growing conviction - and "I believe in God" - declaration of faith in a particular deity. — Vera Mont
My point is that the 'AGI', not humans, will decide whether or not to impose on itself and abide by (some theory of) moral norms, or codes of conduct. — 180 Proof
I suspect we will probably have to wait for 'AGI' to decide for itself whether or not to self-impose moral norms and/or legal constraints and what kind of ethics and/or laws it may create for itself – superseding human ethics & legal theories? – if it decides it needs them in order to 'optimally function' within (or without) human civilization. — 180 Proof
but I am the only one who can bind me to my word. If you bind me to my word, you still do not know what is going to come out of my mouth. — Arne
I don't believe that ethics is characterized by rule following — 013zen
If an AI ever feels something that we might characterize as an internal conflict regarding what makes the most sense to do in a difficult situation that will affect people's lives in a differing but meaningful manner, then perhaps I might consider it capable of moral agency. — 013zen
But then your argument seems reducible to putting safeguards in place so we can all sleep better at night… and relieve ourselves of any moral responsibility for the results of bad actors. — Arne
That [AI] can only consider novel situations based on already established laws is no different from how a human operates. — ToothyMaw
I don't see anything preventing an AI from wanting to avoid internal threats to its current existence from acting poorly in the kind of situation you consider truly moral. — ToothyMaw
I don't quite agree that many moral philosophers would consider you moral for following just any self-imposed rule, if you are saying that. — ToothyMaw
Doesn't it matter, though, if the AI can choose between effecting a moral outcome and a less moral outcome, like one of us? …shouldn't we treat it like a human, if we must follow through with holding AIs responsible? — ToothyMaw
we can just change the programming so that it chooses the moral outcome next time, right? Its identity is that which we create. — ToothyMaw
It seems to me that we are the ones who need to be put in check morally, not so much the AIs we create. That isn't to say we shouldn't program it to be moral, but rather that we should exercise caution for the sake of everyone's wellbeing. — ToothyMaw