Comments

  • Pointlessness of philosophy


    On the question of definition, there is a problem that if you insist on a clear definition of terms at the beginning, you are quite likely to end up arguing about the definition and never getting to the substantial issue. Yet it is also true that disagreements can often be resolved, or at least clarified, by clarifying terms. So definitions can be useful. At the beginning of a discussion, they can serve as axioms, to be questioned later or on another occasion. During a discussion, they can be useful as a way of resolving merely linguistic issues. But they need to be treated as useful rather than necessary.
  • Atheist Dogma.


    Before I get to the economy, some thoughts on the question what distinguishes humans from other animals. Each species of animal is similar to and different from every other species. It's not very exciting, philosophically speaking.

    I suspect that even though other animals are perfectly aware that they are different from other animals, they don't care very much about that. It's not just that they can't ask the question (so far as we can tell) because they don't have a language like ours; what they care about is whether the other animal is something to eat or be eaten by, whether there is competition for food and living space, and so on. Cross-species friendships are not unknown.

    The question has a long history. In some ways, of course, it is like the questions that other animals can ask and answer, but it seems to carry some weight for humans that it doesn't carry for other animals. I don't really know the answer, but I fear that it may carry some unspoken weight:-

    1 the weight of assuring us humans that we are superior in some moral sense to other animals (as shown by the fact that "animal" is used as a term of abuse in some contexts), and that the reason people sometimes feel that other animals are superior is their "innocence" (cf. children), which is ambiguous praise

    2 the weight of justifying our dominance over other animals in the sense of justifying our abuse of them or at least our practices of treating them as means to our ends.

    We don't think very much, do we, about the various ways that various animals are superior to humans? We think they are not important. That's telling. Like me feeling superior to you because of X, while failing to acknowledge that you are superior to me because of Y.

    So I'm uneasy about this game we have got into. Still:-

    It is true that animals don't have money and live in an ecology rather than an economy. Money is a key feature of economics; so is property. Neither can exist without the law.

    But what makes money work is not the law, but people's confidence in it as a store of value. It doesn't matter whether the sea shells are fake or not. What matters is whether people are confident that they will be able to exchange them for "real" things, like food and shelter at some point in the future.

    Don't think of money as value, think of it as a symbol - a claim - on resources. We don't value the empty promise to "pay the bearer"; we value the promise of being able to obtain the things we want and need.

    I'm hopelessly idealistic because I think that every citizen has 1) a claim to a basic standard of food, shelter and other necessities irrespective of how "useful" they are and 2) a duty to contribute to the shared costs and labour of the social organization they live in. (And every human being has a right to be recognized as a citizen of some society/nation.)

    I don't think those principles are left wing or right wing. They are principles of enlightened self-interest.

    PS Of course, human beings are special in all sorts of ways. But I'm human and so inclined to pay special attention to them. My problem is that I don't understand what the significance is of the differences and similarities between humans and other animals.
  • The beginning and ending of self
    I think if we could agree that there has to be a continuation of consciousness in some form for the narrative self to continue, and that consciousness can continue without the narrative when the tale is 'completed', and that this completion and continuation is very rare in this world, then that is all I would seek to defend as my belief here.unenlightened

    That seems like a good summary, but I worry about complications.

    Ironically, emotional investment (cathexis) in one's own story may cause us to fear (pre-mourn) the end of the narrative & narrator.Gnomon

    There are cases where fear and pre-mourning may not happen, don't you think?
    People who risk their lives sometimes seem, at least, not to fear or pre-mourn their death. You might argue that's not really the case and that some of them may be putting on a brave face; still, I wouldn't want to rule out the possibility in advance.
    People who are dying slow agonizing deaths may welcome the end and even choose to walk before they are pushed, so to speak. There may well be fear there, but the mourning appears to be more for the process of dying than the death.
    I guess you're saying that fear and pre-mourning are the usual, normal, default situation. Maybe.

    That painful bummer in the middle of the story has been evaded by ancient sages in various ways : acceptance, denial, sequel in heaven, etc.Gnomon

    I agree that for most people that is the usual situation. It depends what you count as the end of the story, and maybe whether each person's story can consist of several episodes, linked by continuing consciousness. There is an alternative:-

    I think if we could agree that there has to be a continuation of consciousness in some form for the narrative self to continue, and that consciousness can continue without the narrative when the tale is 'completed', and that this completion and continuation is very rare in this world, then that is all I would seek to defend as my belief here.unenlightened

    But some would have us imitate the innocence of animals by living in the moment, and ceasing to explain & judge ourselves as protagonists in the Self-story.Gnomon

    I agree that the innocence of animals implies no judgement. But whether we can cease to explain and judge ourselves only by imitating (acquiring?) the innocence of animals is another question.

    But for humans, that would mean losing the most important thing in the world, Me. :smile:Gnomon

    I wanted to respond that of all the things in the world that you cling to, your self is the one thing you can't escape, for better or worse. But one can lose oneself in a number of ways. Temporary loss by absorption in some activity or spectacle. Episodic loss by multiple personality (though I admit that is a contested concept). Permanent loss by amnesia. Loss by life change, as in becoming a priest or a monk or other major change - would entering witness protection count?

    There is much to be said for the narrative about the self-narrative (in one form or another). But isn't it a mistake to mistake it for the whole story? The many varieties of narrative and the many disruptions of narratives that get into self-narratives show that they cannot be the whole story.
  • The beginning and ending of self
    One always stands outside the narrative to describe it, but it is always oneself one is describing so it is always a narrative self (or a log-book self) and one is never outside itunenlightened

    Yes. So one is always two selves. Or perhaps one self stands in two incompatible relationships to the narrative. Or perhaps there is no outside and no inside because that's a metaphor which is misleading in this context.
  • The beginning and ending of self
    H'm. I need to go carefully here. Which way is straight ahead?

    One cannot do philosophy and Zen at the same time, except perhaps in something like the manner of the early Wittgenstein. (The unanswered (and unasked) question is whether he continued that way in his later work. But that tells us nothing.)

    If you do not answer, you go straight to hell, but if you answer you continue the fictional map.unenlightened

    If I go straight to hell, I continue the fictional map. Unless I'm already there.

    I think it is unnecessary.unenlightened

    I'm not sure about that. What does "necessary" mean here? Some Western philosophers have propounded the answer that there is no self. Buddhism is quite clear about why it is necessary (and how it is not).

    I can talk about narrative, though. Here's what bothers me.

    What kind of narrative are we talking about here? Whose narrative are we talking about? (You said mine, but I can adopt someone else's and I will probably have more than one narrative about myself.)

    Narratives are often disrupted. Sometimes someone else's narrative collides with mine. Sometimes I disrupt my own narrative, whether deliberately or accidentally. Sometimes "events" disrupt my narrative. We can modify our narrative or throw the old one out and make a new one. Whatever we say or do, there is always something "outside" our narrative and narratives are never permanent, even when we are dead. What are we to make of this?

    I distinguish between a narrative and a log book. A log book is a series of dots. A narrative connects those dots. Is a log book a safe and satisfactory option?
  • The beginning and ending of self


    If your map has no territory it is not a map. Or if it is a map, it is a fictional map and consequently not your narrative.

    I'm tempted to suggest a Zen cure. Go for a walk, have a cup of tea and a good night's sleep. Or perhaps Hume's cure for scepticism would suit you better. According to him, a few days of living a normal life would sort you out, though he clearly preferred a game of billiards.

    Modern philosophers still sometimes fall into the error of thinking that all there is, is language. They forget that language consistently, insistently, points "beyond" itself. There is no beyond; it is just that language exists in a world, which continually impinges on it. Your narrative is about something outside the narrative - you - and it is continually broken into and messed about by reality. That's what sits behind my nit-picking. How about this: you are what disrupts your narrative?

    I do take you seriously, but straightforward argument is not going to get you back to normality, is it?

    Yes, I have been there. There is a life afterwards, when you get the right balance.
  • The beginning and ending of self
    self fades in and fades out, but is always the same, except on the Dark side of the Moon.unenlightened

    Your account of a train of thought as we experience it is a good one. The saving grace is that a train does at least connect internally and can be connected externally. So it's different from the stream of consciousness. I like that.

    Whether the self is always the same is a good question. Your account of the train of thought suggests not, doesn't it?

    I'm not sure there's a good definition of the self, apart from whatever I recognize and/or assert (which may be inconsistent!).
  • Atheist Dogma.
    awful evolutionary psychology isMoliere

    There's much I don't know about it, but, like many other people, I encountered sociobiology (E.O. Wilson) when it was fashionable. I live in liberal - even woke - circles. Do I need to say more? But I'm not assuming that that's all there is to evolutionary psychology; I do assume that it's not appropriate to speculate without a good understanding of the field and evidence.

    they have rigid practices, but since they do not need enforcing then that's not an example of law.Moliere

    Sorry, I wasn't clear. I agree that ants and bees don't have enforcers, so don't have laws.

    That's similar to law, but not quite the sameMoliere

    Mammals do have a kind of enforcement, but they don't have legislation (partly, they can't have that because it requires writing). But there is at least one alternative. The Icelandic parliament used to appoint people to memorize the laws and recite them when the Thing (Parliament) met; I believe that was down to widespread illiteracy and the scarcity of writing materials. Anyway, what matters here is that somehow you have to ensure that everyone knows what the law is; writing, on stone or paper or whatever, is good. But if that's not practical, appointing a Memory person can fill the gap. I conclude that we have identified two things that are essential to human law - one is a means of enforcement and the other is a reliable source for what the law is.
  • The beginning and ending of self
    Paradoxically, your narrative gives continuity even as it suggests discontinuity, of the approximate form - I am awake, then I sleep and then I am awake and then ... and that is my actual life.unenlightened

    That clarifies a good deal. Sleep and unconsciousness are not really interruptions, but part of a narrative - a cycle in the case of sleep, and an incident in the case of unconsciousness. Fair enough.

    Still, it seems to me that when you speak of a narrative, you don't mean a log of my experiences, but something more structured with successes and failures and diversions and so on. Is there a reason why we can't find more than a single narrative in our lives?
  • The beginning and ending of self
    I think if we could agree that there has to be a continuation of consciousness in some form for the narrative self to continue, and that consciousness can continue without the narrative when the tale is 'completed', and that this completion and continuation is very rare in this world, then that is all I would seek to defend as my belief here.unenlightened

    There's a lot here I agree with. Narratives are an important part of how we think of ourselves and the world around us. But aren't there complications? For example, biographies are actually constructed by a biographer who selects and arranges; often the unity of their narrative is broken up by themes and/or episodes. Perhaps a third person's narrative about me is not what you have in mind. But autobiographies are not really any different. In any case, I'm not sure that anything much unifies my actual life apart from the continuity of my consciousness (I'm being generous there, since sleep and unconsciousness are interruptions in some way.)

    Perhaps you mean the narrative I construct as I go along, even though I may forget or abandon those drafts?
  • Atheist Dogma.
    And actually I am hesitant to utilize evolutionary explanations for our emotional life,Moliere

    I think evolutionary explanations are useful from time to time. But to think they are THE explanation is to fall for the myth of origins (Derrida? or someone else?). We are equipped with ears, and their evolutionary usefulness is the best explanation (short of an account of the evolutionary process) that I can think of. But what we make of them is a different matter.

    "Species" is not a hard categoryMoliere

    Yes. I once read "Origin of Species" all the way through. The biggest takeaway for me is that he spent vast amounts of time arguing that species are not hard and fast; he argues it every which way he can think of. It is the foundation of evolutionary theory. What's more (as Darwin points out) we mostly know it already. Evolution takes our practice of selective breeding and pushes it through centuries and millennia.

    do dolphins have laws?Moliere

    I don't know enough about them. Bees and ants seem to have rigid practices which do not need enforcing. Mammals are more complicated and do seem to need to enforce the rules - which are made and enforced by the alpha dog/lion/chimp. Are they sufficiently like laws to count? I'm not sure whether it is important to give a definite answer. Perhaps noting the similarities and differences is enough.
  • Atheist Dogma.
    Oh, and no joke -- I thought you were uncertain about the locution since it invokes various meanings, but your later post suggested that you were uncertain about the concepts, so I thought I was off-base.Moliere

    If you mean the concepts of "wary", "fear", "anxiety", you were right, not in the sense that I don't know what the words ordinarily mean, but in the sense that I was working out what to say about them in this philosophical context.

    Hence, there was no blunder on your part. I couldn't see why you thought it was a blunder, which would suggest something that you should have avoided. That back-and-forth was, for me, a normal part of the process.

    Being an auto-didact is neither here nor there. I'm out of date. Hopefully, we're both learning. That's the point of the exercise.

    Writing in the big sense is the cliche: Everything is text. Writing in the small sense is what we're doing to communicate as homo sapiens -- with words we usually recognize as writing.Moliere

    That makes sense. How far it interprets Derrida, I couldn't say. I read some of his earlier work carefully and thought it made sense, at least in the context of Wittgenstein. The later work lost me completely and I had other preoccupations, so I never read it carefully.

    If you are a master of interpreting texts, everything is text. But isn't that like thinking that everything is a nail because you've got a hammer?

    I have a prejudice against "what differentiates us from other animals". I'm constantly finding that proposed differentiations don't work. As in this case. A dog interprets certain of my behaviours as threatening and others as friendly - or so it seems to me. (Dogs are also like horses and not like horses.) Animals are both like humans and not like humans, in ways that slightly scramble our paradigm ideas of what a person is (i.e. a human being). So, philosophically at least, slightly confusing. Mammals seem to be more like us than fish or insects, never mind bacteria and algae. Those living beings seem so alien that it is much harder to worry about what differentiates "us" from "them". Yet they are like us (and the mammals) in many ways - the fundamentals of being alive apply to them as well. (But what about whales and dolphins?)
  • Atheist Dogma.
    Ah! Then another blunder on my part here.Moliere

    :smile: I'm going to take that as a joke.

    I want to simultaneously maintain the distinction between Writing and writingMoliere

    I'm sorry. I haven't heard that distinction before. Could you explain, please?
  • Atheist Dogma.
    I'm content with changing the locution from "wary" to something elseMoliere

    I didn't mean to suggest that. On the contrary, I think that "wary" is perfect (as near as one ever gets, anyway).

    Thus the rationale that we make for what plants do because the ones that didn't died out.unenlightened

    Yes. But that doesn't mean there's anything wrong with it. After all, who else is going to make explanations and seek to understand?
  • Atheist Dogma.
    Thanks for the link. I'll have a look.

    By the way, I'm still thinking about "wary". It's not the same as fear or anxiety, not obviously an emotion or a mood, more like a policy. https://www.merriam-webster.com/thesaurus/wary defines it as "having or showing a close attentiveness to avoiding danger or trouble". The lists of synonyms and antonyms are interesting. No emotions or moods appear among them, yet clearly "fear" and "anxiety" are related.

    Is "meh" a feeling? The feeling of not having a feeling?unenlightened

    That fits with my impression. But I'm not at all sure I've really understood it properly - which may be framing it wrongly. My impression includes the impression that it is as much a speech act as an emotion.

    I could almost define anxiety as the fear of fear, but I wouldn't defend that if it doesn't fit.unenlightened

    That makes sense.

    I think the standard distinction between anxiety and fear in academic discussion is that anxiety is said to be a mood, rather than an emotion. Part of the difference is supposed to be that anxiety doesn't necessarily have an object, whereas fear does. I tend to think of it as a fear looking for an object. But that's not the whole story. If I'm anxious about rising prices, it's not the same as fearing them. Perhaps because the danger is a possibility/probability rather than real.

    But the verbal dimension compounds this fear through the imagination.Moliere

    That seems perfectly true. But there's a big and difficult problem, compounded by the idea that emotions are introspectible, so that second/third parties have limited authority. Yet we do not accept first person reports as entirely authoritative. This ambiguity fuels the difficulty in understanding the rationality of emotions. The problem is particularly acute when we want to apply the framework of emotion where language is missing (yet the framework of action is at least partially applicable). I'm talking about what some people call embedded beliefs.

    Shades of grey, on the border between categories. Partly empirical, partly conceptual. Hence difficult for philosophy. Nonetheless, important for understanding human beings.

    The fear is still there, of course, otherwise the thrill wouldn't be there.Moliere

    Yes. It is convenient for understanding this that adrenaline supports both fight and flight. Hence the term "adrenaline junkie".
  • Atheist Dogma.


    Thanks. I'm afraid I have a problem - I don't know which thread is your other thread.

    And this primary division persists in every feeling and every judgement being positive or negative.unenlightened

    Isn't there a third possibility? Neither positive nor negative, i.e. irrelevant to me.

    the dis-ease of armchair philosophers rather than rock-climbing philosophersunenlightened

    Perhaps. I would hope that a rock-climbing philosopher would be at least somewhat fearful. It shouldn't be a surprise if there were few anxious people among them. Anxious people will tend to avoid rock-climbing, won't they?

    I fear I'm nit-picking.
  • Atheist Dogma.
    Only that it's curious that it does do so, given how there's so much we do not know (and it can even be fun to not know), and a lot of what we do not know doesn't matter to us, and how even after we know the imagination can continue its anxiety spiral regardless of that desire for knowledge being satiated.Moliere

    Does it help to say that we have to start somewhere, and the things around us that affect us are not a bad place to start? But there's always more to be known, and so anxiety is always a possibility.

    All off-topic to atheist dogma, but I found the topic interesting to continue. Sorry un.Moliere

    You're right. I've taken a lot more interest in this kind of discussion since I read Cavell, especially on the question what lies behind scepticism, since it seems impossible to put it to bed (or, better, the grave.)

    For the modern Humean such stories are thought to be nothing but falsity, but this non-factual understanding is a part of their attraction, I think.Moliere

    We live by and in stories. Arguably, it's the first kind of understanding and even science has one. (It's called history, but it serves the main purpose of orienting us towards life).

    When I was young and knew everything, I was what you call a modern Humean. It took a long time and much actual life to get the point.
  • Atheist Dogma.
    Though have you ever wondered why not knowing makes for anxiety?Moliere

    As a person with a (mild) tendency to anxiety, I have never wondered that.

    What relieves my anxiety is not so much feeling in control as confidence that I can adapt to whatever happens and partly by feeling that most outcomes don't matter much. (Some people think I'm easy-going!) Getting absorbed in philosophy helps - and quite a lot of other things, as well.

    Anxiety gets bad when you speculate on possible outcomes and can't work out what you would do, but feel that you couldn't cope with it. Then a vicious spiral begins and fantasy takes over and things can get bad. I've always believed that many, if not most, people work like that, and failed to understand those for whom it doesn't.

    In support of my feeling, I cite the obsessive discussion of anxiety in existentialist circles and the fact that most living creatures seem to live with it - have you ever watched a bird feeding, the continual pauses for a quick look round? - they are terrified. (Dogs seem mostly over-confident.) Evolution would likely favour a certain level of paranoia.

    So my question is the mirror of yours - how do people who don't get anxious cope with not knowing? Confidence can be soundly based, but nonetheless is liable to failure, so it seems to me that people who don't get anxious are living in denial or under an illusion or myth.

    Which is all off-topic, except perhaps to note that fear seems to rule many apparently confident and arrogant (dogmatic) people.

    Kantian dogma might be that set of beliefs which he thought were contrary to reason but which people believed mostly due to this hunger for something decisive where nothing decisive could be said.Moliere

    Very plausible. As to Kant's emotional life, I've always thought that anecdote about him going for his constitutional walk at exactly the same time every day spoke to obsessive control, which suggests strong and dangerous emotions. Anyway, surely a passion for philosophy and devotion to the pursuit of truth are emotions as well as values - and strong ones at that? (Some would-be rational people need to be reminded of that, IMO.)

    they can be "tamed" to live a certain way.Moliere

    I prefer "balanced", but the crucial bit is the difference from repression and from indulgence. I suppose you know about the motto of the oracle at Delphi - "nothing in excess". Which can itself be overdone, of course. (Never forget about Dionysus - he'll come and get you if you do.)

    Don't we have a kind of understanding of emotions and values through our commitments and emotions we carry? Why do we need to understand these things at all?Moliere

    Yes and no. By which I mean that, as well as provoking and inspiring us, they sometimes puzzle or frighten us. Though, to be honest, I'm not at all sure what "understanding" means. Certainly, knowing about my hormonal system explains nothing, in the relevant sense.
  • Atheist Dogma.
    At least this is another motivation for the game of reasons that lives alongside the cooperative motivations. And the subjective, in relation to that motivation, is a position of vulnerability rather than invulnerability.Moliere

    The flexibility of all this is quite tiresome. Philosophers, at least, regarded the subjective ("introspection") as preferable because they thought it was immune to error - the same reason as their preference for mathematics. Aiming for something objective meant risk to them - something to be avoided at all costs.

    I think reason gets re-expressed and re-interpreted depending upon what we're doing rather than having it act like an arbiter or judge of the reasonable.Moliere

    You're right about that. But people do hunger for something decisive. Not knowing makes for anxiety.

    I think it goes like this : Given fear of death, fear of tigers and poisonous snakes is 'reasonable' in the sense that they are capable of causing death, whereas fear of mice is not. But as Hume famously didn't say, "you can't get an emotion from a fact". Fear of death is not reasonable, merely common. Lay on, Macduff, And damned be him that first cries “Hold! Enough!”unenlightened

    Well, emotions and values are ineradicable from human life (saving certain ideas like Buddhism (nirvana) or Stoicism/Epicureanism (ataraxia)). We need to understand them whatever their status. Human life is a good place to start to identify what's valuable (and therefore to be desired or avoided, loved or hated, feared or welcomed). Where else would be better?

    self-control: a peculiar notion which always feels contradictory to me.Moliere

    To me as well. It's just a manifestation of the preference for hierarchy. I think competing emotions and inability to decide (not necessarily irrational - sometimes there is no rational answer) are enough to explain the phenomena. No need for an arbiter.
  • Atheist Dogma.
    If you don't like it, you can appeal to the mods, whose dogma is final, subject to the terms and conditions of the service provider, that are subject to the various laws of the countries involved, subject to anyone giving enough of a damn to set about enforcement.unenlightened

    I take your point. Life is complicated, isn't it? However, your last clause hints at the basis of success for a dogmatic person. Keep the people quiet, because if they get really riled, you're in trouble.

    Your quotation is an excellent example of the genre. Lao Tzu always steers neatly between the bleedin' obvious and the intriguingly mysterious. Each element is perfectly clear, but why they are arranged like that is completely mysterious. But I guess you quote it because there is a connection with what we're talking about. Power, and its ultimate form, death, is the ultimate weapon of the dogmatist; its limitation is that it only works when people fear death. When people lose their fear of death, the dogmatist can wreak terrible destruction, but will lose in the end. Philosophers seem to hate talking about power in human society and are loath to acknowledge its role.
  • Atheist Dogma.
    Hence one has recourse to dogma: "The referee's decision is final." Or the Supreme Court's, or the Central Committee's, or whatever.unenlightened

    Yes. Maybe this will be merely annoying, but there is a difficulty when we cannot appoint a referee. We look for a substitute - something that will determine the decision. This applies to truth, as in science. We look for facts, or we look to reason - even logic. It doesn't work very well. Hence fact and reason begin to get a bad name. Pity. There's no better authority.

    We can debate the meaning of any word, but only by not debating the meaning of the words we use to debate it. Thus even a debate on the meaning of dogma requires a dogmatic understanding of 'meaning', 'debate' etc. One might say that dogma is the (perhaps temporary) still, fixed point of the mind.unenlightened

    So, paradoxically, we are modifying what "dogma" has meant through most of this thread. Now, we are distinguishing between good dogma and bad dogma. I can live with that. I still reckon I can tell the difference.

    The thing is, "the still, fixed point of the mind" can change status and become the subject of a discussion. That's what preserves us from arbitrary authority.

    My thread, my rules; this is what dogma is, and this is my dogmaunenlightened

    :wink: That seems reasonable and I will defer to your judgement. If I don't like it, I can always go away.
  • Atheist Dogma.
    Surely with dogma, though, there'd have to be a shared other dogma which would allow for a third party to be relevant?Moliere

    That's a problem. If there is to be a discussion, there needs to be shared ground. Wittgenstein would talk about shared practices and ways of life. IMO that's not wrong, but too vague for specific applications. For example, discussing something is a practice (or a collection of practices, since discussions can range from gossip and banter to legal procedures and rules of evidence to academic theories of different kinds). But it would be a start to say that the practice needs to be shared. (A practice does not need to be correct; it's only wrong if it isn't practical, in the sense of enabling the discussion.) In one way, the practice needs to be objective, but we don't really need that - inter-subjective, or at least allowing space for each party, will do.


    But reason speaks differently to different people, and people are motivated by passion before reason so subjectivity has a way of coming back around even as we try our best to adhere to objective reason.Moliere

    Tell me about it!

    Subjectivity has a way of reappearing whenever we think we've got rid of it. Why do we want to get rid of it? Perhaps because objectivity is a way of making interesting discoveries and resolving disagreements, and we put quite a high (but not supreme) value on that.

    There's a sense of reason in which reason "moves nothing", as Aristotle said. Hume's version was, of course, the is/ought distinction. That means, as Hume pointed out, that reason is a slave to the passions - good only for working out means to the ends (values) set by the passions. (Spoiler alert - only in one sense of reason, IMO.)

    In another part of the jungle, the is/ought distinction shows that theoretical reason is not relevant to the passions. But that doesn’t need to mean that they are irrational. There are reasonable fears and unreasonable fears, reasonable joys (winning the race) and unreasonable joys (preventing an opponent from winning the race – unreasonable because it undermines the point of the practice of racing.) (Actually, “reasonable” is useful also in theoretical contexts, when formal conclusive proof is not available.)


    Originally I wanted to have a kind of rule for classifying dogma, but this way of looking isn't really like that. It's probably better that way.Moliere

    Rules are fine - in their place. They are best developed after one understands the relevant practice(s). Sometimes, as in the rules of a game, we have a more or less free hand - what we say goes. But we are nevertheless constrained, if we want people to play the game, by what people find worthwhile and/or amusing. In addition, rules can only be effective if there is agreement on how they are to be applied (i.e. in the context of a practice). It is important to be aware that every rule can (and most likely will, eventually) encounter circumstances in which the appropriate application may be unclear or disputed.
  • Atheist Dogma.
    While NIST is ultimately a maker of subjective definitions, they are inter-subjective and checked and about as good as you can get for those purposes. That's not the same as me claiming this or that brand of peanut butter is better though; we'd call that obviously subjective.Moliere

    I'm sorry, I can't decipher NIST. What does it mean?

    One could claim that one brand of peanut butter is better than another on objective grounds - that it is organic or doesn't use palm oil. Sure, the fact/value distinction would kick in, but the argument about whether those grounds are appropriate is at least not straightforwardly subjective. Whereas making that claim on the ground that "I like it" is quite different; that would be subjective. (But "I like it because it is organic" is different.)

    "Reputable", it seems to me has objective elements, because (in normal use) it would be based on reasonably objective grounds. The question would be about the worth of, for example, relevant social status (relevant professorship or other mark of success).

    So I'm just going to ask the obvious: Did we actually find a description of dogma that three of us are fine with?Moliere

    It looks like it. :grin:

    I accept that if we dig into it, we'll find differences of opinion.
  • Atheist Dogma.
    The assumption seems to be that dogma makes for intolerance, but perhaps intolerance is more related to power, and dogma is simply 'certainty'.unenlightened

    I'm sorry. Those pronouns like "it" are very easy to misunderstand. This version is fine.
  • Atheist Dogma.
    Right... if we have reputable dogma then my dogma is good and their dogma is bad.Moliere

    Well, yes - unless you have a definition of "reputable" that's not subjective.

    Which is succinct and manages to lay out what's meant. I'm understanding better what is meant by dogma at this point.Moliere

    I like unenlightened's first sentence. I don't understand the second.

    dogma makes for intolerance, but perhaps it is more related to power, and dogma is simply 'certainty'.Moliere

    Dogma includes "certainty", in the psychological sense. But psychological certainty is a trap, precisely because it leads to dogma and there's nothing like power for fostering certainty beyond what's reasonable.
  • Atheist Dogma.
    I don't think we should use "truth" here.Metaphysician Undercover

    Quite right. I was careless. I should have said, "I was thinking that if there is some validity in the madcap interpretation, it isn't madcap". However, doesn't "objective" mean capable of unqualified truth or falsity? "Subjective" is more complicated. I think that some people would say "subjective" means not capable of either truth or falsity, while others would say it means "true or false for someone".

    Any claim of such a "general or standard use" will miss out on a whole bunch of non-standard usage which is just as real as that contained by the general description.Metaphysician Undercover

    I didn't say or imply that non-standard uses of a tool are not uses. On the contrary, they clearly are.

    Making such a claim, is just a generalization intended to facilitate some argument. "The standard use of a hammer is to pound nails".Metaphysician Undercover

    Sorry, I wasn't careful enough, again. A normal claw hammer is designed and manufactured for people to pound nails (and to pull them out). (There are other kinds of hammer designed to pound other things.) Most people use their hammers most of the time for the designed purpose - they perform better than most alternatives. I agree that's an empirical generalization.

    You ought not think of meaning as in the head. It's far easier to understand meaning as being in the writing itself, but put there by the author.Metaphysician Undercover

    Well, I understood "in the head" to be metaphorical for "in the mind", which is itself a metaphor. To my mind, so is "in the text". But it is true that the text expresses the author's intention or even is what the author intended to write - curiously even if certain parts/features were not intended, but developed as the text was written. It all gets hideously complicated. I think the rest of that paragraph is OK.

    But I think you may be too restrictive if you are saying that the meaning of a text is limited to what the author intended. I don't see how anything can prevent other people from finding meanings (or quasi-meanings?) in the text which are not misinterpretations but which the author had not noticed. Plato was right - a text does not know who to speak to, but speaks to everyone equally.
  • Atheist Dogma.
    The point though is that I do not want to throw all madcap interpretations in the same trash-heap. As I said, the madman still expresses glimpses of insightful intelligence. And different madmen express different forms of insight. So their interpretations cannot all be classed together.Metaphysician Undercover

    I was thinking that if there is some truth in the madcap interpretation, it isn't madcap. But still, there is the point that interpretations may be mixed. Perhaps all interpretations will be found to be mixed. In any case, perhaps a trash-heap, as such, is not such a good idea. Still, I'll want to know what to spend my time on. Difficult.

    Words are tools, and tools have no general "use", as use is a feature of the particular instance where the tool is put toward a specific purpose.Metaphysician Undercover

    No. Tools do have a general or standard use. It is true that bricolage can develop other uses, which may even become standard, but that doesn't undermine the point. I don't see why a particular view of interpretation should not be adopted in a particular context provided that practitioners are able to work with it.

    Then we have many options in between these two extremes.Metaphysician Undercover

    There's certainly a spectrum of the kind you indicate, and an important difference between "simple" cases and "complex" ones.

    But when we get to philosophy, the intent of the author is not exposed in this way. This is because the intent of the author of philosophy, the author's goal, or objective, is often actually unknown to the author. We can express it in general terms like the desire for truth, or knowledge, or an approach to the unknown.Metaphysician Undercover

    That's exactly what bothers me about the "intent" criterion, and why I can't accept the definition of a speech act in terms of intention. Plus there's the objection that "meanings just ain't in the head" - who was it who coined that?
  • Atheist Dogma.
    A broken watch does not do what it is supposed to do, keep time, a madcap interpretation does what it is supposed to do, provide an understanding of meaning. The madcap interpretation is just different, in the sense of being outside the norm, so to make the analogy good, the watch would not be broken, but giving you the wrong time. In theory there would be a way to "translate" the interpretation, like relativity translates different ways of keeping time, because as a translation it must be ordered in some way and not completely random.Metaphysician Undercover

    "Giving the wrong time" makes some sense. I'm not sure in advance that all madcap interpretations provide an understanding of meaning. On the other hand, I can see that you don't want to rule out radically unorthodox interpretations in advance. Perhaps we should lump all madcap interpretations into the same trash-heap.

    I don't quite understand your last sentence. If it means that all interpretations must be mutually reconcilable, that undermines the point of different interpretations - unless the reconciliation is simply the original text, which all interpretations have in common.

    If it makes sense, it's plausible isn't it?Metaphysician Undercover

    That doesn't seem obviously true to me. Philosophy has produced several theories which, in my view, make sense, but aren't plausible. My dream that I can jump/fly over tall buildings makes sense, but isn't plausible.

    So the ancient person could very well be writing in a way which would appear incoherent to us today. Then the interpreter who tried to put things in coherent terms would be doing a faulty interpretation.Metaphysician Undercover

    Well, as usual, you have a coherent position. Revealing the incoherence of a text on its own terms is a perfectly coherent project. But would you say that Locke anticipated modern physics, or that Berkeley anticipated modern relativity theory?

    But to allow the condition of the modern society to influence how one interprets the intent of the authors would be a mistaken (subjective, because one's personal position would influence the) interpretation. The objective interpretation would be to look solely for the authors' intent, and not allow one's own intent to influence the interpretation.Metaphysician Undercover

    But can we always divine the intent of the author? We can't always discern the intent of even modern authors from the text alone. But I accept that the intent of the author, so far as we can divine it, is always important in interpreting a text. The same applies to the context in which they are written. But if that's the only correct way to read them, I'm left puzzled by the fact that some texts remain relevant long after times have changed, and we continue to read and discuss them. Your approach seems to consign all historical texts to a museum.

    I thought the starting-point of this discussion was the issues around the fact that there's no single authoritative (privileged) interpretation.

    A better example probably is the ongoing discussion around the second amendment in the US constitution, the right to bear arms.Metaphysician Undercover

    You are right, that is a better example.

    Therefore instead of looking to change them it just becomes a question of the intent behind them, and how to apply that same intent today.Metaphysician Undercover

    Fair enough. But the catch is "how to apply that same intent today". That means interpretation in a context the author(s) didn't know about. There's a narrow line there between divining the intent of the author and speculating.

    The objective interpretation would be to look solely for the authors' intent, and not allow one's own intent to influence the interpretation.Metaphysician Undercover

    I'm not well-informed about jurisprudence, but I believe that the Supreme Court in the UK has a rule that the intent of Parliament does not determine the meaning of the Act; it will only consider the words on the page. There's a notion of objective meaning at work there which philosophy would find troublesome, but nonetheless, lawyers seem to be able to work with it, and if meaning is use, that validates the principle, at least in the context of the law.

    dogma as a relationship between beliefs, which would be partially content-dependentMoliere

    I certainly agree that dogma is a relationship between beliefs, in that dogma is in some way protected against refutation, with the implication that other beliefs can go to the wall. But that status is attributed by the believer, so I don't see that I can delineate any content in advance.

    so insisting that space is infinite, for instance, is dogmatic due to the place that "space" fits within the scheme of reason.Moliere

    Yes. Kant is using "dogma" in its traditional, non-rhetorical use. Which is not wrong, just very unusual. One of my problems here is precisely to distinguish "respectable" dogma from the disreputable kind.
  • Atheist Dogma.
    It's a luxury for me to say it, but it still looks to me like religion as such is not the problem, but the social and geopolitical situation in which religious divisions take on greater significance than otherwise.Jamal

    You may well be right. So I will retreat to saying that it depends on the details of the case and I won't argue about what "significance" means. I assume that if the people involved find religion significant in their context, it is significant.

    This is an interesting method for determining dogmatism! It is interesting because the content of beliefs isn't referenced at allMoliere

    I'm glad you find it interesting. Now, I'm interested that you think that the content might be relevant. I never considered the possibility, because you find dogmatists everywhere. Atheists, priests, philosophers, football fans, etc. One could look at the status or role of the belief. But I'm reluctant to call axioms or "hinge" (or similar) propositions dogmas even though they are beyond argument, because they can be evaluated indirectly, through the system that results.
  • Atheist Dogma.
    That's hilariously in character -- Disagree with me? Why, you must not understand!Moliere

    :grin: But seriously... there is another variety of dogmatism, which is not quite the same. It starts from exactly the same response - "you must not understand me" - but does argue, properly at first. But when it becomes apparent that the proposition at stake will not be abandoned (for example, as in ad hoc explanations), the debate is over - unless one can agree on a solution such as a "hinge proposition" or axiom, in which case a solution has been reached. Those solutions are a bit of a problem.

    The key, though, is that proper engagement requires that one put one's own beliefs at stake.

    Why must there be such limits? A madcap interpretation is still an interpretation.Metaphysician Undercover

    That's true. I'm happy to accept that a madcap interpretation is an interpretation, but only in the sense that a broken watch is a watch.

    Incidentally, this is very evident in fiction, one must allow the author to describe the environment, and the reader must allow oneself to be transported to that environment, leaving one's own. In school we start by learning fiction, and it's good practice.Metaphysician Undercover

    You are quite right, of course. But fiction is a particular context. Even so, Aristotle says that a story must be plausible. I think that's too restrictive, yet there's something in it.

    Can I give the same liberty to, say Berkeley's immaterialism/idealism? Assuming that it is a consistent and complete system on its own terms, I could have no objection. Could I object to Putin's interpretation of the history of Eastern Europe?

    Another example (legal in this case) based on ancient memories of "The West Wing". Suppose a country has a constitution written more than 200 years ago. There is a provision that each geographical division of the country should send to the legislative body a number of representatives proportionate to its population. It is taken for granted that women do not count. It is further provided that slaves shall count as a fraction of a person (say 2/5th). Fast forward to the present. It is clear, isn't it, that something must be done. No-one is a slave any more, so perhaps that provision can be simply ignored. The provision about women was so obvious that it was not even mentioned, so perhaps one could simply include women. But it would be safer to delete the slave clause and add a definition of "person". You might not count that as re-interpretation, but it surely demonstrates that it is sometimes necessary to take account of the contemporary context as well as the historical context.

    It's a luxury for me to say it, but it still looks to me like religion as such is not the problem, but the social and geopolitical situation in which religious divisions take on greater significance than otherwise.Jamal

    Basically I agree with you. But the local religion is also part of the social and geopolitical situation. So perhaps it might be more accurate to say that religion is only part of the problem, or one factor in the problem. Or, perhaps still more accurate, that the local interpretation of the religion is a factor in the problem.
  • Atheist Dogma.
    Logic is designed to be context independent, that's the beauty of it.Metaphysician Undercover

    I was pleased that you liked my previous post. I have to say, you have a way of putting things that I simply cannot help responding to. And it seems we are capable of conducting a dialogue. It's not every day that one finds that.

    To the quotation:- Well, yes. But then, it is a context, if a special one. I don't want to get trapped into defending my use of "valid" - which, by the way, has uses in many contexts apart from logic. I was trying to say that not every madcap idea counts as an interpretation. There are limits. The text is flexible, but only up to a point.

    As to your primary and secondary context, I think we need a few more. The author's environment - social, physical, intellectual, etc. - is certainly one context. The readers' environment is another one, and of course that may break down into a number of sub-contexts; it may overlap, to a greater or lesser extent, with the author's environment. Finally, there are the multifarious contexts of posterity. This is relevant because when the text is read in a different context, different questions, issues and priorities may come up and lead to a need for interpretations that go way beyond anything the author could have meant or thought. But still, it is not the case that anything goes.
  • Atheist Dogma.
    Maybe that's the better route towards understanding dogmatism critically.Moliere

    It's a question of one's attitude to others. Subject to the paradox of tolerance and provided tolerance doesn't mean one cannot listen to others and take them seriously, your route seems the only tolerable option.

    I once knew someone who was passionate about the Enlightenment. Unfortunately, he took this to mean that when someone disagreed with his argument, he should repeat the argument. He was perfectly patient, never hostile, but never responded properly. He was dogmatic, but not offensive - just boring.
  • Atheist Dogma.
    I was just making a mild jokeTom Storm

    Point taken.

    Not recognizing a bit of fun when I was talking about the role of fun in philosophy is a bit of a mis-step.

    I tend towards anti-foundationalist skepticism myself.Tom Storm

    I'm very taken with Hume's distinction between excessive scepticism and moderate scepticism. He condemns the former and recommends a dose of ordinary life as a cure, but recommends the latter as the best approach to life, including philosophy.

    PS added later. Hume describes moderate scepticism as "judicious", which I think is splendid and spot on. I couldn't remember it when I wrote the last paragraph.
  • Atheist Dogma.
    I think this evasion or deflection happens in science just as it does in religion.Janus

    That's fair. There's a very fine line between parking the question what burning (as in fire) is when you are an alchemist and don't have the theoretical context to explain the phenomena (which turned up eventually in molecular theory) and dodging the issue, as when Aristotelians ended up characterizing matter as pure potential.
  • Atheist Dogma.
    or more precisely the belief that there is a correct interpretation, which is the incorrect interpretation.Metaphysician Undercover

    Yes. But I don't think that anything goes. "Valid" is the word I think of as correct.

    Validity depends on context. By asking different questions, one sets a context. There's an old question about whether Epicurus anticipated modern atomic theory. For me, the answer is no, since he didn't know modern science and his atoms are very different from ours. Not everyone feels the same way. But I don't argue with them. I just ignore them. Again, some people think that Berkeley anticipated relativity theory. There are striking resemblances and connections, but I think that "anticipated" is far too strong. Our relativity is very different from his.

    The complication comes with "meaning". In ordinary language, we do get involved with what the speaker/author intended; we divine those by the context. If I'm a soldier on parade, the words of command mean (intend when uttered) a precise response. Alternative interpretations are frowned on. Flexibility of interpretation is appropriate in response to the kinds of case that we have been talking about, but that's a different context.
  • Atheist Dogma.
    Yes. I am a reluctant post-modernist.Tom Storm

    That's a pity. You're missing out. The original guys enjoyed it. (The dialogue between Searle and Derrida is a good example.) Part of it was having a sure-fire way of tweaking the lion's tail - where the lion was the orthodox academy. The sense of fun that I found in them was part of the appeal. (I also realized that it must have been part of Socrates' appeal when he revealed his method to his friends. I suspect that it was one of the reasons he lost the trial.)

    once a work is in the public domain, anyone can bring anything to it, put it to any use and make their contribution as important as or more important than the original and turn it into something quite else from what it was intended to be.Vera Mont

    I wouldn't go as far as that. It's probably true to say that one cannot limit in advance what interpretations might be found in a text. But I think there is a distinction between valid and invalid, difficult though it is. Could one find an interpretation of Hamlet that saw him as a man of action? I would take a lot of persuading. I hope I'm not being difficult.

    This experiment demonstrates very clearly that it is possible for an author to not know what one intends to write, when it is written.Metaphysician Undercover

    The experiment does show that a text can have meanings that the author did not intend. So does the practice of improvisation in music. But that's not quite what's at stake - or so I thought. What was at stake was whether a text could have meanings that were not intended, despite the writer having different, even incompatible, intentions - or rather, whether it is legitimate to attribute to the text meanings that the author did not intend. In one way, that is clearly possible, but we often think (in other contexts) that such attributions are misinterpretations. If a teacher says "That's all" because that's all there is to say on the topic in hand, and the class leaves the room, it might well be a misinterpretation if the teacher was merely moving on to the next topic.
  • Atheist Dogma.
    Pointing out that snakes cannot talk in response to a non-literal interpretation of the fall of man really seems to miss the point.Moliere

    It certainly does. If it does anything, it emphasizes a difference in the interpretation of "interpretation". The difficulty is that interpretations sometimes exclude each other - or seem to. They certainly reflect different presuppositions and different interests.

    I suspect two different uses of interpretation here. One is a use in which interpretations do not exclude each other; each is valid or invalid on its own terms. The other is a use in which a rule is applied to a case. (Yes, I'm channelling Wittgenstein). Each application of a rule is an interpretation, so it may be applied in different ways. Sometimes, we can agree that the rule might be applied in different ways; then we seek a "ruling". But if the rule is to have any meaning, we need to be able to say that one way of applying the rule is right and another is wrong.

    It seems to me that the conviction that one has the right, correct, true answer is the source of dogma, and consequently the most pernicious view. I don't think that atheism or religion are necessarily pernicious, it is the conviction that does the harm.

    Yet, if there is any truth to be found in this chaotic world, and even if there is none, one has to take a stand somewhere. How can one do that and avoid becoming dogmatic?

    I might have gone in wanting to say X (and partly achieved that) but what the story really demonstrated is Y.Tom Storm

    That might be a surprise, but, so long as X and Y are compatible, not a problem. Surely it's only a problem if X and Y are not compatible. Your use of "really" suggests that's what you have in mind. That's a situation that post-modernists particularly enjoy(ed).

    My experiences of writing philosophy include the slightly weird experience of finding an argument taking charge and leading me down a path I didn't intend to go down and don't want to go down.

    It's always worth understanding what the author's intentions were (or might have been) and what a text means (or might have meant) to the author's audience (i.e. in the relevant social and cultural context). But sometimes people forget that many texts are read and are important to audiences far beyond their original context. The question of interpreting them in those circumstances must go beyond their origins. Indeed, the problem starts to arise as soon as the text is published.

    (Plato was scathing, in the Phaedrus about written texts for exactly that reason. He ("Socrates") says (from memory) that "they do not know to whom they should talk and when they should be silent.")

    The only answer apologist can give is "God moves in mysterious ways": which is not even close to being morally satisfactory.Janus

    Quite so. That's the classic. When I first started asking awkward questions, I was told that "we don't worry about those questions". That produced the same result. I went and asked the questions where people do ask them - mostly in philosophy, with the obvious result.

    I prefer what scientists do. They file the question under "pending", basically meaning "to be worked out later". That's the undogmatic response.
  • Atheist Dogma.
    Which complicates identifying someone else's dogma even more!Moliere

    Thanks. There's no smoking gun. One sign may be an undue willingness to find other people's opinions dogmatic. Another is undoubtedly avoiding engagement with the opposition's arguments (without good reason). But nothing is simple. On the whole, I prefer to avoid the term. It is used far too often as rhetoric - giving a dog a bad name.
  • Atheist Dogma.
    But this claimed 'loss of freedom' would have to be justified in a global system where all stakeholders can take their basic needs for granted, for free, from cradle to grave.universeness

    There are two problems with this perfectly reasonable idea. Both are already at work in our world. I don't argue that the project is hopeless, only that the dimension of effective enforcement is critical, and that the tension between resolving problems within a legal and democratic framework and the exercise of force is inescapable.

    I live in a country that adopted precisely this principle some 75 years ago. Ever since, nearly everybody has accepted it. But the welfare state has been a battle-ground over the question what "basic needs" are. One party tends to squeeze and erode it, the other tends to support and extend it.

    By the way, the welfare state is not a matter of left vs right or socialism vs capitalism. It began in 1883 when Bismarck introduced the first welfare state legislation in Germany/Prussia. This was no socialist programme. It was implemented by aristocrats who recognized that it was the best way to keep the working classes in line. But perhaps you know that.

    The idea of human rights, articulated and supported by a legal framework, has been a reality ever since 1945. There's an on-going debate about what exactly they should be. But powerful lobbies, religious and political, have never really accepted the idea and they are able to repress demands for their effective implementation over a very large proportion of the world.

    Maybe consensus and acceptance of enforcement will be possible one day. I would love to be around when it happens, but I don't think I will.

    Thanks. By sacrifice I meant the temporary death of Jesus, the 'blood sacrifice'.
    — Tom Storm

    That whole aspect of Christianity has never made any sense to me either.
    Janus

    I agree. But I think it is not just an odd doctrine. It seems to me to be actually immoral to destroy an innocent life in order to escape from guilt (even if the victim volunteers). Once the sin has been committed, nothing can alter that fact. There are various things, practical and symbolic, we can do in order to go on living, but what really amounts to a resolution of the problem is a mystery to me. Time's a great healer, I suppose.
  • Atheist Dogma.
    Notice though, that this ultimate end is not susceptible to rationality, because it cannot be transformed by rationalization into the means for a further end, and this is what is required to make it rational.Metaphysician Undercover

    It seems, after all, that we do have similar aims - escaping from the infinite hierarchy. That has to be promising.

    I'm afraid I find myself a bit confused and lost amid all the messages. I've taken a screenshot of one of your messages which seems to explain what you're after. I shall take some time to read it and think about it.