Must the world be understood in order to be intelligible (able to be understood)? As an analogy, must something be seen in order to be counted as visible? — Janus

In answer to the second question, the short answer is no. To count something as visible, it is only necessary to demonstrate that it is capable of being seen. However, the best, and arguably the only conclusive, way to demonstrate that something is capable of being seen is to see it.
To say 'the universe exists' is actually to say 'this universe exists' and not the others. Why? Because we observe it.

I’m not clear what this means. Presumably, you mean that to say “the universe exists” is to say that this universe exists, but not that any of the others do. Fair enough. I don’t see any implications for mind-dependence or not.
The word 'exists' has its origins to mean 'stands out' which often implies that there is something to which it stands out. Hence it stands out to humans of course, making the world all that is particularly relevant to humans. That makes any asserted existence seemingly pretty mind dependent.

Your etymology is not wrong. But arguments based on etymology are very weak, because words can change their meaning over time. I don’t think “exists” any longer means “stands out” in any sense that is relevant to questions about mind-dependence or not.
I'm just noting that human biases tend to slap on the 'real' label to that which is perceived, and resists slapping that label on other things, making it dependent on that perception.

Well, it is true that if we perceive something, that something usually exists. That’s part of the meaning of “perceive”. But most often that something exists quite independently of the label and quite independently of the perception. So there is nothing to the point here. (When we think we see things that do not exist, we use different language – concepts like mistake, hallucination, delusion and illusion.)
"Principle 1 (The Eleatic Principle): An entity is to be counted as real if and only if it is capable of participating in causal processes." This wording of the principle is almost mind independent except for the 'counted as' part, and I've seen it worded without that.

In principle, this is an interesting criterion, which could work, at least in standard scientific contexts. The original formulation in Plato’s Sophist goes something like “Anything that exists is capable of affecting other things, and capable of being affected by other things.” But it works in favour of mind-independence of anything that it applies to. Your argument to adapt it to show the opposite is very weak, because you admit that there are different ways to formulate it. I’m afraid that, in any case, phrases like “counted as” do not imply mind-dependence, at least as I understand it.
Colyvan quotes Keith Campbell in his paper, who notes a similar thing:

"This search for a criterion for the real must be understood as a search for a criterion for us to count something as real ... There need not be, and probably cannot be, any critical mark of the real itself; the real is what is, period."

A criterion does not normally affect the independence or otherwise of what it is a criterion for. When they changed the criterion for a planet so that Pluto was no longer a planet, Pluto was totally unaffected.

I agree with Campbell's last point. But that’s because philosophers use the word in a very peculiar way, which generates all sorts of fake puzzles. Normal people know perfectly well what the critical marks are of real coins, real diamonds, etc. What philosophers seem unable to stomach is the fact that the criterion for “real” depends on what you are talking about.
Quantum mechanics also contributed to the demise of a nice neat singular classical reality. A third principle to consider is one that QM definitely brings into question.

For my money, the “neat singular classical reality” was always an illusion. Quantum Mechanics, in this respect, was knocking at an open door. But I don’t see anything that clarifies mind-dependence or not.
While nobody would disagree that the mind plays a role in cognition—supplying the conceptual framework, perceptual integration, and interpretive acts by which we know—they would nevertheless retain an innate conviction that there exists, in the background, a world that is fully real and determinate independently of mind. This is what I see as the import of metaphysical realism and that is what I am seeking to challenge. — Wayfarer

You have noticed that I am cautious. That’s true (most of the time). So, with due caution, that looks like something I can accept, apart from deleting the word “fully” in “fully real and determinate”. I don’t know what that commits me to, and suspect it may be a bit rhetorical.
…the very structure of the world, as intelligible and coherent, is constituted in and through the relation to mind. Not an individual mind, of course, but the noetic act—the perceiving, structuring, and meaning-bestowing – that makes any world appear in the first place. — Wayfarer

I’m clear that intelligibility is something that is constituted (“created”?) in the interaction between mind and world. However, our understanding of the world tells us that it has not changed in any radical way since we appeared, and that many of the processes now going on must have been going on long before any sentient or intelligent creatures appeared. So is it not reasonable to infer that the world would have been intelligible if there had been anyone around to understand it? (Note that this is a counter-factual, not a blunt assertion.)
(Relevant to note that the etymology of 'world' is from the old Dutch 'werold', meaning 'time of man'.) — Wayfarer

I love etymology and the history of words (and concepts). It is important in its way, and sometimes is relevant to philosophical understanding. But words change their meanings over time, so the relevance of etymology is always in need of demonstration. I’m afraid that, in this case, I don’t think the etymology is particularly helpful.
But the claim I'm advancing would point out that what “it” was prior to its discovery is not just unknown, but indeterminate. The very notion of “an object that exists but is wholly outside any possible disclosure” is, I suggest, an imaginative construction. It is an extrapolation or projection. — Wayfarer

There is indeed something very odd about the concept of "an object that exists but is wholly outside any possible disclosure".
I’m arguing that the world as a coherent totality is incomprehensible outside the structures of consciousness. It’s not that the mind projects onto a blank slate, nor that it merely filters a pre-existing reality, but rather that reality as it shows up at all is a co-arising: dependent on the mutual implication of mind and world. — Wayfarer

I’m all for co-arising of “reality as it shows up at all”. Reality is a different matter: much of reality has not shown up yet. Yet it is true that we expect our ways of understanding the world as we know it to apply to the bits of the world that we do not yet understand or even know about. If perchance our current ways of understanding the world turn out not to yield what we expect, we work out new ways of understanding – in the process, we are prepared to abandon what seemed to be important parts of our existing understandings, extract whatever we can from the data, and work out new ways of understanding it. So what it would take for us to acknowledge that we do not, and cannot ever, understand some new phenomenon, I cannot imagine. (I’m thinking of quantum physics and relativity, of course. But the Galileo/Newton revolution was, in its way, very dramatic indeed – it’s just that we’ve got used to it.)
…intelligibility is not something we add to a blank canvas but something that arises with, and through, the encounter of mind-and-world. — Wayfarer

Yes, what we know is “bound” to the mind. How else could it be known? But one of the things we know is that there is much that we don’t know; it is reasonable to think that what we don’t know is not “bound” to the mind.
The critique that “the world exists anyway” misses this crucial nuance. Of course, something is there. But to designate it as “the world,” or even as “something,” already presupposes the categories of thought—form, object, existence, and so on. The realist mistake, in my view, is to treat these categories as transparent labels for things that are "there anyway", failing to recognise the way the mind categorises and situates them, without which they would be unintelligible. — Wayfarer

If something is there, it must be part of the “world”. It certainly will be when we find out what it is. On the other hand, “form, object, existence and so on” are certainly not transparent labels (any more than “world” is, especially since recent developments in physics). Anyone who looks carefully can see that. (Philosophers don’t always look very carefully – they are too often in a hurry to get to some huge vista or other.)
Aristotle posits forms as intrinsic to particulars, but in a way that already implies a kind of noetic participation—form is what renders a thing intelligible, it is how we know what it *is*. — Wayfarer

Yes, form is what makes something intelligible. On the other hand, I think that Aristotle calls the form “what it is to be” something (a.k.a. essence) and believes that, whatever it is, it is mind-independent and yet is required if things are to be intelligible.
…my position… is that there is no world at all without mind—not as a subjective opinion, but as the condition for appearance, for disclosure, and for anything we might meaningfully call real. — Wayfarer

Well, yes. We can’t meaningfully call anything real if we don’t exist. That does not justify saying that there is “no world at all without mind”.
And finally, the reason this matters is so we do not lose sight of the subject—the observer—for whom all of this is meaningful in the first place. The scientific, objective view is essentially from the outside: in that picture, we appear as one species among countless others, clinging to a pale blue dot, infinitesimal against the vast panorama that scientific cosmology has revealed. But it is to us that this panorama is real and meaningful. So far as we know, we are the only beings capable of grasping the astounding vistas disclosed by science. Let’s not forget our role in that. — Wayfarer

I agree that the subject, the observer (and, sometimes, intervener) should not be lost sight of, and that the vistas disclosed by science are astounding. You’ll think that I’m a bit of a heathen, but I’m just not convinced that scientific knowledge – still less physics – is the whole of knowledge, or that science has a monopoly of astounding vistas.
For a whole boatload of -isms reflecting the confusion this nonsense brings, see the SEP article. — Mww

That sounds like my cup of tea. But which article, exactly?
Riddle me this: do we seek to know a thing, or do we seek to know the cause of a sensation? — Mww

What if a thing is the cause of a sensation?
I know what a basketball is, but trust me when I say there isn’t and never was any such thing in my head. — Mww

I'm very glad to hear it.
on the other side of a very large coin, why we know things are not entirely dependent on our minds, is because it is not things we know, from which follows nothing of a thing is dependent on our minds. — Mww

It all depends on what you mean by "know".
Havin’ fun yet? — Mww

Well, most philosophy is fun, but some philosophy is more fun than the rest. Unless you are a professional. For a professional, the question is which philosophy gets you paid.
Cognitive disorientation: the empirical kind, a posteriori, and properly reduced, occurs when we say we know what a thing is but we don’t realize it is not the thing but always and only the representation of it, to which such knowledge expression relates. So yes, you, and everyone else, is a victim of it, but it isn’t an experience, as such. It is the mistake of conflating the occurrence of a cognitive method with the post hoc ergo propter hoc expression of its functional terminations.
Some folks like to quip….the universe doesn’t care what the human thinks about it, it is what it is. Compounded categorical errors aside, it is at least consistent to quip that human thought doesn’t care what the universe is. It remains the case that the universe, or, with respect to empirical knowledge, the objects contained in it, can never be comprehended as anything but that of which the human mode of intellectual determinations prescribes. Why these should be considered incompatible with each other, is beyond reason itself. — Mww
And you do have the opportunity to act otherwise. Brains were evolved to make better choices, which wouldn't work at all if there were no choices available. Determinism shouldn't be confused with compulsion, as it often is in these discussions. — noAxioms

Quite so. No-one except Nietzsche seems to have spotted the distinction between compulsion and determinism.
It isn't wild guessing, since the rule needs to be consistent with what we do observe, and the opinions of most people don't meet that criterion, per the OP. — noAxioms

There's something we agree on. I'm offering you some observations that suggest, at least to me, that the question of mind-dependence is much more complicated than you seem prepared to recognize. What's wrong with that?
The unicorn, as a specific case, should of course be 'I don't know'. So an educated estimate might be in order, which is not a wild guess. — noAxioms

If unicorn-like creatures exist anywhere in the universe, their similarity to the unicorns we know and love is entirely coincidental and proves nothing. That argument is a side-issue: that pattern of argument can be used to prove the existence of anything that you can imagine, and it makes the idea of distinguishing between what does and doesn't exist meaningless.
How about a 4 dimensional rock? That's not going to be part of 'the universe', so either you pick a rule that says it doesn't exist, or pick one that doesn't confine existence to 'the universe', or perhaps, 'the universe now'. Once we have a rule, we analyze it for mind dependence, and per my argument, anything that mentions 'the universe' is probably going to be mind dependent, unless one defines universe far more broadly with 'all that exists', in which case one is left wondering if we're part of that. — noAxioms

Three dimensions for space plus one for time makes four dimensions. So all rocks are 4-dimensional. Perhaps you mean 5-dimensional? In which case, you'll have to ask someone else.
2 is 'part of the universe'. You probably put the moon and thermostat in the universe. I consider the universe to be sufficiently large to leave little probability of the absence of a unicorn anywhere. Hence same classification. 3 is trickier since it needs to relate to me, so perhaps the unicorn isn't close enough to do that. — noAxioms

I do put the moon and the thermostat in the universe. I'm less sure about unicorns. They are mythical creatures, so they exist in our universe as myths; but since they are mythical, they do not exist. It's complicated – either answer is justifiable.
As for mind-dependence, we call our universe 'the universe', making it privileged because we see it. That makes it a pretty observer dependent definition of existence. 3 is not observer dependent, but depends on causal relationships. Existence is thus only meaningful within structures that have them. — noAxioms

You may be suffering from delusions of grandeur. Mt. Everest's existence was not caused by the people who climb it or by the people who worship it; and mathematical objects like numbers, it would seem, do not exist at all.
OK, so you don't have a physics background. Makes it harder to discuss relativity and quantum implications to this topic. — noAxioms

Indeed. If there are any. It seems to me that very little is agreed in those fields, so perhaps it is premature to think that any secure conclusions can be derived yet.
It’s rather that, whatever judgements are made about the world, the mind provides the framework within which such judgements are meaningful.

I can accept that. It doesn't mean that the objects that we make judgements about are mind-dependent. That would be confusing the framework with its contents.
What their existence might be outside of any perspective is meaningless and unintelligible, as a matter of both fact and principle.

Yes.
Reality has an inextricably mental aspect, which itself is never revealed in empirical analysis.

I suppose so. Mind you, I'm not happy with the term "Reality". It seems as if it means "everything that's real". But some things can be unreal under one description and real under another. So reality and unreality are inextricably entwined, which makes "Reality" a rather unhelpful term.
By ‘creating reality’, I’m referring to the way the brain receives, organises and integrates cognitive data, along with memory and expectation, so as to generate the unified world–picture within which we situate and orient ourselves.

Well, yes. It's an exciting time in neurology, no doubt about it. But let's not go overboard.
But what we know of its existence is inextricably bound by and to the mind we have, and so, in that sense, reality is not straightforwardly objective.

Well, yes. What we know is "bound by and to the mind we have (are?)". It wouldn't be our knowledge if it were not so. But that doesn't show that the object of our knowledge is "inextricably bound by and to the mind". Indeed, one of the things we know is that many things are not bound to our minds at all.
The idea that things ‘go out of existence’ when not perceived, is simply their ‘imagined non-existence’. In reality, the supposed ‘unperceived object’ neither exists nor does not exist. Nothing whatever can be said about it.

... except, of course, that nothing can be said (or known) about it. But that's an annoying argument, so I won't press it. I think I understand what you mean about the idea that things go out of existence. "Neither exists nor does not exist" must be based on "What their existence might be outside of any perspective is meaningless and unintelligible" (?) There is indeed a certain puzzle about saying whether something that we do not know of at all exists or doesn't. The catch is that this has literally nothing to do with the question of whether it did or not.
We designate it as truly existent, irrespective of and outside any knowledge of it. This gives rise to a kind of cognitive disorientation which underlies many current philosophical conundrums.

Disorientation is a good way of characterizing philosophical problems. But I don't experience that here. Can you tell me more about it?
We mistake the form discovered by the mind as something that is there anyway, not seeing that the mind is the source of it. Kant 101, as I understand him. — Wayfarer

I can't comment on either Kant or Pinter. But it all depends what you mean by "form". I could be wrong, but I am under the impression that Aristotle and many others were quite happy to posit forms as existent in things whether or not anyone knew about them. Even Plato allowed that existing things "participated" in the relevant forms.
The reason why we do not attribute guilt to those who are considered 'not guilty' by reason of insanity is because we do not think they have been able to act otherwise. Their mental state was too compromised. — boundless

Yes - "too compromised" means "not working as it should or normally does". If their mental state was normal, we would hold them responsible. Yet a deterministic account cannot point to any significant difference between those states: compromised state and uncompromised state are all the same to it. So our judgement is made in a different framework or category. In practice, when people are behaving normally and their mental state is not compromised, we do not bother with the causal, deterministic level of explanation. We only pay attention to it when things have gone wrong and those normal explanations don't apply.
Knowledge, you will agree, is mind-dependent. Outside of knowledge of the object, the object neither exists nor doesn't exist. This is elaborated in The Mind Created World, if you're interested in further discussing it. — Wayfarer
No, I'm not also saying that. — noAxioms

Then I'm afraid I don't see what you are getting at.
It absolutely does apply. The justification given for its nonexistence gates whether the chosen stance is valid or not. It's the core point of this whole topic. — noAxioms

I'm afraid I'm lost again.
A quite simple model might say that both exist, the unicorns just being somewhere else where we don't see them. That example shows that there can be a single model that applies to both. Another is that unicorns don't exist, but the moon does. That's likely more popular, but it isn't specified why the model declares unicorns to be nonexistent, so it's incomplete. — noAxioms

This doesn't help me at all.
I've never required us to know about them. This is a model, not proof of existence or not. The topic is not about epistemology. We can't know if the unicorns exist or not, and we certainly can't know if our chosen model is sound or not, but we can at least come up with a valid one. — noAxioms

I don't understand any of the above.
You got it backwards. The general rule is what I'm after. The unicorns end up on one side or the other depending on the rule chosen. Rule first, then assessment of unicorn or whatever. — noAxioms

A general rule would be good. But how can one work that out without looking at specific cases? Rule first is just wild guessing. You'll have to come back to assessment of specific cases after that. So why waste time?
OK. Sounds like the beginnings of a complex model. I would have probably classified moon, unicorn, and thermostat in the same category of either 2: part of this universe, or 3: relational. — noAxioms

I don't understand how 2 or 3 applies to all three, and I don't see how that classification tells me anything about their mind-independence.
You compared my suggestion of a spacetime diagram to a picture of the same subject, presumably from some point of view. — noAxioms

That's not quite what I meant. I have no idea what spacetime would look like, and even less idea what a picture of spacetime would look like. We seem to be agreed that what we actually have is a diagram, not a picture.
Are you saying that we can still 'believe' in free will even if all empirical evidence goes against it because, even if there is no free will, we can't be certain of it? To me that would be self-deception. — boundless

So far as I can see, and I may be wrong, many, if not most, philosophers are compatibilists and are trying to cash that out by re-conceptualizing the problem. To put it another way, the approach is that both traditional free will and traditional determinism are interpretations of the world. If they jointly produce absurdity, we need to think of both differently. Have a look at the Wikipedia article on determinism.
Without that conscious and unconscious process of data reception and synthesis, there would be no world to see. — Wayfarer

I don't understand why you say that. If we did not have the equipment, we would not be able to carry out the process, and so would be unable to see what would still be there.
But that knowledge is still grounded in our 'mind's eye', so to speak - even our knowledge of what it is. — Wayfarer

Of course our knowledge is "grounded in our mind's eye", but that doesn't mean that the things we know about (most of them) would vanish if our mind's eye, or even the eyes in our heads, did not exist. Knowledge is not existence.
Realism neglects the role of the mind in this process. It takes the world as given, without considering the role the mind plays in its construction. That is the context in which the idea of mind dependence or independence is meaningful. — Wayfarer

I don't know what "realism" does. But I do not deny the role of the mind in the process of perceiving the world. I just deny that the world would cease to exist if our minds etc. ceased to exist.
Reality constantly smacks us in the face with a two-by-four with contradictions daily to what our mind wants to believe is true. — Philosophim

I like that. Yes, one of the ways that we can tell what the real world is, is by the way it smacks us in the face if we do not pay enough attention to it.
What we know is clear: There is a world independent of our own minds. — Philosophim

Basically, I do agree with you. I even agree with you that we usually have not grasped what we know accurately. But I think there are things in our world that are "mind-dependent" as well as those that are "mind-independent". The difference matters, because it maps the limits of what we can change. It would be a useful piece of philosophical work to chart the difference, if only we could set aside our hunger for grand generalizations.
But I said 'share the same ontology' without saying what that ontology is. I also somewhat misspoke, since a presentist would say the moon 'is' while the Theia event (where the moon is created) 'was', a different ontology. — noAxioms

Are you also saying that there is no connection between those two facts?
OK, so pick something that doesn't exist, and justify that. Or pick something that exists outside of experience, and justify that. That's what I'm looking for in this topic: Somebody who can come up with a consistent model of mind-independent existence. — noAxioms

I don't see that things that don't exist are relevant here. Mind-independence doesn't apply to them.
But when pressed, it seems that everybody's limits of what exists or doesn't relies on things gleaned through observation. — noAxioms

How could we possibly know about things that exist independently of our minds without observation? The role of the senses is precisely to give us information about the world outside or beyond our minds.
You're missing the point. ... I'm not trying to argue that unicorns exist (or don't). I'm trying to argue that your notion of what exists is a mind-dependent one. — noAxioms

I don't see why you would think that what I would say about the existence of unicorns can be generalized to everything that exists. The speculative argument does not bring anything into existence, so it is no ground for thinking that anything is mind-dependent. However, I do agree that notions and concepts and ideas are (mostly) mind-dependent. But it does not follow that the objects of notions and concepts and ideas are necessarily mind-dependent. The moon is a case in point: we have an idea of something that exists quite independently of human beings.
Definition 4 totally discards truth value. 2 can have a truth value even if it's a relative truth. 2 boils down to [is a member of a preferred set, and members of other sets don't matter]. — noAxioms

You're right. I made the mistake of picking the criterion that seemed closest to what I think. I don't really think that there is a general definition of existence. What existence means depends on the kind of object you are talking about. So there is one criterion for the moon existing and a different one for unicorns existing; the criteria for thermostats are different again. The criteria for existence are truth-conditions, so are not themselves true or false.
Funny then that I find the picture less like reality and more like an abstract interpretation. — noAxioms

I'm a bit puzzled about what you mean by "the picture" here.
Bell's theorem (and not just 'theory') demonstrated the impossibility of local reality almost 60 years ago. — noAxioms

↪Ludwig V
It was the 2022 Nobel Prize in Physics which was awarded to the experimentalists who proved it. (I wrote an article on it for anyone interested.) — Wayfarer

Thanks for that. I did look at it, and it was interesting. But I'm simply not competent to comment.
Of course, ethics is something external to physics. But I would like my 'worldview' to be something coherent, a stable unit. It is difficult to 'believe' half of the time that I have free will, because I have to assume it to have a coherent concept of moral responsibility, and in the other half 'believe' that I have no free will. Cognitive dissonance is quite a risk. — boundless

This is a bit of a distraction. However, let me say that I think that most philosophers do actually decide to live with the dissonance. Perhaps they actually prefer the argument, and would be disappointed if they couldn't have it.
Is it not surprising and disappointing that we still don't have words or phrases for such common things, and can only say things like "mythical creatures (Pegasus, the Gorgons, etc.) exist and not in the way that horses exist"? — Patterner

It may be that I/we are stretching the language. There's not a lot of popular interest in the modes of existence - even, I suspect, among philosophers.
What if Y doesn't happen in the future? An uncountable number of things that had been "sure bets" never happened. How can Y be real in the sense that either X or Z are real? — Patterner

Yes, it is tempting to treat the future differently from the present or past. Perhaps the ground is that the future is undetermined, while the present and past are determinate and can't change. But one needs to show that this is quite different from the generalized uncertainty that would point out that our belief in X or Z is also defeasible. The determinism, whether logical or causal, will chip in to demolish us completely.
Yes, that's an ontological claim, and of mind-independence. That part is easy, and quite common. The challenge is with where it ends. My topic is about if your opinion is self-consistent, because few think about it further than opinions about what is seen. This is why the moon doesn't matter. — noAxioms

I think I understand that. So "unicorn" is not an irrelevant example. I like it just because it is not straightforward, but requires some thought. That's much more instructive than the moon.
Poorly worded on my part. Typical claim is that "I know the moon exists due to empirical evidence". It's an epistemic claim about ontology, but not directly an ontic claim. — noAxioms

OK.
That's a description of how it was created and already assumes the moon shares the same ontology as those solar system events long ago. — noAxioms

Well, yes. It is an object in the solar system, so it seems a reasonable assumption. Any question about that is pretty much incomprehensible to me.
Imagining something presumably isn't what makes it not real. Again, I'm not talking about the concept of something, but about the thing itself. I have an imagined image of the moon, what it's like up there, which doesn't make the moon nonexistent. — noAxioms

You are right. It is curious that we talk of the imaginary friends that some small children have, meaning that they do not exist. Yet it is perfectly possible to imagine something that is real - such as a friend who is absent. My grounds are precisely the ones that you were reaching for - improbability or impossibility. Those grounds are defeasible, but the implausibility of the idea means that it would not be easy to convince me of the opposite - especially in this age of deep fakes!
As to the distinction between "exists" and "is real", I had assumed that anything that exists is real — Ludwig V

Sorry. I meant to explain that.
Contradicting your prior quote: "For me, unicorns exist, all right. But they are not real creatures." — noAxioms
Different definition of 'real' there. We're discussing ontology, not 'being genuine'. — noAxioms

True. I get a bit confused by "mind-independent reality", which, pretty clearly, is about existence.
I've seen whole topics devoted to the latter: "My signature is not mine since it was made by a pen, not by me". Games like that. — noAxioms

If I write something like that, you can be pretty sure it is a joke.
To be a unicorn, all it needs is to be sort of horsey-like with a single horn on its head. There's no requirement to correspond exactly to the human myth .... I don't like the unicorn example because it is so improbable that there is not a planet in the infinite universe somewhere that has produced them. ..... — noAxioms

You did cite unicorns in your earlier post. It is true that my disbelief in them is defeasible. (Most claims about non-existence are.) But your argument is wildly speculative and does not even begin to convince me. Until there is better evidence, I shall continue to classify them as mythical and claim they don't exist, except in the way that mythical creatures (Pegasus, the Gorgons, etc.) exist and not in the way that horses exist.
I see your point. Compare "imaginary". My reply is the same.So again, just because there's a myth about it, why does that preclude the reality of one? It's like you're saying that the myth causes its nonexistence. — noAxioms
Thinking about it, I'm really not content to say that past events and future events don't exist. It makes sense to say that all events, past, future and present exist, but in different modes. "X event happened in the past", "Y event will happen in the future", and "Z event is happening now" are all true and all those events are real, hence exist. So I don't accept 2p.There are many definitions, rarely clarified when the word is used. Some examples:- ..... There are other definitions, but that's a taste. Your intuitions seem to lean heavily towards 2p . I favor the relational definition most often since it is far more compatible with quantum mechanics. I've been exploring the 4th one. — noAxioms
I don't think you've got that quite right. Surely, the data are also part of reality? Also, on the face of it, it looks as if you are saying that reality is not (directly) observed, so your problem disappears. I'm not sure about reality, but I'm pretty sure that what counts as real depends on the context. "Real money", "Real food", "Real champagne" all have different definitions.Reality is an interpretation of empirical data. I want to say this is a mind-dependent definition, but it might be too hasty. The apple exists not because it is observed, but its observation suggests an interpretation of reality that includes that apple. Fair enough, but it doesn't say how the interpretation deals with things not observed, and this topic is mostly about that. — noAxioms
That's odd. There must be a story about that.A recent Nobel prize in physics was given for proving this again, despite Bell doing it in the 60's. — noAxioms
Thanks very much for that. It was very helpful.Proving that reality is not locally real means that at most one of the two above principles is true. An example that rejects both principles is objective collapse interpretations. — noAxioms
Yes, I've gathered that modern physics seems to have become something that Bishop Berkeley would have approved of, - apart from the refusal to include God. But there also seems to be very little consensus.This is all quite relevant to the topic, because under most interpretations, the moon is not objectively real, but only real to that which has measured it, which usually means anything that has in any way interacted with it by say receiving a photon emitted by the moon. — noAxioms
It's an excellent topic.Well the fact that you reacted to a comment 500 posts in means you've been paying attention to this topic, and I must thank you for that and for your contribution. — noAxioms
It might well. The variations will be very instructive.I don't think my criteria matter at all. It's something that should be explicitly specified by anybody that claims it (sc. mind-independent existence), so it might vary from one view to the next. — noAxioms
If I believe that the moon exists independently of what I, or anyone else, thinks about it, is that an ontological claim? If so, the mere fact that we categorize or classify something in some way, in my view, is no ground for claiming that it is mind-dependent, though the classification obviously is.Since I consider ontology to be a mental categorization, there's nothing mind independent about it. I'm not asserting that the others are wrong, but I'm trying to explore the consistency of such a view. — noAxioms
I would not dream of claiming that the moon is real because of empirical evidence, because that is not true. The moon exists because of complex events in the solar system, some billions of years ago. We know it exists because of empirical evidence, but that is an entirely different matter."The moon is real because of empirical evidence". Presumably the moon's existence (relative to this planet) is not dependent on humans (or any life forms) observing it, and yet it's existence is justified by observation. I challenge that logic, but to do so, I need to find somebody who supports it. — noAxioms
I don't quite see your point. We can agree that your birds do not exist. But, since you have imagined them, they are imaginary birds, and consequently not real birds, and not real. They don't seem at all problematic. That makes them different from mythical creatures. Mythical creatures such as unicorns have an additional feature. Why would we ignore that?OK, so you draw a distinction between 'exists' and 'is real'. As a mythical creature, it is a common referent. People know what you're talking about, but it seems no more than a concept of a thing, not a thing in itself. I am not talking about the concept of anything, but about the actual thing, so perhaps I should say 'is it real?', or better, come up with an example that is not a common referent such as a bird with 7 wings, all left ones. That at least eliminates it existing as mythology. But instead let's just assume I'm talking about a unicorn and not the concept or myth of one — noAxioms
I must confess that I don't have a firm view about presentism and eternalism. We seem to have a difference in our understanding of "exist". I wouldn't dream of saying that dinosaurs exist in the sense of being alive. I accept that dinosaurs exist in the sense that their remains are still to be found in various places. On the other hand, I do maintain that they did not exist before they evolved in the Triassic period.Your definition of 'exists' seems to be confined to 'exists at some preferred moment in time', which implies presentism, and only membership in this universe. I consider a live T-Rex to exist since I consider 75 MY prior to my presence to be part of our universe. The notion of 'cease to exist' makes little sense to me. I also don't confine existence to our universe which is why I call it 'our' universe instead of 'the' universe. I find presentism to be a heavily mind dependent view. Just saying... — noAxioms
The bottom line, then, is that the answer depends on your definition of "exist" and "real".The bottom line should be an answer to my question. Do real unicorns (not the myth) exist or not, and how might that answer be justified? Perhaps unicorns are again a bad example of mind-independence because they presumably implement mental processes of their own. Perhaps we should discuss some questionable inanimate entity. — noAxioms
Yes, that's true. I wouldn't hesitate to call either of those cases thermostats, because in each case, they are part of a living system or part of a living being. On the other hand, if we found an inanimate system that included a feedback loop that tended to maintain itself in a steady state, I would hesitate to call it a thermostat, but probably come down on the side of doing so, on the grounds that it is at least analogous to what we now call a thermostat.So an alien-made device on a planet out of our access cannot be called a thermostat by us? How about the temperature regulatory systems that the first warm-blooded animals evolved? Both those are sans-human-context. — noAxioms
"Real" is more complicated that "red" or "large". Many, if not all, objects can be classified in several ways, according to context and point of view. Things can be real under one designation and not real under another. As to reality as philosophers debate it, I don't really understand what they are talking about - unless they mean real things in general. But since what is real depends on how it is described, that doesn't mean very much to me. "Real" does not mean "Ideal". On the contrary, the real is quite often opposed to the ideal.Reality is an interpretation of empirical data. That's what I'm calling an interpretation here. People interpret that data differently, so there's all these different opinions of what is real. If being real is no more than an ideal (a mental designation), then there's no truth to the matter. — noAxioms
Could you please enlighten me - What is "local realism"?Yes, local realism has been falsified. — noAxioms
I must confess, when I've come across that argument, I haven't found it particularly interesting. So I'm not disappointed by that conclusion.Having said that, and having floated the idea that ontology is a mental designation, it would seem to follow that presentism and eternalism are the same thing, just interpreted differently, an abstract different choice without any truth behind it. I hadn't realized that until now. — noAxioms
I agree. The interesting part is which items qualify as mind-independent and under what criteria.As first responder herein, I admitted to unabashedly supporting mind-independent reality, which makes explicit something that is, and is necessarily, regardless of what I think about it. — Mww
I'm a bit puzzled by this. Why can't it be both?But is 'temperature' a property of things outside our conceptual categories or is a concept we introduced to make sense of our experience? — boundless
I think it helps. I don't think there is much missing in the physical explanation of a rainbow. A rainbow, understood as we perceive it and conceive of it, is one language-game or practice. However, the same - what shall I call it? - phenomenon understood in physical terms, is another.Perhaps 'understood in its own terms' was what I meant. — Wayfarer
Well, I guess that's an opening for me to chip in. I do have a problem, however, that I haven't got my head around what the criteria are for mind-independent existence. But I can explain what I understand about unicorns. Perhaps that will help.the question of this topic is not about the moon, but about the unicorn. If the unicorn exists, why? If it doesn't, why? Most say it doesn't, due to lack of empirical evidence, but if empirical evidence is a mind-dependent criteria. Sans mind, there is no empirical evidence to be considered.
— noAxioms
Here we are 500 posts in, and I don't think this has been answered. Lack of it is why I suggest that nobody really supports mind independent existence. — noAxioms
Perhaps we should resist the equation of explaining something with reducing it. Physics can only explain things in certain terms. We live with things in different terms. But it's a matter of point of view - context and use - not a metaphysical problem - unless we choose to make it so.Big 'if'. If mind (or life, or intelligence) is truly not reducible, then it's also not really explainable in other terms. — Wayfarer
There are definitely strange things going on in physics. I don't pretend to understand them, or even like them, but I suspect it will look very different in the future and our current obsessions will begin to seem as antiquated as Aristotle.But my current philosophy/physics book, by James B. Glattfelder*1 inadvertently raised an economic issue that also has political-philosophical significance. — Gnomon
The feudal system was developed in a much simpler society, in which money played a much smaller part than it does in ours. It was more concerned to regulate brute power rather than financial power. The development of international (in fact global) trade and of technology changed all that. I don't think there's any going back, though the fact that inheritance of money, or at least of the advantages gained for children through money, is so much more important than it was does re-introduce an important element of the feudal system.*4. The feudal system — Gnomon
That's not wrong. Money equals control of resources in a functioning political system. One of the primary duties of a government is to ensure that is maintained. However, money is also a power base that is an alternative to the vote, and is perfectly capable of subverting it. That's why distribution of financial resources is not just an ethical question, but a political one.Money equals power; Power makes Law; Law makes Government
You are quite right about the myth of democracy. It has been incredibly damaging and arguably led us to the crisis that we are now facing.There is a myth about democracy which sees it as both good and natural. With this myth comes ideas which say that non-democratic rule in Russia or China or the United States is an aberration, and that any deviation from pure democracy must be bad. Historically speaking, these are not aberrations at all. In fact democracy is historically recognized to be a rather dubious form of government, which is why even before the constitutional convention we had significant checks on democratic forms. — Leontiskos
That I disagree with - although I don't know why Aristotle thought that, so I could be persuaded. I think the outcomes are what matters. After all, almost every regime is based on pure force. Though if you start with a selfish dictatorship, it does seem unlikely that the regime will turn into a benevolent government any time soon.One of Aristotle's many contributions is that the goodness or badness of any particular regime must always be judged relative to where it began. This is true regardless of one's regime hierarchy. — Leontiskos
That optimism is a major cause of our problems now. That's why I think that revolution is, of itself, a Bad Idea. Reform is more likely to succeed.Even if someone thought that democracy was the greatest thing since sliced bread, it would nevertheless remain true that the Soviet Union cannot be expected to shift from communism to democracy in the blink of an eye, and that the fall of the Berlin wall is not necessarily teleologically oriented towards a democratic regime. — Leontiskos
I don't disagree with that. The problems with conscription are partly ethical and partly practical. So conscription even of adults is a step over the line. Conscription of children is worse than conscription of adults. All I'm saying is that in time of war, ethics often comes under pressure and people often step over the line rather than lose. Perhaps they may justify it as the lesser of two evils - and others may well disagree.Well the notion of in extremis is a central part of ethics, and I don't see why one couldn't be ethically prepared to accept conscription while at the same time being ethically unprepared to accept the conscription of children. — Leontiskos
I don't think conscription is OK. Period. Nobody likes it, not even the army. If you have to force someone to join the army (or navy, air force, whatever) they are somewhat unlikely to make good soldiers, beyond getting lined up to be shot at. But it is a fact of life.I guess conscription is different if we think it is okay to conscript children, but I don't think that. It seems as though conscription also entails adulthood. — Leontiskos
Yes. Good point. Thank you.strategic tolerance is motivated also by a perceived common ideological enemy: Christianity and Capitalism can ally against Communism, progressive socialism and conservative nationalism can ally against Capitalism, etc. — neomac
Yes, indeed. I suspect that motive is very much present in this case.I just see some moves made in politics as being about gaining immediate votes rather than creating a better system. — I like sushi
Yes. Getting those in power to vote for something that will make their lives more difficult is not easy. IMO, in 2010 the Libdems, once they were in coalition, realized that they might one day get power without PR. They accepted a feeble compromise rather than put their power-sharing deal on the line.It was one time where Labour and The Conservatives joined forces as it was mutually beneficial for them to keep the current system. — I like sushi
Yes. They both make a lot of sense to me.Here again we find the issue that both Popper and Berlin talked about, — I like sushi
One reason I didn't much like that reform was precisely because of the slippery slope. But that works both ways. I don't see a good reason for not raising the age of majority to 25, for all the reasons that you give for not reducing it to 16. Impossible in practice, I know. On the other hand, I don't think it matters very much, so long as there is consensus, or at least acquiescence, and the system works reasonably well.Would be better return to 1969 where the minimum age was 21 imo. — I like sushi
Infants don't have a lot of power. But they don't hesitate to use what they do have, in my experience. Children are always pushing at the boundaries. Just like adults.Infants do not question or ask, they simply live according to their biological requirements and remain largely passive. — I like sushi
Yes, it is, if you are thinking of volunteering. It's a life-and-death decision. Conscription is different. There's an ambivalence here between the soldiers as heroic defenders laying their lives on the line and soldiers as cannon-fodder.The argument that military service entails adulthood is very strong. — Leontiskos
Recognizing another ideology as disagreeing with one's own means recognizing (often at the same time as denying) that the other side are also human beings. In a rational world, that should be a basis for working out how to co-exist. But I realize that's somewhat idealistic.Is true coexistence between ideologies even possible when they’re wired to see each other as threats to their own legitimacy? Maybe the real obstacle isn’t just disagreement—but the fact that many ideologies survive by creating an ‘us vs. them’ narrative. If you’re only making room for another worldview because you think yours will still win, is that coexistence... or just strategic tolerance? — Alonsoaceves
Yes. I know. But I thought it was a theoretical discussion.Policy wise I don't think so. Voting exams are bad news. — fdrake
I didn't suggest assuming anything. On the contrary, I suggested evaluating the information and making a decision on that basis. I'm also suggesting that if you are so worried about 16-year-olds voting on the wrong criteria, you look at all the other voters who do the same thing.It is unreasonable to assume something when there is plenty of hard scientific evidence showing how adolescent brains are far less risk averse, immature in term of planning, managing emotions and delay gratification. — I like sushi
Do you mean that someone will have to write these tests of competence - with the issue that a miracle of dispassionate objectivity would be needed? The history of tests of voter competence is, how shall I put it, compromised."Yesterday I didn't know there was a curriculum, and today I'm writing it". — Banno
Is that because they can't, or because we don't ask them to?Someone who's 14 is not expected to analyse literature, write a discursive essay, or read and interpret a graph though. — fdrake
Wouldn't it make more sense to test for what you are looking for? Awareness and balanced judgement of public affairs. Such tests as these can't give us what we want. They can't provide an objective, impartial, accurate qualification for voting. It has to be fully automatic and undoubtedly will be rough and ready.I'm not saying that you ought to be able to do these things to vote — fdrake
That seems reasonable. But once you have set that criterion, doesn't elementary justice mean that it should be applied to voters of all ages?As for senile dementia, I see no reason they should still be able to vote. — I like sushi
Sure. But the question is whether that difference makes a difference. Given that the system is very rough and ready, it doesn't seem unreasonable to me to think that it does not. Intellectually, we're on a slippery slope and political views are, of course, in play.There is a big difference between 16 and 18 yrs of age. — I like sushi
If I've got it right, the prefrontal cortex doesn't stop developing until around 25. So that ship sailed long, long ago.The prefrontal cortex needs to develop. This is not something we can simply dismiss. — I like sushi
Enough said, I think.Many forms of Government have been tried, and will be tried in this world of sin and woe. No one pretends that democracy is perfect or all-wise. Indeed it has been said that democracy is the worst form of Government except for all those other forms that have been tried from time to time.… — Churchill
Quite. I can't see that they will wreck the overall result.Let them vote it'll be good for them. — fdrake
That's been the classic argument against democracy since the Athenian expedition to Sicily 415-413 BCE. Plato builds a political philosophy around it.If we create a poltical body that is increasingly dependent upon the short-term whim of inexperienced minds - who are biologically driven by a myopic perspective - then I fear for the long-term future. — I like sushi
Yes. People do seem to focus on the fact that there has not been a world war since 1944. Whether it is appropriate, on that ground, to call the last 80 years a period of relative harmony and peace is not obvious to me. But I do agree that we seem to be in a particularly critical and unstable time. We live in interesting times, unfortunately.Either way, our intuitions will lead us on more than our knowledge. When they meet each other then we have a period of relative harmony and peace (like now). — I like sushi
This reminds me of Aristotle's practical syllogism, which is supposed to give a structure that applies to all actions whatever. In a way, it does, in the sense that you can shoe-horn actions into the formula. The same applies to Aristotle's syllogism, which was thought, for a long time, to give the structure of all arguments. What in fact happened was that arguments were shoe-horned into that structure, which was not particularly helpful. What tells you that the betting structure applies to all actions? The fact that you can shoe-horn things into the structure is not enough.The betting structure gives us a way of understanding what a belief and preference amount to, using just behaviour. — Banno
I'm interested in the limitation. Can you give me an example of an inappropriate use? Do you mean that in the inappropriate uses, better does not entail worst and best?Better entails worst and best, in itself, by definition, in every appropriate use. We need that to be the case, to use “better” at all. — Fire Ologist
"Consistent" and "Coherent" only apply to a number of elements that relate to each other - that is, to a system. "Inconsistent" and "incoherent" mean "not systematic".So good philosophy can completely forego the devotion to “ identifying and clarifying consistent/inconsistent and coherent/incoherent relations internal to systems/models”? — Fire Ologist
You put the difference very neatly. Only, I didn't intend it as a criticism, but as an analysis.Ramsey offers a minimal account of the nature of belief, while the Bayesian account assigns a value to a belief without specifying what that belief might be. Ramsey gives an account of belief’s nature; Bayesianism gives a rule for belief’s revision. — Banno
Well done! I found a copy of the chapter on some obscure web-site, but couldn't find any attribution - which was a little frustrating.Got it: — Moliere
Yes. You see how your thinking is conditioned by risk and reward in relation to your resources. Yes, of course, it is a non-standard, even contrarian, decision, but nonetheless, the amount you will bet is not an index of your belief, but the result of several interacting factors. To find the strength of belief, you have to work through all those factors.I'll bet the same against you, on the odds that it doesn't -- given I have nothing and I could win on the bluff I might as well. — Moliere
I like the twist that events will take you back to philosophy, because there has to be agreement on the outcome.I distrust betting on the whole. It's a test of who is right and who is wrong -- so I can persuade a person to bet against that the LNC* is false in at least one circumstance, and then provide the argument from the liar's sentence (which will certainly not persuade), and we'd be right back doing philosophy again rather than betting. — Moliere
I'm sorry. I don't remember what the buy-in was.That's what I meant to imply by the 1 million dollar buy in before. — Moliere
I'm sorry. I don't see what you are getting at.The philosophical move is from the action representing the belief to the action constituting the belief. — Banno
I can see the link between the two. But I don't see how that fits with what @Banno says.So, bets, promises, posts on one hand and paying up, following through, and reading on the other. — Moliere
That's a rather charitable interpretation of "forced".I don't think Zizek is denying the possibility of changing views. He just remarks how painful it can often be and how often that this change can hardly come by our own intellectual initiative. — neomac
Perhaps I just don't understand the situation very well.I don't think however that any of such considerations clarify the nature of ideological thinking and how it epistemically compels us. — neomac
Well, gambling was important in the development of probability theory from the beginning. So it's no surprise that it crops up here. More than that, it's true that people do sometimes challenge a claim that they disagree with by suggesting the proposer puts their money where their mouth is. But I'm irritated that, in this context, people talk as if the size of the bet is somehow an index of the strength of the belief. Outside of artificial situations in labs, that's just not the case. A bet is a balance between risk and reward assessed in the context of the degree of confidence and in the wider context of the bet.The betting structure gives us a way of understanding what a belief and preference amount to, using just behaviour. — Banno
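To separate the two things being run together here: in Ramsey's setup it is the odds a person will just accept, not the size of their stake, that operationalize degree of belief; the stake then reflects resources and risk appetite. A toy sketch of that distinction (my own illustrative numbers, not drawn from Ramsey):

```python
def implied_probability(stake: float, payout: float) -> float:
    """Degree of belief implied by the odds a bettor will just accept:
    risking `stake` to win `payout` is a fair bet exactly when
    p = stake / (stake + payout)."""
    return stake / (stake + payout)

# Two bettors with the same 0.25 degree of belief accept the same odds
# (3-to-1) while staking very different amounts: stake size tracks
# resources and risk appetite, not strength of belief.
cautious = implied_probability(1, 3)      # stakes 1 to win 3
wealthy = implied_probability(100, 300)   # stakes 100 to win 300
assert cautious == wealthy == 0.25
```

On this picture the absolute size of the bet drops out, which fits the point that the amount wagered alone is not an index of belief.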
I'll look forward to that.This needs a good example. I'll work on it. — Banno
It's a good point. Yet arguments do fly back and forth between ideologies, even though in principle they do not recognize how radical the break is at this level. However, here's a problem. If a given conceptual scheme is incommensurable with another, not even opposition or rejection are really possible.Common to Wittgenstein’s forms of life and hinges , Heidegger’s worldviews, Foucault’s epistemes and Kuhn’s paradigms is a rejection of the idea that social formations of knowledge progress via refutation. It sounds like your critique of ideology is from the right, which places it as a pre-Hegelian traditionalist thinking. — Joshs
Yes. My only qualification is that the practices are likely not only to be based on ideological positions, but will also tend to re-inforce, even enforce, them.On the contrary, scientific, legal, professional reporting practices presuppose supporting ideologies for such practices to thrive and inform social life. Indeed, all these procedures can as well be compromised by ideological struggles. — neomac
Well, people do change their ideological stance from time to time. We're more or less committed to the view that standard rationality does not apply at this level. So the question becomes, what approaches and factors actually work? And, crucially, can we distinguish between fair and unfair ways of doing this. I suspect that, in the end, it will be a matter of teaching and allowing the persuadee to absorb and reflect on what they learn. (Very roughly).Why are these the only two options? Why couldn't I teach someone a different way of looking at world, the way which grounds my own arguments and facts, so that they can understand the basis of my criteria of justification? It would not be a question of justifying the worldview I convert them to, but of allowing them to justify the arguments and views that are made intelligible from within that worldview. — Joshs
Strictly speaking, in my view, it is not really appropriate to call an ideology irrational, because usual standards of rationality do not apply between ideologies. There's also the point that it is misleading to dismiss one's ideological opponents as irrational - unless one is happy to accept that one's own ideology is irrational.The link between “necessity” and “irrationality” of ideological thinking as discussed in the opening post, and distances itself from more psychological understanding of ideologies (evil intentions, stupidity, comforting delusions) which I find rather misleading (if not even, ideologically motivated!). — neomac
Yes. That's how philosophers will need to think about it. But there's more than thinking involved in ideology. Praxis is also very important in understanding what it means.So ideology is the most basic form of coordination for social grouping to support a given informational flow within a society and political mobilisation. — neomac
Zizek is wrong. Some American PoW's in the Korean War switched sides. I've seen one interview (which doesn't make a summer, I accept, but..) in which an American ex-PoW switched sides because he came to see American ideology through Chinese eyes - no force was required. The fundamental point was that the Chinese treated him better than the Americans. There's more to the story, of course, and I'm sure Google will find it for you if you want. But I don't accept what Zizek is saying. Seeing through one's own ideology is not easy, but it can be done.Zizek, in that video, is giving a psychological explanation for why liberation from one own’s ideology needs to be forced on people — neomac
Do we have a disagreement? — Banno
The key word there is "revising".We have it from Ramsey and others that there are solid statistical methods for comparing and revising various beliefs, and we agree that these are A Good Thing. — Banno
Help with consistency is always a good idea. Dropping induction, I fear, may be more difficult. Pavlovian conditioning works at levels beyond the reach of voluntary control.Better to drop induction all together and instead look at how a bit of maths can help show us if our beliefs - held for whatever reason, or no reason at all - are consistent. — Banno
I prefer this Humean explanation. But I thought that since the fifties and sixties, we had all given up worrying about the deductive invalidity of induction. Why are we revisiting the past? I'm sure the Bayes process has its place, but I don't really see why induction needs to be replaced or even can be replaced. There is one thing the Bayes process can do that cannot be done any other way - it can give us some help in dealing with one-off probabilities. (Not even induction can do that!)Instead of seeking justification for induction, he explains how we act as if inductive reasoning were valid. — Banno
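For what it's worth, the one-off case can be made concrete. A single Bayesian update needs no repeated trials, only a prior and likelihoods. The numbers below are invented purely for illustration:

```python
# Bayes' rule for a one-off event: P(H|E) = P(E|H) * P(H) / P(E)
prior = 0.1                  # initial credence in hypothesis H
p_e_given_h = 0.9            # how likely the evidence is if H is true
p_e_given_not_h = 0.2        # how likely the evidence is if H is false

# Total probability of seeing the evidence at all
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)

# Updated credence after one observation of the evidence
posterior = p_e_given_h * prior / p_e
assert abs(posterior - 1/3) < 1e-9   # credence rises from 0.1 to ~0.33
```

A frequency-based induction has nothing to grip here, since the event need never recur; the update is purely a matter of coherence among the three credences.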
OK. That makes sense.The model is your idea of how some aspect of the world works. It provides the probabilities of various outcomes. — GrahamJ
I'm trying to keep the enthusiasm for Bayes in proportion by anchoring our conversation in how we do things, or how we think we do things, when we aren't relying on Bayes. I'm trying to work out whether we can rely on Bayes or not. At present, the assumption is that we can. My mind is not made up.You have talked quite a bit about making decisions under uncertainty - about medical treatments, weather forecasts, coin-tossing, and beer in fridges. I was replying to all of that and I may have confused things by quoting a particular paragraph. I wasn't trying to 'run it backwards' to interpret a decision. — GrahamJ
There's so much going on that it is very hard to keep up with everything. I'm afraid I don't even try.I just realized I missed a comment of yours to my quote — neomac
There's a reason why I'm not. I oscillate between thinking that if only everybody would play nice, how much better it would be and thinking that we need someone even heavier than the heavies we have to knock heads together. Neither suggestion is particularly helpful, I know.I do not disagree with your general claims but they do not offer any concrete path toward peaceful coexistence. — neomac
Yes, the enemy of my enemy is my friend - at least until our common enemy is defeated, when anything may happen. One of the differences between our situation now and the situation up to about 2000 is that we no longer live in a world with just one dominating struggle, but a multi-polar, multi-struggle world. Whether that's better or worse, I wouldn't like to say.Often competing ideologies can converge when there is a third ideology perceived as common threat — neomac
Well, I can see that a Dutch book would be a bad idea. On the other hand, there is the possibility of a "Czech book", in which the probabilities add up to less than 1. Wikipedia, which is never wrong, tells me that it always pays out to the gambler.In Bayesian probability, Frank P. Ramsey and Bruno de Finetti required personal degrees of belief to be coherent so that a Dutch book could not be made against them, whichever way bets were made.
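The asymmetry between the two books is easy to make concrete. If someone's credences over an exhaustive, exclusive set of outcomes sum to more than 1, a bookie selling tickets at those credences is guaranteed a profit (the Dutch book); if they sum to less than 1, the same tickets guarantee the gambler a profit (the "Czech book"). A minimal sketch with made-up numbers:

```python
def book_profit(credences):
    """Bookie's guaranteed profit when selling, at the bettor's own
    credences, a 1-unit ticket on each of a set of exhaustive, exclusive
    outcomes. Exactly one ticket pays out, so the bookie keeps
    sum(credences) - 1 whatever happens."""
    return sum(credences.values()) - 1.0

# Dutch book: credences sum to 1.2, bookie pockets 0.2 regardless of outcome.
dutch = {"rain": 0.7, "no_rain": 0.5}
assert abs(book_profit(dutch) - 0.2) < 1e-9

# "Czech book": credences sum to 0.9, so the bookie loses (and the
# gambler gains) 0.1 whichever outcome occurs.
czech = {"rain": 0.4, "no_rain": 0.5}
assert abs(book_profit(czech) + 0.1) < 1e-9
```

Coherence, in Ramsey's sense, is just the condition that neither book can be made against you: the credences sum to exactly 1.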
That sounds wonderful, and better than the sceptical bewailing of our failure to match the traditional expectations.He tells us what it means to be coherently uncertain — to reason, act, and believe in a way that fits together, even when the world is incomplete, and we are fallible. — Banno
That seems to be right. Given the hostility that there so often is between ideologies, I would expect that to be a major factor in how people decide to draw the lines.I think any boundaries between distinct ideologies are theoretical and made for a purpose. Consider, that no two people really share all their beliefs, so in that sense we could say that everyone has one's own distinct ideology. But on the other hand, if we limit a particular "ideology" to just a small set of very general ideas, then many people have the same ideology. So the drawing of lines between ideologies is complex and purposeful, yet somewhat arbitrary. — Metaphysician Undercover
Perhaps "correctly" is over-stating it. But it is also possible to revise my interpretation in the light of more and better information, or even to actually misinterpret my action.But neither of us want to say that. — Banno
