However, historically, the "new Baconian science," the new mechanistic view of nature, and nominalism pre-date the "Great Divergence" in technological and economic development between the West and India and China by centuries. If the "new science," mechanistic view, and nominalism led to the explosion in technological and economic development, they didn't do so quickly. The supposed effect spread quite rapidly when it finally showed up, but this was long after the initial cause that is asserted to explain it.
Nor was there a similar "great divergence" in technological progress between areas dominated by rationalism as opposed to empiricism within the West itself. Nor does it seem that refusing to embrace the empiricist tradition's epistemology and (anti)metaphysics has stopped people from becoming influential scientific figures or inventors. I do think there is obviously some sort of connection between the "new science" and the methods used for technological development, but I don't think it's nearly as straightforward as the empiricist version of their own "Whig history" likes to think.
In particular, I think one could argue that technology progressed in spite of (and was hampered by) materialism. Some of the paradigm-shifting insights of information theory and complexity studies didn't require digital computers to come about; rather, they had been held up by the dominant metaphysics (and indeed the people who kicked off these revolutions faced a lot of persecution for this reason).
By its own standards, if empiricism wants to justify itself, it should do so through something like a peer-reviewed study showing that holding to logical positivism, eliminativism, or some similar view tends to make people more successful scientists or inventors. The tradition should remain skeptical of its own "scientific merits" until this evidence is produced, right? :joke:
I suppose it doesn't much matter because it seems like the endgame of the empiricist tradition has bifurcated into two main streams. One denies that much of anything can be known, or that knowledge in anything like the traditional sense even exists (and yet it holds on to the epistemic assumptions that lead to this conclusion!). The other embraces behaviorism/eliminativism, an extreme commitment to materialist scientism that tends towards a sort of anti-philosophy in which philosophies are themselves just information patterns undergoing natural selection. Due to its extreme nominalism, though, the latter tends to collapse into the former.
Pluralism, as I understand it, allows different epistemological perspectives, with different conceptions of what is true within those perspectives.
It also encourages discussion between perspectives, including how conceptions of truth may or may not converge.
Relativism (about truth) would deny even this perspectival account as incoherent. (A very broad-brush picture of a hugely complicated subject, of course.)
What is it the critic wants to conclude - that our use of the word is grounded in a pre-linguistic understanding of what water is? Perhaps we learn to drink and wash before we learn to speak. But learning to drink and wash is itself learning what water is. There is no neat pre-linguistic concept standing behind the word, only the way we interact with water as embodied beings embedded in and interacting with the world. Our interaction with water is our understanding of water.
So on one hand we have a triadic {water – concept-of-water – use of water}; on the other just water being used.
You and Tim objecting to formal modal logic robs you both of the opportunity to present your arguments clearly.
The suggestion that formal logic is restricted to analytic philosophy is demonstrably ridiculous.
What should I do? Is it OK for me to just shoot you, in order to eliminate dissent? Should I do what the One True Explanation of Everything demands, even if that leads to abomination?
Funny, how here we are now moving over to the ideas entertained in the thread on Faith. I wonder why.
You know, those basic liberal virtues. How much worse would a world be in which only the One True Explanation Of Everything was acceptable, uncriticised?
Pluralists can accept many truths within different practices - physics, literature, religion, without affirming logical contradictions. But this doesn’t mean that "2+2=5" and "2+2=4" are both true. Pluralism has limits, governed by coherence, utility, and discursive standards.
I think this is a much more wholesome response than supposing that some amongst us have access to the One True Explanation and the One True Logic.
Misology is not best expressed in the radical skeptic, who questions the ability of reason to comprehend or explain anything. For in throwing up their arguments against reason they grant it an explicit sort of authority. Rather, misology is best exhibited in the demotion of reason to a lower sort of "tool," one that must be used with other, higher goals/metrics in mind. The radical skeptic leaves reason alone, abandons it. According to Schindler, the misologist "ruins reason."
If we return to our caricatures we will find that neither seems to fully reject reason. The fundamentalist will make use of modern medical treatments and accept medical explanations, except where they have decided that dogma must trump reason. Likewise, our radical student might be more than happy to invoke statistics and reasoned argument in positioning their opposition to some right-wing effort to curb social welfare spending.
Where reason cannot be trusted (where dogma, or rather power relations or pragmatism, must reign over it) is determined by needs, desires, aesthetic sentiment, etc. A good argument is good justification for belief/action... except when it isn't, when it can be dismissed on non-rational grounds. In this way, identity, power, etc. can come to trump argument. What decides when reason can be dismissed? In misology, it certainly isn't reason itself.
In that sense, our version of reality or truth functions similarly to how language works; it doesn’t have a grounding outside of our shared conventions and practices.
The position isn’t that truth is mere popularity, but that truth is built through ongoing conversation and agreement. What counts as true is what survives criticism, investigation, and revision within a community over time. So instead of certainty, we have a fallible and evolving consensus. Tradition, in such a context, is something that should be investigated and revised if necessary.
Humans work to create better ways to live together,
We settle, at least for a while, on what works
having conversations about improvement,
I think you'd see, rereading, that this isn't accurate.


Yes, in a way, but I think reality comes first. I think we have to have some familiarity with water before we have any sensible familiarity with "water."
I am not a relativist about truth
Of course, so much so that I'd hesitate to talk about "truths" here at all. Or maybe I don't understand what a non-context-dependent truth about a philosophy would be.
Nor do I think that acknowledging "pluralistic, context-dependent truths" makes someone a relativist.
Subject to certain purposes, you might say.
what we can point to is broad agreement,
shared standards
and better or worse outcomes within a community or set of practices.
The mistake comes when we think we've consulted the Philosophical Dictionary in the Sky and discovered what is Really Real.
What I got from @Banno seems to be that pluralistic or context-based truths don’t mean that every contradiction is true. Instead, truths depend on the situation, purpose, or point of view.
Of course. Just the ones that are useful to affirm are "true"... and "false." Maybe neither too. Perhaps in the interest of greater tolerance we shall proclaim in this case that there both is and is-not a One True Truth (TM)?
But that doesn't really seem to work. To say "is" and "is-not" here is really just to deny "is." Yet can it be "wrong" to affirm the "One True Truth" in this case?
When contradictions happen, it usually means they come from different ways of looking at things, not that truth doesn’t exist.
I imagine you’re unlikely to be a Rorty fan, but didn’t he say that truth is not about getting closer to some metaphysical reality; it’s about what vocabularies and beliefs serve us best at a given time?
Well it may well be useful for one's survival to accept that Big Brother is right, so at one level (that of ruthless pragmatism) sure. But being compelled to believe something out of fear of jail or death is a different matter altogether, isn't it?
Like Macbeth, Western man made an evil decision, which has become the efficient and final cause of other evil decisions. Have we forgotten our encounter with the witches on the heath? It occurred in the late fourteenth century, and what the witches said to the protagonist of this drama was that man could realize himself more fully if he would only abandon his belief in the existence of transcendentals. The powers of darkness were working subtly, as always, and they couched this proposition in the seemingly innocent form of an attack upon universals. The defeat of logical realism in the great medieval debate was the crucial event in the history of Western culture; from this flowed those acts which issue now in modern decadence.
Is there some conclusion that you would like to draw from all this?
I agree whole-heartedly that the notion that one has grasped an Absolute Truth is extremely dangerous. It makes it impossible to acknowledge and tolerate any disagreement. I cannot think of a situation in which this might be a Good Thing, but I can think of many in which it is clearly a Bad Thing.
Isn't your take informed by a bias that values traditionalism and is suspicious, perhaps even hostile, towards political radicalism (particularly of the Left)? Is your use of irony as Rorty uses it? Is 'unseriousness' how they would describe it, or is that your description for it? There's a further question in what counts as a politically radical circle?
You'd imagine this is fairly common today. Why do you find this more pernicious?
Custom may have once served a purpose, Mill acknowledges—in an earlier age, when “men of strong bodies or minds” might flout “the social principle,” it was necessary for “law and discipline, like the Popes struggling against the Emperors, [to] assert a power over the whole man, claiming to control all his life in order to control his character.” But custom had come to dominate too extensively; and that “which threatens human nature is not the excess, but the deficiency, of personal impulses and preferences.” The unleashing of spontaneous, creative, unpredictable, unconventional, often offensive forms of individuality was Mill’s goal. Extraordinary individuals—the most educated, the most creative, the most adventurous, even the most powerful—freed from the rule of custom, might transform society.
“Persons of genius,” Mill acknowledges, “are always likely to be a small minority”; yet such people, who are “more individual than any other people,” less capable of “fitting themselves, without hurtful compression, into any of the small number of moulds which society provides,” require “an atmosphere of freedom.” Society must be remade for the benefit of this small, but in Mill’s view vital, number. A society based on custom constrained individuality, and those who craved most to be liberated from its shackles were not “ordinary” people but people who thrived on breaking out of the customs that otherwise governed society. Mill called for a society premised around “experiments in living”: society as test tube for the sake of geniuses who are “more individual.”
We live today in the world Mill proposed. Everywhere, at every moment, we are to engage in experiments in living. Custom has been routed: much of what today passes for culture—with or without the adjective “popular”—consists of mocking sarcasm and irony. Late night television is the special sanctuary of this liturgy. Society has been transformed along Millian lines in which especially those regarded as judgmental are to be special objects of scorn, in the name of nonjudgmentalism. Mill understood better than contemporary Millians that this would require the “best” to dominate the “ordinary.” The rejection of custom demanded that society’s most “advanced” elements have greater political representation. For Mill, this would be achieved through an unequal distribution of voting rights...
Society today has been organized around the Millian principle that “everything is allowed,” at least so long as it does not result in measurable (mainly physical) harm. It is a society organized for the benefit of the strong, as Mill recognized. By contrast, a Burkean society is organized for the benefit of the ordinary—the majority who benefit from societal norms that the strong and the ordinary alike are expected to follow. A society can be shaped for the benefit of most people by emphasizing mainly informal norms and customs that secure the path to flourishing for most human beings; or it can be shaped for the benefit of the extraordinary and powerful by liberating all from the constraint of custom.
Thinking through this question now -- Kuhn's Structure of Scientific Revolutions is what I have in mind, but with a more materialist mindset which doesn't give in to the notion that nature itself changes with the sciences.
I tend to favor the epistemic side over the ontology side -- I understand it's basically a "player's choice", but it's my preference. On the reverse of "How do you know unless you start with what is?" is "How do you know what is unless you start with what you know?"
I think it's in virtue of the things our species relies upon water for -- drinking, cooking, bathing, etc.
There has never been a nominalist, or rather, individualist country.
No, the fact that something looks like a human being makes it a "human being".
It means that everything we know about human beings is derived from the senses and experience.
It's more like "realism is false because no one can find universals or abstract objects". One of the common objections from nominalism against realism is that forms and universals and abstract objects cannot be found.
Simply that he looks like other human beings.
One thing is for certain, we are not developing these general ideas by looking at forms and essences.
The notion that one attaches dignity to, or removes it from, terms and definitions in order to dignify a human being is precisely the threat that I’m talking about. When one dehumanizes, by calling people rats for example, nothing at all changes in any individual human being outside the realist's skull, but his treatment of them certainly does.
But if someone kills another for the sake of some name like “country” or “God”, then we have an instance of destroying what is boundlessly more valuable for the sake of an idea or figment.
Before Copernicus there was overwhelming evidence that the spheres had always existed and always would.
I think the underdetermination argument is what undermines this notion -- that's what I'd guess now, but it could be that we're reading patterns into the past that we accept now because they are predictive and make sense.
Are the particulars not as worthy of being loved, admired, or understood as the abstractions and universals the realist holds dear?
I'm hoping someone can point me in the direction of those who see realism as a threat, and we can continue this ancient battle on an even footing.
“ . . . a position no one familiar with philosophical inquiry could take seriously.”
And I'm sure you're right that a "crude relativist" could leave a discussion worse off than they found it, by accusing people who aren't relativists of being wrong. I hope we agree that this doesn't characterize a position that anyone could take seriously.
I may never understand your rhetorical habit of contrasting Position A with a Position B that no one has ever espoused!
In short, if you start from premises you believe you can show to be foundational, does that commit you to also saying that everything that follows is rationally obligatory?
What would the opposite of this be? You start with premises that are foundational and then refuse to affirm what follows from them?
So:
P
P→Q
But then we affirm:
~Q, or refuse to affirm Q.
Yes, this is what most people would call "irrational." No?
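For what it's worth, the schema above can be checked mechanically. A minimal sketch in Lean (any proof assistant would do): the first example is just modus ponens, and the second shows that jointly affirming P, P→Q, and ¬Q yields a contradiction.

```lean
-- Modus ponens: from P and P → Q, Q follows.
example (P Q : Prop) (hp : P) (hpq : P → Q) : Q :=
  hpq hp

-- Affirming P and P → Q while denying Q is contradictory:
-- the three hypotheses together prove False.
example (P Q : Prop) (hp : P) (hpq : P → Q) (hnq : ¬Q) : False :=
  hnq (hpq hp)
```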
That's why indisputably foundational premises might be abandoned in favor of something closer to epistemic stance voluntarism. This may not be a worry for you, but many philosophers, myself included, are concerned about the consequences of rational obligation which do seem to follow, as you correctly show, from allegedly indisputable premises. The idea that there is only one right way to see the world, and only one view to take about disagreements, seems counter to how philosophy actually proceeds, in practice, and also morally questionable.
