This may have no appeal for you, but I was quite pleased with the papers cited (by Chakravartty and Pincock) in the "Epistemic Stances . . . " thread. I thought those two philosophers did an excellent job making big issues clear within a smaller, manageable discussion. Would you be willing to read them, perhaps guided by some of the comments in the thread? At the very least, you'd see that the "either it's foundationally true or it's merely useful" binary is not the only stance available.
I think what bothers some people is that "true in a context" is seen as some inferior species of being Truly True. It's hard, perhaps, to take on board the idea that context is what allows a sentence to be true at all. If a Truly True sentence is supposed to be one that is uttered without a context, I don't know what that would be.
To see why meaning cannot be contained within external signals, consider a program that randomly generates any possible 3,000-character page of text. If this program is allowed to run long enough, it will eventually produce every page of this length that will ever be written by a person (plus a vastly larger share of gibberish). Its outputs might include all the pages of a paper on a cure for cancer published in a medical journal in the year 2123, the pages of a proof that P ≠ NP, a page accurately listing future winning lottery numbers, etc.¹¹
Would it make sense to mine the outputs of such a program, looking for a cure for cancer? Absolutely not. Not only is such an output unfathomably unlikely, but any paper produced by such a program that appears to describe a cure for cancer is highly unlikely to actually be useful.
Why? Because there are far more ways to give coherent descriptions of plausible but ineffective treatments for cancer than there are descriptions of effective treatments, just as there are more ways to arrange the text of this article into gibberish than into English sentences.¹²
The point of our illustration is simply this: in an important sense, the outputs of such a program do not contain semantic information. The outputs of the program can only tell us about the randomization process at work for producing said outputs. Semantic information is constructed by the mind. The many definitions of information based on Shannon’s theory are essentially about physical correlations between outcomes for random variables. The text of War and Peace might have the same semantic content for us, regardless of whether it is produced by a random text generator or by Leo Tolstoy, but the information theoretic and computational processes undergirding either message are entirely different.
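The scale of the thought experiment above is easy to make concrete. Here is a minimal sketch of such a random page generator, under illustrative assumptions not fixed by the original argument (a roughly 65-symbol alphabet and 3,000-character pages); the names `ALPHABET`, `PAGE_LEN`, and `random_page` are hypothetical:

```python
import math
import random
import string

# Illustrative assumptions: letters, digits, space, and two
# punctuation marks as the alphabet; 3,000 characters per page.
ALPHABET = string.ascii_letters + string.digits + " .,"
PAGE_LEN = 3000

def random_page(rng=random):
    """Generate one uniformly random 'page' of text."""
    return "".join(rng.choice(ALPHABET) for _ in range(PAGE_LEN))

# The space of distinct possible pages is len(ALPHABET) ** PAGE_LEN.
# Rather than build that integer, count its decimal digits:
digits = math.floor(PAGE_LEN * math.log10(len(ALPHABET))) + 1
```

With these assumptions `digits` comes out above 5,000, i.e. the number of possible pages has thousands of digits, which is why mining the outputs for one meaningful page is hopeless: any single target page is one point in that space, and the near-misses (coherent but wrong pages) vastly outnumber the hits.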
I enjoyed your response, plenty to look up. Can I ask you why you are drawn to medieval philosophy? Not an area I know much about. Feel free to recommend any 'essential' texts, I got a lot out of reading your last one!
This might be a dumb question, but how is it a given that moral virtue is an epistemic virtue?
Knowledge plays an essential role in ethics. It seems obvious that human beings often fail to act morally. Yet just as importantly, we often disagree about moral issues, or are uncertain about what we ought to do. As Plato puts it: “[we have] a hunch that the good is something, but [are] puzzled and cannot adequately grasp just what it is or acquire… stable belief about it.”1 In light of this, it seems clear that we cannot simply assume that whatever we happen to do will be good. At the very least, we cannot know if we are acting morally unless we have some knowledge of what moral action consists in. Indeed, we cannot act with any semblance of rational intent unless we have some way of deciding which acts are choiceworthy.2 Thus, knowledge of the Good seems to be an essential element of living a moral life, regardless of what the Good ultimately reveals itself to be.
Yet consider the sorts of answers we would get if we were to ask a random sample of people “what makes someone a good person?” or “what makes an action just or good?” Likely, we would encounter a great deal of disagreement on these issues. Some would probably even argue that these terms cannot be meaningfully defined, or that our question cannot be given anything like an “objective answer.”
Now consider what would happen if instead we asked: “what makes someone a good doctor?” “a good teacher?” or “a good scientist?” Here, we are likely to find far more agreement. In part, this has to do with normative measure, the standard by which some technê (art or skill) is judged vis-à-vis an established practice.3 However, the existence of normative measure is not the only factor that makes these questions easier to answer. Being a good doctor, teacher, or scientist requires epistemic virtues, habits or tendencies that enable us to learn and discover the truth. The doctor must learn what is causing an ailment and how it can be treated. The teacher must understand what they are teaching and be able to discover why their students fail to grasp it. For the scientist, her entire career revolves around coming to know the causes of various phenomena—how and why they occur.
When it comes to epistemic virtues, it seems like it is easier for people to agree. What allows someone to uncover the truth? What will be true of all “good learners?” A few things seem obvious. They must have an honest desire to know the truth. Otherwise, they will be satisfied with falsehoods whenever embracing falsehood will allow them to achieve another good that they hold in higher esteem than truth.i For Plato, the person ruled over by reason loves and has an overriding passion for truth.1 Learning also requires that we be able to step back from our current beliefs, examine them with some level of objectivity, and be willing to consider that we might be wrong. Here, the transcendence of rationality is key. It is reason that allows us to transcend current belief and desire, reaching out for what is truly good. As we shall see, this transcendent aspect of reason will also have serious implications for how reason relates to freedom.
Learning and the discovery of truth is often a social endeavor. All scholars build on the work of past thinkers; arts are easier to learn when one has a teacher. We benefit from others’ advice and teaching. Yet, as Plato points out in his sketch of “the tyrannical man” in Book IX of the Republic, a person ruled over by the “lower parts of the soul” is likely to disregard advice that they find disagreeable, since they are not motivated by a desire for truth.1 Good learners can cooperate, something that generally requires not being ruled over by appetites and emotions. They take time to understand others’ opinions and can consider them without undue bias.
By contrast, consider the doctor who ignores the good advice of a nurse because the nurse lacks his credentials. The doctor is allowing honor — the prerogative of the spirited part of the soul — to get in the way of discovering the truth. Likewise, consider the scientist who falsifies her data in order to support her thesis. She cares more about the honor of being seen to be right than actually being right, or perhaps she is more motivated by book sales, which allow her to satisfy her appetites, than by producing good scholarship. It is not enough that reason is merely engaged in learning. Engagement is certainly necessary, as the rational part of the soul is the part responsible for all learning and the employment of knowledge. Yet the rational part of the soul must also rule over the other parts, blocking out inclinations that would hinder the search for truth.
Prior to reading "After Virtue", I don't think I could have defined 'telos'. How does one land on the premise of a human telos, today? Is it simply moral pragmatism? Is 'excellence' fundamental to the premise of telos?
I think Adorno would agree that reason needs to break free of rigid frameworks, but this is reason's way of correcting itself, not an irrationalist rebellion.
The sun rises every morning. I do not rise every morning, but the variation is due not to my activity, but to my inaction. Now, to put the matter in a popular phrase, it might be true that the sun rises regularly because he never gets tired of rising. His routine might be due, not to a lifelessness, but to a rush of life. The thing I mean can be seen, for instance, in children, when they find some game or joke that they specially enjoy. A child kicks his legs rhythmically through excess, not absence, of life. Because children have abounding vitality, because they are in spirit fierce and free, therefore they want things repeated and unchanged. They always say, “Do it again”; and the grown-up person does it again until he is nearly dead. For grown-up people are not strong enough to exult in monotony. But perhaps God is strong enough to exult in monotony. It is possible that God says every morning, “Do it again” to the sun, and every evening, “Do it again” to the moon. It may not be automatic necessity that makes all daisies alike; it may be that God makes every daisy separately, but has never got tired of making them. It may be that He has the eternal appetite of infancy; for we have sinned and grown old, and our Father is younger than we. The repetition in Nature may not be a mere recurrence; it may be a theatrical encore.
That is, if you can show how psychological or economic models (for example) fail to offer consistent, predictable results, then that counts for me as a substantive blow against positivism, as opposed to just an analytic attack on the self-consistency of the theory.
However, historically, the "new Baconian science," the new mechanistic view of nature, and nominalism pre-date the "Great Divergence" in technological and economic development between the West and India and China by centuries. If the "new science," mechanistic view, and nominalism led to the explosion in technological and economic development, it didn't do it quickly. The supposed effect spread quite rapidly when it finally showed up, but this was long after the initial cause that is asserted to explain it.
Nor was there a similar "great divergence," in technological progress between areas dominated by rationalism as opposed to empiricism within the West itself. Nor does it seem that refusing to embrace the empiricist tradition's epistemology and (anti)metaphysics has stopped people from becoming influential scientific figures or inventors. I do think there is obviously some sort of connection between the "new science" and the methods used for technological development, but I don't think it's nearly as straightforward as the empiricist version of their own "Whig history" likes to think.
In particular, I think one could argue that technology progressed in spite of (and was hampered by) materialism. Some of the paradigm-shifting insights of information theory and complexity studies didn't require digital computers to come about; rather, they had been held back by the dominant metaphysics (and indeed the people who kicked off these revolutions faced a lot of persecution for this reason).
By its own standards, if empiricism wants to justify itself, it should do so through something like a peer reviewed study showing that holding to logical positivism, eliminativism, or some similar view, tends to make people more successful scientists or inventors. The tradition should remain skeptical of its own "scientific merits" until this evidence is produced, right? :joke:
I suppose it doesn't much matter because it seems like the endgame of the empiricist tradition has bifurcated into two main streams. One denies that much of anything can be known, or that knowledge in anything like the traditional sense even exists (and yet it holds on to the epistemic assumptions that lead to this conclusion!) and the other embraces behaviorism/eliminativism, a sort of extreme commitment to materialist scientism, that tends towards a sort of anti-philosophy where philosophies are themselves just information patterns undergoing natural selection. The latter tends to collapse into the former due to extreme nominalism though.
Pluralism, as I understand it, allows different epistemological perspectives, with different conceptions of what is true within those perspectives.
It also encourages discussion between perspectives, including how conceptions of truth may or may not converge.
Relativism (about truth) would deny even this perspectival account as incoherent. (A very broad-brush picture of a hugely complicated subject, of course.)
What is it the critic wants to conclude - that our use of the word is grounded in a pre-linguistic understanding of what water is? Perhaps we learn to drink and wash before we learn to speak. But learning to drink and wash is itself learning what water is. There is no neat pre-linguistic concept standing behind the word, only the way we interact with water as embodied beings embedded in and interacting with the world. Our interaction with water is our understanding of water.
So on one hand we have a triadic {water – concept-of-water – use of water}; on the other just water being used.
You and Tim objecting to formal modal logic robs you both of the opportunity to present your arguments clearly.
The suggestion that formal logic is restricted to analytic philosophy is demonstrably ridiculous
What should I do? Is it OK for me to just shoot you, in order to eliminate dissent? Should I do what the One True Explanation of Everything demands, even if that leads to abomination?
Funny, how here we are now moving over to the ideas entertained in the thread on Faith. I wonder why.
You know, those basic liberal virtues. How much worse would a world be in which only the One True Explanation Of Everything was acceptable, uncriticised?
Pluralists can accept many truths within different practices - physics, literature, religion, without affirming logical contradictions. But this doesn’t mean that "2+2=5" and "2+2=4" are both true. Pluralism has limits, governed by coherence, utility, and discursive standards.
I think this a much more wholesome response than supposing that some amongst us have access to the One True Explanation and the One True Logic.
Misology is not best expressed in the radical skeptic, who questions the ability of reason to comprehend or explain anything. For in throwing up their arguments against reason they grant it an explicit sort of authority. Rather, misology is best exhibited in the demotion of reason to a lower sort of "tool," one that must be used with other, higher goals/metrics in mind. The radical skeptic leaves reason alone, abandons it. According to Schindler, the misologist "ruins reason."
If we return to our caricatures we will find that neither seems to fully reject reason. The fundamentalist will make use of modern medical treatments and accept medical explanations, except where they have decided that dogma must trump reason. Likewise, our radical student might be more than happy to invoke statistics and reasoned argument in positioning their opposition to some right-wing effort to curb social welfare spending.
Where reason cannot be trusted, where dogma, or rather power relations or pragmatism, must reign over it, is determined by needs, desires, aesthetic sentiment, etc. A good argument is good justification for belief/action... except when it isn't, when it can be dismissed on non-rational grounds. In this way, identity, power, etc. can come to trump argument. What decides when reason can be dismissed? In misology, it certainly isn't reason itself.
In that sense, our version of reality or truth functions similarly to how language works; it doesn’t have a grounding outside of our shared conventions and practices.
The position isn’t that truth is mere popularity, but that truth is built through ongoing conversation and agreement. What counts as true is what survives criticism, investigation, and revision within a community over time. So instead of certainty, we have a fallible and evolving consensus. Tradition, in such a context, is something that should be investigated and revised if necessary.
Humans work to create better ways to live together,
We settle, at least for a while, on what works
having conversations about improvement,
I think you'd see, rereading, that this isn't accurate.


Yes, in a way, but I think reality comes first. I think we have to have some familiarity with water before we have any sensible familiarity with "water."
I am not a relativist about truth
Of course, so much so that I'd hesitate to talk about "truths" here at all. Or maybe I don't understand what a non-context-dependent truth about a philosophy would be.
Nor do I think that acknowledging "pluralistic, context-dependent truths" makes someone a relativist.
Subject to certain purposes, you might say.
what we can point to is broad agreement,
shared standards
and better or worse outcomes within a community or set of practices.
The mistake comes when we think we've consulted the Philosophical Dictionary in the Sky and discovered what is Really Real.
What I got from @Banno seems to be that pluralistic or context-based truths don’t mean that every contradiction is true. Instead, truths depend on the situation, purpose, or point of view.
Of course. Just the ones that are useful to affirm are "true"... and "false." Maybe neither too. Perhaps in the interest of greater tolerance we shall proclaim in this case that there both is and is-not a One True Truth (TM)?
But that doesn't really seem to work. To say "is" and "is-not" here is really just to deny "is." Yet can it be "wrong" to affirm the "One True Truth" in this case?
When contradictions happen, it usually means they come from different ways of looking at things, not that truth doesn’t exist.
I imagine you’re unlikely to be a Rorty fan, but didn’t he say that truth is not about getting closer to some metaphysical reality; it’s about what vocabularies and beliefs serve us best at a given time?
Well it may well be useful for one's survival to accept that Big Brother is right, so at one level (that of ruthless pragmatism) sure. But being compelled to believe something out of fear of jail or death is a different matter altogether, isn't it?
Like Macbeth, Western man made an evil decision, which has become the efficient and final cause of other evil decisions. Have we forgotten our encounter with the witches on the heath? It occurred in the late fourteenth century, and what the witches said to the protagonist of this drama was that man could realize himself more fully if he would only abandon his belief in the existence of transcendentals. The powers of darkness were working subtly, as always, and they couched this proposition in the seemingly innocent form of an attack upon universals. The defeat of logical realism in the great medieval debate was the crucial event in the history of Western culture; from this flowed those acts which issue now in modern decadence.
Is there some conclusion that you would like to draw from all this?
I agree whole-heartedly that the notion that one has grasped an Absolute Truth is extremely dangerous. It makes it impossible to acknowledge and tolerate any disagreement. I cannot think of a situation in which this might be a Good Thing, but I can think of many in which it is clearly a Bad Thing.
Isn't your take informed by a bias that values traditionalism and is suspicious, perhaps even hostile, towards political radicalism (particularly of the Left)? Is your use of irony as Rorty uses it? Is 'unseriousness' how they would describe it, or is that your description for it? There's a further question in what counts as a politically radical circle.
You'd imagine this is fairly common today. Why do you find this more pernicious?
Custom may have once served a purpose, Mill acknowledges—in an earlier age, when “men of strong bodies or minds” might flout “the social principle,” it was necessary for “law and discipline, like the Popes struggling against the Emperors, [to] assert a power over the whole man, claiming to control all his life in order to control his character.”9 But custom had come to dominate too extensively; and that “which threatens human nature is not the excess, but the deficiency, of personal impulses and preferences.”10 The unleashing of spontaneous, creative, unpredictable, unconventional, often offensive forms of individuality was Mill’s goal. Extraordinary individuals—the most educated, the most creative, the most adventurous, even the most powerful—freed from the rule of custom, might transform society.
“Persons of genius,” Mill acknowledges, “are always likely to be a small minority”; yet such people, who are “more individual than any other people,” less capable of “fitting themselves, without hurtful compression, into any of the small number of moulds which society provides,” require “an atmosphere of freedom.”11 Society must be remade for the benefit of this small, but in Mill’s view vital, number. A society based on custom constrained individuality, and those who craved most to be liberated from its shackles were not “ordinary” people but people who thrived on breaking out of the customs that otherwise governed society. Mill called for a society premised around “experiments in living”: society as test tube for the sake of geniuses who are “more individual.”
We live today in the world Mill proposed. Everywhere, at every moment, we are to engage in experiments in living. Custom has been routed: much of what today passes for culture—with or without the adjective “popular”—consists of mocking sarcasm and irony. Late night television is the special sanctuary of this liturgy. Society has been transformed along Millian lines in which especially those regarded as judgmental are to be special objects of scorn, in the name of nonjudgmentalism. Mill understood better than contemporary Millians that this would require the “best” to dominate the “ordinary.” The rejection of custom demanded that society’s most “advanced” elements have greater political representation. For Mill, this would be achieved through an unequal distribution of voting rights...
Society today has been organized around the Millian principle that “everything is allowed,” at least so long as it does not result in measurable (mainly physical) harm. It is a society organized for the benefit of the strong, as Mill recognized. By contrast, a Burkean society is organized for the benefit of the ordinary—the majority who benefit from societal norms that the strong and the ordinary alike are expected to follow. A society can be shaped for the benefit of most people by emphasizing mainly informal norms and customs that secure the path to flourishing for most human beings; or it can be shaped for the benefit of the extraordinary and powerful by liberating all from the constraint of custom.
Thinking through this question now -- Kuhn's Structure of Scientific Revolutions is what I have in mind, but with a more materialist mindset which doesn't give into the notion that nature itself changes with the sciences.
I tend to favor the epistemic side over the ontology side -- I understand it's basically a "player's choice", but it's my preference. On the reverse of "How do you know unless you start with what is?" is "How do you know what is unless you start with what you know?"
I think it's in virtue of the things our species relies upon water for -- drinking, cooking, bathing, etc.
There has never been a nominalist, or rather, individualist country
No, the fact that something looks like a human being makes it a "human being".
It means that everything we know about human beings is derived from the senses and experience.
It's more like "realism is false because no one can find universals or abstract objects". One of the common objections from nominalism against realism is that forms and universals and abstract objects cannot be found.
Simply that he looks like other human beings.
One thing is for certain, we are not developing these general ideas by looking at forms and essences.
The notion that one attaches dignity to, or removes it from, terms and definitions in order to dignify a human being is precisely the threat that I’m talking about. When one dehumanizes, like calling people rats for example, nothing at all changes in any individual human being outside the realist skull, but his treatment of them certainly does.
But if someone kills another for the sake of some name like “country” or “God”, then we have an instance of destroying what is boundlessly more valuable for the sake of an idea or figment.
Before Copernicus there was overwhelming evidence that the spheres had always been, and would always be, in existence.
I think the underdetermination argument is what undermines this notion -- it's what I'd guess now, but it could be that we're reading patterns into the past that we accept now which are predictive and make sense.
