• Emotions Are Concepts
    Imagine training machines instead of trying to program them.
    praxis

    :ok:

    The price of the neural network revolution was giving up (or at least severely compromising) the model of the brain (or computer) as a processor of stored symbols - internal words and pictures representing external objects. Ironically, it had to revert to Skinner's behaviourist model, a "black box". Training, without necessarily understanding the learning.
    bongo fury

    (Admittedly off-topic, except possibly as regards the question of internal (or external) words and pictures representing internal processes.)
  • "1" does not refer to anything.
    So... you are asking what I think Pneumenon meant that Wittgenstein meant at
    Banno

    No! Only whether the word-string "then we can get extensions for it" was a misprint of "then we can get infinite extensions for it"?

    A different reading of it (as not a misprint) seemed plausible, so I thought I should check.

    And ...?
  • "1" does not refer to anything.
    Did "then we can get extensions from it" mean "then we can get infinite extensions from it"?
  • "1" does not refer to anything.
    ↪bongo fury So you want to argue that
    Banno

    One thing at a time?

    If the rule allows to construct a finite extension, then we can get extensions from it, too.
    — Pneumenon

    This is the bit that I've been unable to find clearly articulated.
    — Banno

    Just to be clear, are you both dropping (or taking as read) an "infinite"?
    bongo fury

    To which,

    ↪bongo fury There are infinities.
    Banno

    Ought that to have clarified for the competent reader that @Pneumenon meant "then we can get infinite extensions from it, too"?

    Just hoping not to misunderstand either one of you.
  • "1" does not refer to anything.
    I wouldn't say that any of these expressions point to or are about anything (in the sense of reference). They may indicate something, but that's not quite the same thing –
    Michael

    But if we reflect that what a word refers to, or is pointed at, is never a matter of fact anyway but rather one of interpretation, then theoretical parsimony strongly argues against the easy option of distinguishing as many varieties of meaning as we might have different words for. Obviously no two of these kinds will ever be quite the same thing.

    The argument isn't just about theoretical desiderata and separate from the subject-matter: the behavioural interactions we are discussing depend on agents' anticipations of each other's interpretations, so we are theorising about theorising (about...).

    And so I applaud @Harry Hindu's objection here to the habitual distinction of expression and exhortation from description. My attempt here.

    To expand a little: since no bolt of energy (nor any more subtle physics) connects uttered word to object, we (interlocutor or foreign linguist or even utterer) are perhaps entitled and perhaps required to interpret the utterance as pointing, in various degrees of plausibility, not only a presently uttered token but also the "word as a whole" at (not only a present object but) some kind as a whole, and then by implication as also pointing not-presently-uttered but semantically related words at related objects and kinds. In other words, any speech act offers a potential adjustment (or entrenchment) of the language in use, so that the extensions of related words are shifted in related ways.

    Hence utterances that vent frustration can also offer (directly or indirectly) potential adjustments to the extensions of words ("patient", "skilful" etc.) that might or might not point at Michael.

    And hence also Harry's and my other examples as linked above.
  • "1" does not refer to anything.
    If the rule allows to construct a finite extension, then we can get extensions from it, too.
    — Pneumenon

    This is the bit that I've been unable to find clearly articulated.
    Banno

    Just to be clear, are you both dropping (or taking as read) an "infinite"?
  • "1" does not refer to anything.
    we pretend that integers are real things, and this leads us on to more complex ways of talking about integers, and so a sort of recursion allows us to build mathematics up from... nothing.
    Banno

    Sure, maths as fiction with a super-coherent plot.

    And with illustrations, too. Kind of, Alice in Wonderland.
  • Lack of belief vs active disbelief
    What is the probability of the invisible miniature dolphin's existence?
    Pneumenon

    Please tick one:

    • Zero or negligible
    • Non-negligible but doubtful
    • Certain, or negligible degree of doubt
  • "1" does not refer to anything.
    Rather "1" is to be understood through its role in
    Banno

    ... in whatever the discourse. Charity and world-domineering ambition alike require translation between discourses, and agreement re ontological commitment (re, e.g., what "1" refers to). Pedagogy and practicalities require, instead, toleration of alternative systems.
  • "1" does not refer to anything.
    One cannot physically list the integers. But in understanding the intension of "integer" we understand how to construct the extension... and in so doing it seems to me that we understand the extension to be infinite.
    Banno

    If in fact the concrete world is finite, acceptance of any theory that presupposes infinity would require us to assume that in addition to the concrete objects, finite in number, there are also abstract entities. [...]

    Apart from those predicates of concrete objects which are permitted by the terms of the given problem to appear in the definiens, nothing may be used but individual variables, quantification with respect to such variables, and truth-functions. Devices like recursive definition and the notion of ancestral must be excluded until they themselves have been satisfactorily explained.
    — Goodman and Quine, Steps Toward a Constructive Nominalism
  • "1" does not refer to anything.
    I needed a mathematician who might disagree with a constructivist approach to mathematics;
    Banno

    What, to explain,

    why does Wittgenstein's constructivism lead to finitism?
    Banno

    ? :chin:
  • Of Vagueness, Mind & Body
    Sure, and many people here in this thread have made the obvious but important point that...

    1. Given the brain has a digital structure (on/off neurons)
    TheMadFool

    is a contestable premise. But also...

    Yes, let's agree on that, for the sake of argument; or we could (equivalently) discuss the "perceptions" and "thoughts" of the computer.
    bongo fury

    Because then,

    how is it that it generates vague concepts?
    TheMadFool

    is a good question, requiring but potentially also stimulating clarification of the terms in play. One can but hope.

    The sorites paradox then [...] reveals, quite clearly in my opinion, that if the brain operates in discrete digital brain states [...] then, vagueness should be impossible
    TheMadFool

    ... or will look curious and paradoxical, sure. I recommend it (the paradox) as an anvil on which to refashion your ideas about this topic. Equally,

    How can a digitally produced image show a foggy day?
    Tim3003

    Either way, try to learn to bash the ideas one at a time.
  • Of Vagueness, Mind & Body
    The surprising thing about a sorites puzzle, or indeed any discussion of vagueness, is that you don't need any analog. Analog is sufficient but not necessary for vagueness. You only require a digitally defined series (of discrete but plausibly imperceptibly different values, like your heights to 3 d.p.) and two or three vague adjectives like tall, medium and short.
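    For illustration, a minimal sketch (mine, not part of the thread; the 0.001 ft step and the naively sharp cutoff for "tall" are hypothetical):

    ```python
    # A digitally defined sorites series: heights in feet to 3 decimal places,
    # each discrete step plausibly imperceptible to an observer.
    heights = [round(5.000 + i * 0.001, 3) for i in range(2001)]  # 5.000 .. 7.000

    def tall(height_ft):
        # A sharp cutoff for the vague adjective "tall" (hypothetical).
        return height_ft >= 6.000

    # Adjacent values in the series differ by an imperceptible 0.001 ft...
    assert all(b - a <= 0.001 + 1e-9 for a, b in zip(heights, heights[1:]))

    # ...yet exactly one such step flips "not tall" into "tall": the sorites
    # puzzle arises with no analog quantity anywhere in sight.
    borderline = [b for a, b in zip(heights, heights[1:]) if tall(b) and not tall(a)]
    ```

    The puzzle, of course, is that no single 0.001 ft step plausibly deserves to be the one that makes the difference.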
  • Definitions
    As part of a balanced regimen of expository etiquette, including regular glossing, defining of terms has been shown to visibly reduce misunderstandings and to promote underlying spiritual growth. Haha. And now, the science bit:

    Definitions can be beneficial when used to join up separate discourses of any kind, large or small: large, e.g. languages in use, conceptual schemes, paradigms, ideologies, disciplines; and small, e.g. beliefs, dialogues, theories etc.

    (Ouch, I might have glossed a bit hard there).

    Where they meet most resistance is probably where they are perceived as the possible Trojan horse of an untrustworthy power?
  • Of Vagueness, Mind & Body
    1. If the brain is digital then each perception and each thought corresponds to a specific combination of off/on neurons which I will call brain state.
    TheMadFool

    Yes, let's agree on that, for the sake of argument; or we could (equivalently) discuss the "perceptions" and "thoughts" of the computer.

    A dwarf evokes a brain state and a giant, as of necessity, must
    TheMadFool

    , for the sake of argument, might

    elicit a different brain state for they're two different perceptions and also different thoughts; different perceptions, different brain states and different thoughts, different brain states.

    Tallness is a vague term - it applies not to a specific height but to a range of possible values height can assume.
    TheMadFool

    That merely makes it a general term, no?

    That means heights of 6.1 ft, 6.5 ft, 7 ft are all tall for this person. What is to be noted here is that each of these heights are distinct perceptions and should evoke distinct brain states and each of these brain states should be different thoughts but this isn't the case: all of the heights 6.1 ft, 6.5 ft and 7 ft are matched to not different but the same brain state, the same thought, the thought tall.
    TheMadFool

    Ditto. A brain event might point its "6.1 ft" state at several different persons, and the same event might instantiate also a "tall" state which it points at these and other persons. (Fans of brain state talk may prefer "correlate with" to "point at".) Another instantiated state, "John", might be functioning as a singular term, and "6.1 ft" and "tall" as both general. Or "6.1 ft" and "6.5 ft" might be singular terms each pointing at a singular thing or value while "tall" points at both values and doubtless others.

    This shouldn't be possible if each brain state is a different thought, no?
    TheMadFool

    Even if you insisted (as some might but I certainly wouldn't) that no word token has the same reference as another token of the same word, that still wouldn't prevent each of them from referring generally to a host of things or values. (Likewise for states and unique brain events as for words and unique tokens.)

    In other words, a digital brain with thoughts being discrete brain states shouldn't be able to generate/handle vague concepts because if it could do that it implies different brain states are not different but the same thought.
    TheMadFool

    Again, would you substitute "general" for "vague", here? And if not, why not? Either way, this is a point worth debating, but I think it is about generality not vagueness.

    2. Imagine a digital and an analog voltmeter (measures voltage). The analog voltmeter has a dial and is able to read any continuous voltage but the digital voltmeter reads only discrete values such as 0, 1, 2, and so on. Now, the digital voltmeter's measuring involves rounding off voltages and so anything less than 0.5 volts it reads as 0 and anything between 0.5 and 1.5 volts it reads as 1 volt and anything between 1.5 volts and 2.5 volts it reads as 2 volts and so on. The digital voltmeter assigns a range of continuous voltages to a discrete reading that it can display. This is vagueness.
    TheMadFool

    Yes, it entails vagueness if the discrete values are assumed to represent all possible voltages; but usually no, because margins of error are implied in the digitizing, which make it feasible to prevent 0 from overlapping with 1, etc. Hence the inherent fidelity of digital reproduction, which amounts to spell checking.
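    To sketch the point (a toy quantizer of my own, assuming whole-volt readings and a little transmission noise; no particular meter is being modelled):

    ```python
    def digitize(volts):
        # Round a continuous voltage to the nearest whole volt, as the
        # digital voltmeter in the example does.
        return round(volts)

    # Rounding maps a whole range of continuous inputs onto one reading:
    assert digitize(0.4) == 0
    assert digitize(0.6) == 1
    assert digitize(1.4) == 1

    # But each discrete reading sits well inside its margin of error, so a
    # noisy copy of a reading is restored rather than drifting to a
    # neighbour; this is the "spell checking" behind digital fidelity.
    noisy_copy = digitize(1) + 0.1   # hypothetical analog noise on transmission
    assert digitize(noisy_copy) == 1
    ```

    (Vagueness would only threaten if inputs were allowed to sit right on the 0.5-volt borders, i.e. if the margins of error were used up.)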

    This seems to suggest that vagueness is an aspect of digital systems
    TheMadFool

    It's always an issue lurking at the borders (margins if you like) of the error-margins, and when considering the whole of the digitizing (or reverse) process.

    and so, the brain, understood as functioning in discrete brain states (digitally), should generate vague concepts.
    TheMadFool

    Probably. Depends how you clarify that. Read 's question, and I recommend also this answer.

    1 & 2 seem to contradict each other.
    TheMadFool

    With appropriate revisions we hope not. :smile:
  • Of Vagueness, Mind & Body
    How can a digitally produced image show a foggy day?
    Tim3003

    :ok:
  • Of Vagueness, Mind & Body
    1. Given the brain has a digital structure (on/off neurons) how is it that it generates vague concepts?
    TheMadFool

    Leaving my better :razz: question aside, and going down your road of assuming brain states to be elements of a digital syntax, which I must admit is a respectable road for many people... are you asking whether: the evident vagueness of the (ahem) reference by (I can't believe I'm saying this) those states to external stimuli shows that the repertoire of states and their correlation with stimuli must be encompassed by some larger, inclusive analog syntax and/or semantics?
  • Of Vagueness, Mind & Body
    My main concern is simply if a digital system can handle analog data.
    TheMadFool

    Ok, but data are only digital or analog relative to system. They are symbols or values that are digital when interpreted as elements of a discrete syntax or semantics, and analog when interpreted as elements of a continuous syntax or semantics.

    E.g. Goodman's example (above): any letter 'A' inscription is digital read as a token of the letter-type 'A', but analog read as one of an unlimited repertoire of distinct 'A' types, so that vagueness is entailed, at least eventually, by the uncapped expectation of finer and finer discrimination between one element and another.

    So no, a digital system is one that handles data digitally, while a different, 'larger' system may encompass the first one and treat the same data as analog.
  • Of Vagueness, Mind & Body
    let's venture into the digital, a world
    TheMadFool

    Better, a symbol system...

    lacking the qualities of a continuum
    TheMadFool

    Although it might be an excision from a 'larger' analog system. (Here, page 126, though it may not show. Can't find a pdf.)

    This basic signalling architecture (on/off) suggests the brain is like a computer, digital.
    TheMadFool

    Yes (in the important respect you specify), but probably not a symbolic computer, processing symbols stored in a memory. Rather, a machine that can be trained to respond to stimuli in a systematic way. http://www.alanturing.net/turing_archive/pages/Reference%20Articles/what_is_AI/What%20is%20AI10.html

    However, the mind has created what is probably a huge cache of vague concepts
    TheMadFool

    Do you mean a cache of symbols stored in a memory, to be accessed and read, as in a symbolic computer? I think the connectionist model largely contradicts that (still prevalent) notion.

    1. Given the brain has a digital structure (on/off neurons) how is it that it generates vague concepts?
    TheMadFool

    So, better to ask, how is it that it learns to participate in language games of pointing actual (not mental) words and pictures at things (in vague ways).

    2. Does the existence of vague concepts imply that the analog mind is not the same as
    TheMadFool

    So, imv no, it doesn't imply anything about concepts contained in a mind or a brain, because I cash out "concepts" in terms of symbols, and I don't see any need to locate them in the head.

    Perhaps you could clarify how you meant us to read "concepts"?... If not as internal symbols, which I took to be implied by "cache".
  • Definitions
    Time for an appropriate joke:
    Frank Apisa

    :rofl: Don't worry, we got yours too. (we did ??!)
  • The Epistemology of Visual Thinking in Mathematics
    Does it differ significantly from fig 1?

    I can't see how.
    Banno

    Me neither, in respect of whether it counts as a 'real' proof. Isn't this a live area? I thought that article was going to update me, but it wasn't quite that area. But I wouldn't be about to try persuading you otherwise.

    saying is a complicated way of showing.
    Banno

    In what way complicated? Is the justifying/demonstrating/showing perhaps more complicated than the exemplifying/showing?
  • The Epistemology of Visual Thinking in Mathematics
    It states a sentence, as a conclusion, and it exemplifies ("shows") a pattern of inference. Which is justifying ("showing") the conclusion, but not justifying ("showing") the pattern of inference. (?)
  • The Epistemology of Visual Thinking in Mathematics
    Does this say or show?
    Banno

    It exemplifies the pattern, yes? So, shows an instance.
  • The Epistemology of Visual Thinking in Mathematics
    Isn't understanding [...] a kind of discovery?
    Banno

    Wasn't that the credo of Dewey et al., yes?
  • The Epistemology of Visual Thinking in Mathematics
    showing and stating.
    Banno

    Proving and discovering?
  • The Epistemology of Visual Thinking in Mathematics
    I'm more interested here in the distinction between showing and stating.
    Banno

    Ok. :grin:
  • The Epistemology of Visual Thinking in Mathematics
    And if someone does not see it thus, but sees it so...

    ...then it's not a justification at all.
    Banno

    And this couldn't as easily happen with words? That the person doesn't hear/think them thus but hears/thinks them so?

    for proving or following a proof the subject must be aware of the way in which the conclusion is reached and the soundness of that way; — Giaquinto

    (The quote is from a contrast of proof with discovery, but if I'm not mistaken this was meant to correlate roughly with that of words with pictures.)

    Again, I'm not seeing any important difference between words and pictures. I'm just as likely to follow a textual proof uncomprehendingly or inappropriately as a visual one, am I not?

    And an automatic prover would (in principle) be indifferent as regards type of symbols, would it not?
  • Definitions
    they simply indicate how they are most often used.

    Sometimes they even get that wrong!
    Frank Apisa

    If only there were a fact of the matter, to be right and wrong about... A population of word-use events, from which to sample appropriately.

    And if it weren't for pesky kids like Humpty Dumpty, Quine (Gavagai), and Chomsky (probability of an utterance)...
  • Can one provide a reason to live?
    Once one is dead, one is indifferent to such event,
    JacobPhilosophy

    But also unable to increase the sum of human happiness, which one has almost certainly just measurably reduced.
  • Generalization
    Pointing a word (or other symbol) at more than one thing.
  • Thinking-of, Thinking-for, Thinking-with.
    Don't underestimate the move of re-situating ("casting the net-wider") - it has retroactive effects that modify the apparently 'local' as well. It changes the significance of 'thinking-of', and all one would like to associate it with.
    StreetlightX

    Sure, we hope it will. I just said don't assume that, and don't perpetuate too many myths about the supposedly narrower question in the process.

    Carry on :clap:
  • Thinking-of, Thinking-for, Thinking-with.
    I won't say too much about this because it should be pretty familiar.
    StreetlightX

    And coherent? That local problem solved, now cast the net wider? That move always disappoints me. Always cast wider, sure, but don't assume we've drilled deep enough, or that drilling deeper won't, and casting wider will, effect a better map of the territory. In this case, don't further entrench all of the mythology of subject/object, of mental words and pictures, albeit circumscribing it.

    Of course I can hate the recipe but love the pudding, of which the proof will be in the eating. Thanks for the book recommendation. And the thread.
  • What the Tortoise Said to Achilles
    Also you [@Nagase] seem to assume that if a rule goes from true beliefs to true beliefs, it is justified.
    83nt0n

    Agreed.

    Once again, this is using modus ponens to prove modus ponens.
    83nt0n

    I disagree. It might just be recognising soundness as a self-evident virtue. Another modus ponens (aside from the one being justified) needn't be involved. You were just on a roll with that objection, no? It is the tortoise's expected refrain, true, but the tortoise doesn't talk about this combination, wherein the student accepts A, B and Z (from true to true) but not C (the rule). The tortoise invites us to justify Z on the basis of A and B, and then of course he claims to need C (and then D etc).

    I only mention this in case it connects the tortoise's problem to the alleged 'scandal' of deduction: of its telling us no more than we already knew; of soundness being an empty (as well as self-evident) virtue. If Z does indeed follow from A and B as C claims, then C goes without saying. So much then for,

    “Whatever Logic is good enough to tell me is worth writing down,” said the
    Tortoise.
  • What the Tortoise Said to Achilles
    I'm curious to see what some other people think about this.
    83nt0n

    Me too: https://thephilosophyforum.com/discussion/comment/377693

    I'm actually tempted to call it the 'problem of deduction'.
    83nt0n

    Different to this, though?

    https://www.semanticscholar.org/paper/The-Enduring-Scandal-of-Deduction-Is-Propositional-D'Agostino-Floridi/6ff51e3f704044fac00b2c7430cf1ac775283820

    Yes, I think so.
  • If women had been equals
    Male and female brains are “wired” differently,
    NOS4A2

    Undeniable. (I'm guessing.)

    “Male brains are structured to facilitate connectivity between perception and coordinated action, whereas female brains are designed to facilitate communication between analytical and intuitive processing modes”.
    NOS4A2

    Decade-specific fantasy. (I'm guessing.)
  • Metaphilosophy: Historic Phases
    Love jazz, hate jazz-ism - the assumption that there are this and that musical natures or essences, and we should care whether this instance or kind is part of that kind (except insofar as doing so does happen to enhance musical appreciation; but thinking the kinds are natural isn't going to make that more likely).

    Love philosophy, hate philosoph-ism, the assumption that we should care whether this instance or kind of thinking should be considered part of that.

    Love critiques of essentialism...