• TheMadFool
    13.8k
    Vagueness of a term means that it's difficult to locate precisely when the term applies and when it doesn't, e.g. tall and fat are vague terms because, lacking a clear-cut meaning, people will disagree on who's tall or fat.

    I'm not completely sure about this but the notion of a continuum as something whose parts are indistinguishable from each other seems to be appropriate to describe vagueness. In scientific terms this is comparable to something being analog.

    That out of the way, let's venture into the digital, a world lacking the qualities of a continuum, in which each part is discrete enough to be clearly discerned as different from the others.

    If I remember my human physiology classes, the language of neurons, the cells that make up the brain, is the action potential, which follows the all-or-none law, i.e. either the neuron is firing or it isn't. This basic signalling architecture (on/off) suggests the brain is like a computer, digital.
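    If it helps, the all-or-none law can be sketched as a toy threshold unit in (say) Python - the weights and threshold here are made up for illustration, not physiology:

```python
def fires(inputs, weights, threshold=1.0):
    """All-or-none law: output is 1 if the weighted sum of inputs
    reaches the threshold, otherwise 0 - nothing in between."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

print(fires([0.4, 0.3], [1.0, 1.0]))  # 0 - below threshold, no spike
print(fires([0.8, 0.9], [1.0, 1.0]))  # 1 - fires
print(fires([5.0, 5.0], [1.0, 1.0]))  # 1 - a far stronger input fires identically
```

    The point is just that a weak and a very strong supra-threshold input produce the identical binary output.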

    However, the mind has created what is probably a huge cache of vague concepts a couple of which I mentioned upstream.

    My questions are:

    1. Given the brain has a digital structure (on/off neurons) how is it that it generates vague concepts?

    2. Does the existence of vague concepts imply that the analog mind is not the same as the digital brain i.e. is the mind not the brain?
  • bongo fury
    1.6k
    let's venture into the digital, a worldTheMadFool

    Better, a symbol system...

    lacking the qualities of a continuumTheMadFool

    Although it might be an excision from a 'larger' analog system. (Here, page 126, though it may not show. Can't find a pdf.)

    This basic signalling architecture (on/off) suggests the brain is like a computer, digital.TheMadFool

    Yes (in the important respect you specify), but probably not a symbolic computer, processing symbols stored in a memory. Rather, a machine that can be trained to respond to stimuli in a systematic way. http://www.alanturing.net/turing_archive/pages/Reference%20Articles/what_is_AI/What%20is%20AI10.html

    However, the mind has created what is probably a huge cache of vague conceptsTheMadFool

    Do you mean a cache of symbols stored in a memory, to be accessed and read, as in a symbolic computer? I think the connectionist model largely contradicts that (still prevalent) notion.

    1. Given the brain has a digital structure (on/off neurons) how is it that it generates vague concepts?TheMadFool

    So, better to ask, how is it that it learns to participate in language games of pointing actual (not mental) words and pictures at things (in vague ways).

    2. Does the existence of vague concepts imply that the analog mind is not the same asTheMadFool

    So, imv no, it doesn't imply anything about concepts contained in a mind or a brain, because I cash out "concepts" in terms of symbols, and I don't see any need to locate them in the head.

    Perhaps you could clarify how you meant us to read "concepts"?... If not as internal symbols, which I took to be implied by "cache".
  • Gnomon
    3.8k
    1. Given the brain has a digital structure (on/off neurons) how is it that it generates vague concepts?

    2. Does the existence of vague concepts imply that the analog mind is not the same as the digital brain i.e. is the mind not the brain?
    TheMadFool
    I can't give you the answer you are looking for. But I can speculate on how a digital process can produce analog outputs. Here's a quick sketch.

    First, the brain is not purely digital. Although we think of neurons firing off & on as a digital process, the storage of that data is not digital. Instead, it is relational in a complex context. Nevertheless, everything physical is ultimately quantized into discrete on-off, positive-negative dichotomies. But when we view a cluster of microscopic objects at the macro level it appears to be continuous. That's because the mind tends to average zillions of data inputs into unified whole objects. What we call "Mind" is not an object itself, but a process of conceptualizing manifold sensory data into singular subjective meanings relative to the unitary self. [that statement may take several readings before it makes sense]

    Basically, physical Objects (e.g. brain) are quantum mechanical systems composed of discrete particles and quanta of energy. But the meta-physical Mind is holistic and continuous, because it's not a thing, but a concept, an idea. A mental concept is necessarily holistic, because when broken into parts, it loses its integrity, its meaning.

    As an analogy, think of the brain as a flashing light bulb. When the frequency of flashing exceeds your meat brain's capacity to discriminate, the light will be conceptualized as continuous. Similarly, zillions of data bits stored in the brain are reconstituted and recalled as analogous to the original unitary object. Although the concept must remain integrated, we can mentally dial it up or down so that the continuous light appears brighter or darker. In other words, we can mentally evaluate the data to suit our subjective purposes, while remaining relatively true to objectivity. Even vague concepts retain enough of their original identity, to be identified with the original object.
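    The flashing-bulb analogy can even be run as a toy calculation - averaging enough discrete on/off samples yields what behaves like a continuous brightness value (a sketch, not physiology):

```python
def perceived_brightness(flashes):
    """Average a train of discrete on/off samples (1s and 0s) into a
    single continuous-seeming brightness between 0.0 and 1.0."""
    return sum(flashes) / len(flashes)

dim    = [1, 0, 0, 0] * 250   # bulb on 25% of the time
bright = [1, 1, 1, 0] * 250   # bulb on 75% of the time

print(perceived_brightness(dim))     # 0.25
print(perceived_brightness(bright))  # 0.75
```

    Purely binary samples, yet the averaged result can sit anywhere on a graded scale.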

    So, yes, the Mind is not the Brain, but our averaged-out concept of what the brain does --- ignoring the minute digital details in favor of the whole continuous concept.

    I'd better quit while I'm behind. Because this vague reasoning could quickly fade into babbling nonsense. :cool:


    Brain analog or digital? : Information in the brain is represented in terms of statistical approximations and estimations rather than exact values.
    https://www.forbes.com/sites/quora/2016/09/27/is-the-human-brain-analog-or-digital/#26dbf1867106

    Meta-physics : Physics refers to the things we perceive with the eye of the body. Meta-physics refers to the things we conceive with the eye of the mind. Meta-physics includes the properties, and qualities, and functions that make a thing what it is. Matter is just the clay from which a thing is made. Meta-physics is the design (form, purpose); physics is the product (shape, action).
    http://blog-glossary.enformationism.info/page14.html
  • TheMadFool
    13.8k
    Sorry, couldn't catch what you were getting at. My main concern is simply if a digital system can handle analog data.

    Kindly have a look at the following.

    You agree that the brain is digital in construction - neurons are either off or on. At the very least, we could then infer that if the brain has a language, we've figured out the alphabet of this language (off/on sequences of neurons).

    We also know that we have vague concepts like tall and fat. If we accept that the brain speaks in a digital language then these vague concepts must be translatable into a digital code of off/on neuronal states.

    Vagueness is, as relevant to this discussion, an analog feature. When an analog parameter may take on any value, we have a continuum. In a continuum the differences between values close together are imperceptible, and that's the key feature of vague concepts - a vague concept may apply over a range of values that have very tiny (imperceptible) differences between them.

    Now, it must be the case that vague concepts too should be translatable into the brain's digital language of off/on neuronal states. Each combination of neuronal states is discrete from the next - we can tell them apart very easily. If that's the case, then consider the concept of height - one combination of discrete neuron states must correspond to incontrovertible gigantism (say heights at or above 7 ft) and another combination of neuron states to incontrovertible dwarfism (say less than 4 ft). There is no vagueness in either gigantism or dwarfism.

    Tallness lies between gigantism and dwarfism and is vague. Yet, tallness must have a height value fixed to it as a combination of neuronal states. Since tallness is vague, it must mean that not one but many heights are assigned to the same combination of neuronal states, and thus the thought "this person is tall", one specific combination of neuronal states, obtains whenever any of the heights that we've assigned to tall is observed.

    This, however, is a problem, for the reason that each combination of neuronal states must constitute a distinct thought; it can't be that one specific combination (a single thought) refers to two different perceptions. I mean, seeing a giant and seeing a dwarf can only differ in terms of the specific combination of neuronal states; ergo, seeing two different heights, two different perceptions, shouldn't both correspond to the same combination of neuronal states, here the same thought of tallness. Vagueness, vague concepts, are an impossibility for a digital brain, for it amounts to saying two different perceptions (two different heights) produce the same brain state, the same thought (they're both tall). Perhaps then, the mind, being capable of vagueness, is different from the brain.
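    A toy sketch of the worry, with the cut-offs (4 ft and 7 ft) taken from above - many distinct heights collapse into one and the same discrete state:

```python
def brain_state(height_ft):
    """Toy mapping from a continuous height to one of a few
    discrete 'states' - many distinct inputs share one output."""
    if height_ft >= 7.0:
        return "giant"   # incontrovertible gigantism
    if height_ft < 4.0:
        return "dwarf"   # incontrovertible dwarfism
    return "tall"        # the vague band in between

# Distinct perceptions, identical discrete state:
print(brain_state(6.1), brain_state(6.5))  # tall tall
print(brain_state(7.2))                    # giant
print(brain_state(3.5))                    # dwarf
```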
  • bongo fury
    1.6k
    My main concern is simply if a digital system can handle analog data.TheMadFool

    Ok, but data are only digital or analog relative to system. They are symbols or values that are digital when interpreted as elements of a discrete syntax or semantics, and analog when interpreted as elements of a continuous syntax or semantics.

    E.g. Goodman's example (above): any letter 'A' inscription is digital read as a token of the letter-type 'A', but analog read as one of an unlimited repertoire of distinct 'A' types, so that vagueness is entailed, at least eventually, by the uncapped expectation of finer and finer discrimination between one element and another.

    So no, a digital system is one that handles data digitally, while a different, 'larger' system may encompass the first one and treat the same data as analog.
  • bongo fury
    1.6k
    1. Given the brain has a digital structure (on/off neurons) how is it that it generates vague concepts?TheMadFool

    Leaving my better :razz: question aside, and going down your road of assuming brain states to be elements of a digital syntax, which I must admit is a respectable road for many people... are you asking whether: the evident vagueness of the (ahem) reference by (I can't believe I'm saying this) those states to external stimuli shows that the repertoire of states and their correlation with stimuli must be encompassed by some larger, inclusive analog syntax and/or semantics?
  • Gnomon
    3.8k
    If we accept that the brain speaks in a digital language then these vague concepts must be translatable into a digital code of off/on neuronal states.TheMadFool
    Maybe the Mind translates from Brain code into "Soul" meaning.

    I suspect that the unstated assumption here is that the conscious Mind communicates with itself in the same digital language that the subconscious Brain uses within its modules. But I'm guessing that the Mind uses a holistic language (words not bytes) when we are consciously thinking. That's because mechanical neural processes (digital machine language) must be converted into a form (ideas, symbols, concepts) that is usable for communicating with other external minds via human language. All spoken languages seem to have evolved from physical body gestures, such as pointing to mean "that" or "you". And gestures are inherently analog, because they can't reproduce the speed & accuracy of digital transmission.

    Therefore, gestures and words are used to symbolize the flow of millions of neural bits into generalized holistic concepts. This simplification of complexity is necessary for the same reason computer code of 1s & 0s must be translated into unbroken chunks of human language, in order for us meat machines to understand. But the precision that is lost in the translation makes our concepts fuzzier. Yet it also makes them more flexible. Hence, the broad range of meanings that you refer to as vagueness.

    NOTE : I have no expertise in such topics. But I was intrigued by your question. So I'm just exploring possible explanations off the top of my head. Does the machine Brain vs soul Mind analogy make sense?

    PS__While the electrical signals in neurons may be digital, neurotransmitters are more holistic. That's why a shot of adrenaline causes a general feeling of fear or arousal, that can only be translated into words with difficulty, because of its vagueness. Emotions are in the middle ground between digital data and conceptual words.

    PPS__Computer Programmers think in one of many human languages, then write in a specialized stripped-down programming language, and finally use a compiler to convert those one-word-one-meaning terms into strings of meaningless digits.


    Hand gestures & spoken language : https://phys.org/news/2019-08-gestures-language.html
  • Tim3003
    347
    1. Given the brain has a digital structure (on/off neurons) how is it that it generates vague concepts?TheMadFool

    How can a digitally produced image show a foggy day?
  • deletedusercb
    1.7k
    1. Given the brain has a digital structure (on/off neurons) how is it that it generates vague concepts?TheMadFool

    They're not on/off, and it's not just neurons. Google glial cells and intelligence. Glial cells have been discovered to play a role in mind and intelligence, and this runs in a different way from neurons. But even some neurons have graded responses.

    And then there's

    While action potentials are usually binary, you should note that synaptic communication between neurons is generally not binary. Most synapses work by neurotransmitters, and this is a chemically mediated graded response that, for example, acts on voltage-gated ion channels. So even though action potentials are often binary, communication between neurons is most often not, and action potential firing can involve the integration of synaptic information from many different neurons. Therefore, the brain as a whole cannot be reduced to a binary system.

    and then you have the whole endocrine system which has graded responses and this affects neurons and glial cells.
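    A toy sketch of that quoted point - graded (continuous) synaptic inputs are summed, and only the final spike decision is all-or-none; the numbers are arbitrary:

```python
def integrate_and_fire(synaptic_inputs, threshold=1.0):
    """Graded synaptic inputs are summed into a continuous membrane
    potential; only the spike decision at the end is binary."""
    membrane_potential = sum(synaptic_inputs)  # graded, not binary
    spike = membrane_potential >= threshold    # all-or-none
    return membrane_potential, spike

print(integrate_and_fire([0.3, 0.25, 0.2]))   # sub-threshold: no spike
print(integrate_and_fire([0.3, 0.45, 0.35]))  # supra-threshold: spike
```

    So the binary output rides on a continuous computation underneath, which is the quoted passage's point.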
  • bongo fury
    1.6k
    How can a digitally produced image show a foggy day?Tim3003

    :ok:
  • Possibility
    2.8k
    If I remember my human physiology classes, the language of neurons, the cells that make up the brain, is an action potential which follows the all or none law i.e. either the neuron is firing or it isn't. This basic signalling architecture (on/off) suggests the brain is like a computer, digital.

    However, the mind has created what is probably a huge cache of vague concepts a couple of which I mentioned upstream.

    My questions are:

    1. Given the brain has a digital structure (on/off neurons) how is it that it generates vague concepts?

    2. Does the existence of vague concepts imply that the analog mind is not the same as the digital brain i.e. is the mind not the brain?
    TheMadFool

    Consider this basic digital signalling structure as one dimensional information: a manifest relation between potential.

    Now consider two-dimensional information - a sequence of digital information, except integrated into the system: two different types of neurons firing (or not firing) simultaneously manifest more complex action potential than a single neuron.

    A computer can’t apply the information it receives to itself. A program can, but it can still only manifest one-dimensional information. It is only ever aware of the result: the action, not the potential.

    In this way, the brain is and is not like a computer. Matter is integrated information: an efficient interrelation of analog and digital system structures. The difference between brain/organism and mind is a difference between four and five-dimensional information systems. So, looking at the brain as digital, the mind appears analog; looking at the mind as digital, the brain appears analog.

    Just as digital information can produce the appearance of a continuum, so, too, can integrated information. Matter has gradually integrated all the complexity of information processing into its own manifest structure. At the level of mind, the system is aware of its own variable potential for action in relation to potential information - hence the vagueness.
  • TheMadFool
    13.8k
    @Coben

    Thank you all for your comments.

    I'm in a bit of a bind here. Allow me to explain.

    1. If the brain is digital then each perception and each thought corresponds to a specific combination of off/on neurons which I will call a brain state. Let's now take a concept known for its vagueness, to wit, height. A dwarf evokes a brain state and a giant, as of necessity, must elicit a different brain state, for they're two different perceptions and also different thoughts; different perceptions, different brain states, and different thoughts, different brain states.

    Tallness is a vague term - it applies not to a specific height but to a range of possible values height can assume. Suppose there's a person who considers someone tall if that person's height is between 6 ft and 8 ft. That means heights of 6.1 ft, 6.5 ft, 7 ft are all tall for this person. What is to be noted here is that each of these heights is a distinct perception and should evoke a distinct brain state, and each of these brain states should be a different thought, but this isn't the case: all of the heights 6.1 ft, 6.5 ft and 7 ft are matched not to different brain states but to the same brain state, the same thought, the thought tall. This shouldn't be possible if each brain state is a different thought, no? In other words, a digital brain with thoughts being discrete brain states shouldn't be able to generate/handle vague concepts, because if it could do that, it would imply that different brain states are not different thoughts but the same thought.

    2. Imagine a digital and an analog voltmeter (measures voltage). The analog voltmeter has a dial and is able to read any continuous voltage, but the digital voltmeter reads only discrete values such as 0, 1, 2, and so on. Now, the digital voltmeter's measurement involves rounding off voltages, and so anything less than 0.5 volts it reads as 0, anything between 0.5 and 1.5 volts it reads as 1 volt, anything between 1.5 volts and 2.5 volts it reads as 2 volts, and so on. The digital voltmeter assigns a range of continuous voltages to a discrete reading that it can display. This is vagueness. This seems to suggest that vagueness is an aspect of digital systems and so the brain, understood as functioning in discrete brain states (digitally), should generate vague concepts.
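    In code, the voltmeter I'm describing is just rounding to the nearest whole volt - a minimal sketch with the cut-offs given above:

```python
def digital_reading(voltage):
    """Round a continuous voltage to the nearest whole volt, as the
    hypothetical voltmeter does: <0.5 -> 0, 0.5..1.5 -> 1, etc.
    (int(v + 0.5) is used so 0.5 rounds up, matching those bands.)"""
    return int(voltage + 0.5)

for v in (0.4, 0.6, 1.2, 1.9, 2.4):
    print(v, "->", digital_reading(v))  # 0, 1, 1, 2, 2
```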

    1 & 2 seem to contradict each other. Comments...
  • bongo fury
    1.6k
    1. If the brain is digital then each perception and each thought corresponds to a specific combination of off/on neurons which I will call brain state.TheMadFool

    Yes, let's agree on that, for the sake of argument; or we could (equivalently) discuss the "perceptions" and "thoughts" of the computer.

    A dwarf evokes a brain state and a giant, as of necessity, mustTheMadFool

    , for the sake of argument, might

    elicit a different brain state for they're two different perceptions and also different thoughts; different perceptions, different brain states and different thoughts, different brain states.

    Tallness is a vague term - it applies not to a specific height but to a range of possible values height can assume.
    TheMadFool

    That merely makes it a general term, no?

    That means heights of 6.1 ft, 6.5 ft, 7 ft are all tall for this person. What is to be noted here is that each of these heights are distinct perceptions and should evoke distinct brain states and each of these brain states should be different thoughts but this isn't the case: all of the heights 6.1 ft, 6.5 ft and 7 ft are matched to not different but the same brain state, the same thought, the thought tall.TheMadFool

    Ditto. A brain event might point its "6.1 ft" state at several different persons, and the same event might instantiate also a "tall" state which it points at these and other persons. (Fans of brain state talk may prefer "correlate with" to "point at".) Another instantiated state, "John", might be functioning as a singular term, and "6.1 ft" and "tall" as both general. Or "6.1 ft" and "6.5 ft" might be singular terms each pointing at a singular thing or value while "tall" points at both values and doubtless others.

    This shouldn't be possible if each brain state is a different thought, no?TheMadFool

    Even if you insisted (as some might but I certainly wouldn't) that no word token has the same reference as another token of the same word, that still wouldn't prevent each of them from referring generally to a host of things or values. (Likewise for states and unique brain events as for words and unique tokens.)

    In other words, a digital brain with thoughts being discrete brain states shouldn't be able to generate/handle vague concepts because if it could do that it implies different brain states are not different but the same thought.TheMadFool

    Again, would you substitute "general" for "vague", here? And if not, why not? Either way, this is a point worth debating, but I think it is about generality not vagueness.

    2. Imagine a digital and an analog voltmeter (measures voltage). The analog voltmeter has a dial and is able to read any continuous voltage but the digital voltmeter reads only discrete values such as 0, 1, 2, and so on. Now, the digital voltmeter's measuring involves rounding off voltages and so anything less than 0.5 volts it reads as 0 and anything between 0.5 and 1.5 volts it reads as 1 volt and anything between 1.5 volts and 2.5 volts it reads as 2 volts and so on. The digital voltmeter assigns a range of continuous voltages to a discrete reading that it can display. This is vagueness.TheMadFool

    Yes, it entails vagueness if the discrete values are assumed to represent all possible voltages, but usually no, because margins of error are implied in the digitizing which make it feasible to prevent 0 from overlapping with 1, etc. Hence the inherent fidelity of digital reproduction, which amounts to spell checking.
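    A sketch of the margin-of-error point: so long as noise stays inside the guard band around a discrete level, re-quantizing restores the original value exactly - the "spell checking" fidelity; the noise figures below are illustrative:

```python
def requantize(signal):
    """Snap a noisy analog value back to the nearest discrete level."""
    return int(signal + 0.5)

original = 2
noisy = original + 0.3           # noise within the +/-0.5 guard band
recovered = requantize(noisy)
print(recovered == original)     # True - the error is erased, not accumulated

too_noisy = original + 0.6       # noise past the guard band
print(requantize(too_noisy))     # 3 - now the copy is wrong
```

    Vagueness only threatens at the borders of those guard bands, which is the point about the "margins of the error-margins".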

    This seems to suggest that vagueness is an aspect of digital systemsTheMadFool

    It's always an issue lurking at the borders (margins if you like) of the error-margins, and when considering the whole of the digitizing (or reverse) process.

    and so, the brain, understood as functioning in discrete brain states (digitally), should generate vague concepts.TheMadFool

    Probably. Depends how you clarify that. Read ↪Tim3003's question, and I recommend also this answer.

    1 & 2 seem to contradict each other.TheMadFool

    With appropriate revisions we hope not. :smile:
  • Possibility
    2.8k
    1. If the brain is digital then each perception and each thought corresponds to a specific combination of off/on neurons which I will call brain state. Let's now take a concept known for its vagueness to wit height. A dwarf evokes a brain state and a giant, as of necessity, must elicit a different brain state for they're two different perceptions and also different thoughts; different perceptions, different brain states and different thoughts, different brain states.

    Tallness is a vague term - it applies not to a specific height but to a range of possible values height can assume. Suppose there's a person who considers someone tall if that person's height is between 6 ft and 8 ft. That means heights of 6.1 ft, 6.5 ft, 7 ft are all tall for this person. What is to be noted here is that each of these heights are distinct perceptions and should evoke distinct brain states and each of these brain states should be different thoughts but this isn't the case: all of the heights 6.1 ft, 6.5 ft and 7 ft are matched to not different but the same brain state, the same thought, the thought tall. This shouldn't be possible if each brain state is a different thought, no? In other words, a digital brain with thoughts being discrete brain states shouldn't be able to generate/handle vague concepts because if it could do that it implies different brain states are not different but the same thought.
    TheMadFool

    I don’t think it’s that simple. Tallness is a relative term - it applies not just to a specific height, or even a range of possible values height can assume. It also includes other relative information to which a height value can be attributed.

    I think ‘tall’ if I need to tilt my head back to make eye contact or see the top, or if a child’s height seems out of proportion to their age in my experience, or if someone or something stands noticeably above those around them. I also think ‘tall’ when my 16yr old stands beside me while I’m not wearing heels - even though she would argue that she’s short in relation to her classmates.

    But while they refer to the same concept, they are not the same thought or perception and not the same brain state. A particular brain state relates to a perception or thought, but isn’t equal to it. Tallness as a vague concept is a pattern of on/off neurons common to a range of different perceptions, different thoughts. Lisa Feldman Barrett’s book ‘How Emotions Are Made’ gives a better explanation than I can offer of the vagueness of conceptual structures from a neuroscience perspective.

    2. Imagine a digital and an analog voltmeter (measures voltage). The analog voltmeter has a dial and is able to read any continuous voltage but the digital voltmeter reads only discrete values such as 0, 1, 2, and so on. Now, the digital voltmeter's measuring involves rounding off voltages and so anything less than 0.5 volts it reads as 0 and anything between 0.5 and 1.5 volts it reads as 1 volt and anything between 1.5 volts and 2.5 volts it reads as 2 volts and so on. The digital voltmeter assigns a range of continuous voltages to a discrete reading that it can display. This is vagueness. This seems to suggest that vagueness is an aspect of digital systems and so, the brain, understood as functioning in discrete brain states (digitally), should generate vague concepts.

    1 & 2 seem to contradict each other. Comments...
    TheMadFool

    I’ve not seen a digital voltmeter that rounds off voltages as you describe, but that’s not to say they don’t exist. Most that I’ve seen give much more accurate information (to several decimal places) than analog versions (which rely on the naked eye), and also employ a variety of system structures to improve resolution and reduce error in converting the analog signal to a digital value.

    You make a point that the ‘continuous’ signal of an analog system implies the existence of missing information between these digital values. The discrete nature of space at a quantum level, however, suggests that an analog signal is not as ‘continuous’ as it appears...
  • Gnomon
    3.8k
    This seems to suggest that vagueness is an aspect of digital systems and so, the brain, understood as functioning in discrete brain states (digitally), should generate vague concepts.TheMadFool

    Your question may be related to the "Hard Problem" of Consciousness: how do subjective sensations and feelings arise from objective interactions of insensate matter? The answer is not likely to be discovered by reductive analysis, but by holistic synthesis. In any case, it's not easily understood, and may require a new Einstein to simplify it into a formula that can be printed on a T-shirt. However, I suspect that both continuous & discontinuous phenomena are indeed an inherent aspect of our physical (matter) and metaphysical (mind, math) world.

    Some of my own speculations on such questions are derived from the physical phenomenon of Phase Change, and the psychological feature of Model Dependent Realism. Phase transitions (solid, liquid, gas) are common in physics, but poorly understood. Where do the new properties of ice come from, if they are not somehow inherent-but-invisible in the previous form? My (long) answer would be based on the Enformationism theory [Metaphysical Information is more fundamental and essential than Physical Matter]. Model based realism assumes that the human mind does not sense ultimate reality, but constructs its own model from bits & pieces of the whole puzzle. This is a form of Holism [as defined by Jan Smuts, not by New Agers] in which disparate parts add-up to something more than the numerical sum. Philosophical Holism is an alternative perspective to scientific Reductionism. It also involves a "progressive grading of wholes", which Koestler called "holons".

    Another aspect of your question may be found in the math of Fuzzy Logic. Ironically, programmers have learned how to make digital computers think like analog humans, by layering many-valued (fuzzy) logic on top of the otherwise discrete language of 1s & 0s. The results are less certain, but more broadly useful and meaningful. The vague answers must be interpreted like poetry instead of mathematics. On Star Trek, Mr. Data's digital-logic required a special Emotion Chip in order to deal with the imprecision of human feelings. :nerd:
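    A minimal fuzzy-logic sketch of a graded "tall" predicate (the breakpoints are invented for illustration): instead of flipping at a hard cut-off, membership ramps continuously from 0 to 1.

```python
def tall_membership(height_ft, low=5.5, high=6.5):
    """Fuzzy membership in 'tall': 0 below low, 1 above high,
    a linear ramp in between - degrees of truth, not on/off."""
    if height_ft <= low:
        return 0.0
    if height_ft >= high:
        return 1.0
    return (height_ft - low) / (high - low)

for h in (5.0, 6.0, 6.5, 7.0):
    print(h, tall_membership(h))  # 0.0, 0.5, 1.0, 1.0
```

    A digital machine computes these in-between truth values happily, which is why fuzzy control systems run on ordinary binary hardware.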


    Phase Transitions : https://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780195392043.001.0001/oxfordhb-9780195392043-e-6

    Model Dependent Realism : http://bothandblog6.enformationism.info/page21.html

    Holism : https://en.wikipedia.org/wiki/Holism_and_Evolution

    Fuzzy Logic : https://en.wikipedia.org/wiki/Fuzzy_logic
  • TheMadFool
    13.8k
    Another aspect of your question may be found in the math of Fuzzy Logic. Ironically, programmers have learned how to make digital computers think like analog humans, by applying Boolean Algebra algorithms to the otherwise discrete language of 1s & 0s.Gnomon

    @Possibility

    That merely makes it a general term, no?bongo fury

    Again, would you substitute "general" for "vague", here? And if not, why not? Either way, this is a point worth debating, but I think it is about generality not vagueness.bongo fury

    Generality is different from vagueness. The former (generality) is based on similarities in a class of objects, and this isn't true of the latter (vagueness), whose defining feature is an inability to discern differences; in other words, a vague concept doesn't apply to a group of individual instantiations of whatever it is that's vague because of their similarities; au contraire, it's that the mind is unable to distinguish their differences.

    Probably. Depends how you clarify that. Read ↪Tim3003's question, and I recommend also this answer.bongo fury
    @Tim3003

    A foggy day's digital picture relies on the eye's inherent limitations to pull off its analog illusion. The eye, unable to make out the discrete pixel data, sees the fog as a continuum, viz. shades of white. Basically, it's the discrete digital mimicking the continuous analog. What this reveals is that our eyes have a fixed resolution capacity - any difference equal to or smaller than that and we can't tell things apart. Vague concepts are exactly like that - there are certain differences that our senses/minds can't resolve as distinct from each other. Entire chunks of, say, height data elicit the same brain state (here tallness). So, for someone, heights of 6 ft, 6.4 ft and 7 ft all correspond to the same discrete brain state (tall).
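    The fog illusion can be made concrete: an 8-bit digital gradient uses only 256 discrete gray levels, yet adjacent pixels differ by a single level, below the eye's resolution (a sketch of the usual 8-bit case):

```python
# An 8-bit grayscale "fog" gradient: only 256 discrete values, but
# neighbouring pixels differ by at most 1/255 of full brightness.
width = 256
gradient = [round(255 * x / (width - 1)) for x in range(width)]

steps = [b - a for a, b in zip(gradient, gradient[1:])]
print(max(steps))           # 1 - each step is a single, imperceptible level
print(len(set(gradient)))   # 256 distinct discrete values
```

    Fully discrete data, continuous appearance - the discrete digital mimicking the continuous analog.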

    At this point the sorites paradox enters the fray. Imagine a person X for whom a height of 6 ft and above is tall. So, a height of 6.001 ft is tall for X. Now, take a height of 5.999 ft; this, to X, would be not tall. The difference between 6 ft and 6.001 ft and the difference between 6 ft and 5.999 ft is the same, viz. an imperceptible 0.001 ft. Noting that tall and not tall are different brain states, how is it that the same imperceptible difference, 0.001 ft, causes two different brain states (6.001 ft is tall and 5.999 ft is not tall) and then the same brain state (6 ft and 6.001 ft are both tall)? It's as if the brain both perceived and failed to perceive the difference of 0.001 ft. This is inexplicable with discrete digital brain state theories, because it amounts to saying that the same tiny difference (0.001 ft) produces both the same (difference not perceived) and different brain states (difference perceived).

    One might respond by saying that, no, 5.999 ft is not perceived as not tall by X but actually as tall, since what usually happens is that the 0.001 ft difference between it and 6 ft can't be perceived. If so, then a height of 5.998 ft is also tall and so is a height of 5.997 ft, and for every tiny imperceptible change of 0.001 ft in height, X won't/shouldn't be able to say the given height isn't tall (sorites paradox). In other words, heights that differ by 0.001 ft should evoke the exact same brain state, as was the case between 6 ft and 6.001 ft and between 5.999 ft and 6 ft (they're all tall for X).

    As the entire height continuum may be conceived of as a series of values that differ only by the imperceptible value 0.001 ft, doesn't it imply that the entire height continuum should evoke the exact same brain state? In X's case, all heights should be tall - one and the same brain state - but that isn't the case, because X has the concepts short, normal and tall, which are distinct brain states.

    Since the concepts short, normal, and tall (distinct brain states) exist, it implies that the difference 0.001 ft is not imperceptible, i.e. we can tell apart height values that differ by 0.001 ft (the mystery deepens). If the brain is operationally discrete and digital then each brain state is a different thought; and since every height value is perceived as distinct from other values, it follows that each height value elicits a distinct brain state, and that means a distinct thought. It can't be that a range of height values is assigned to one and the same brain state - the same thought - which is the defining characteristic of vagueness. Hence, for an operationally discrete and digital brain, vagueness is impossible.

    Summary:

    1. For X, 6 ft is tall and 6.001 ft is also tall (applicable to all vague concepts mutatis mutandis)

    2. Either 5.999 ft and 6 ft are both considered tall by X or 5.999 ft is not tall and 6 ft is tall for X

    3. If 5.999 ft is not tall and 6 ft is tall for X then it implies the difference (0.001 ft) is both perceived (5.999 ft is not tall and 6 ft is tall) and not perceived (6 ft is tall and 6.001 ft is tall). [consequent is a contradiction]

    4. Not the case that the difference (0.001 ft) is both perceived (5.999 ft is short and 6 ft is tall) and not perceived (6 ft is tall and 6.001 ft is tall)

    5. Not the case that 5.999 ft is not tall and 6 ft is tall (3, 4 modus tollens)

    6. If not the case that 5.999 ft is not tall and 6 ft is tall then both 5.999 ft and 6 ft are tall

    7. If both 5.999 ft and 6 ft are tall then all heights are tall (sorites paradox)

    8. If all heights are tall then the concepts tall, short, normal shouldn't exist

    9. The concepts tall, short, normal exist

    10. It's not the case that all heights are tall (8, 9 modus tollens)

    11. It's not the case that both 5.999 ft and 6 ft are tall (7, 10 modus tollens)

    12. If it's not the case that both 5.999 ft and 6 ft are tall then the tiny difference 0.001 ft is perceptible

    13. The tiny difference 0.001 ft is perceptible (11, 12 modus ponens)

    14. If the tiny difference 0.001 ft is perceptible then each possible height value (differing from the next by only 0.001 ft) should produce a distinct brain state distinguishable from other height-brain states.

    15. Each possible height value (differing from the next by only 0.001 ft) should produce a distinct brain state distinguishable from other height-brain states. (13, 14 modus ponens)

    16. If the brain is operationally discrete/digital then each distinct brain state is a distinct thought

    17. The brain is operationally discrete and digital (assume for conditional proof)

    18. Each distinct brain state is a distinct thought

    19. If each possible height value (differing from the next by only 0.001 ft) should produce a distinct brain state distinguishable from other height-brain states AND each distinct brain state is a distinct thought then each height value should produce a distinct thought

    20. Each height value produces a distinct thought (15, 18, 19 modus ponens)

    21. If each height produces a distinct thought then it's impossible that a range of height values can produce the same thought.

    22. It's impossible that a range of height values can produce the same thought (20, 21, modus ponens)

    23. A range of height values producing the same thought is vagueness (for X, the range of height values 6 ft, 6.001 ft, 5.999 ft are all tall)

    24. Vagueness is impossible (for an operationally discrete and digital brain) (22, 23 identity)

    25. If the brain is operationally discrete and digital then vagueness is impossible (for an operationally discrete and digital brain) (17 to 24 conditional proof)

    26. Vagueness is impossible (for an operationally discrete and digital brain)

    27. If vagueness is impossible (for an operationally discrete and digital brain) then, if vague concepts exist then the mind is different to the brain

    28. If vague concepts exist then the mind is different to the brain (26, 27 modus ponens)

    29. Vague concepts exist

    30. The mind is different to the brain (28, 29 modus ponens)
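    The threshold picture the argument trades on can be sketched as a toy model. With a hard cut-off, stepping the height down by the same "imperceptible" 0.001 ft sometimes leaves the state unchanged and at exactly one point flips it - the tension between premises 1 and 3 above. The 6 ft cut-off and 0.001 ft step are just the figures from X's example:

    ```python
    # Sketch of the sorites march: a hard-threshold classifier for "tall".
    # TALL_THRESHOLD and STEP come from the example in the post, nothing more.

    TALL_THRESHOLD = 6.0   # X's cut-off
    STEP = 0.001           # the "imperceptible" difference

    def is_tall(height_ft: float) -> bool:
        return height_ft >= TALL_THRESHOLD

    # Walk down from 6.001 ft in 0.001 ft steps.
    heights = [round(6.001 - k * STEP, 3) for k in range(5)]  # 6.001 ... 5.997
    labels = [(h, is_tall(h)) for h in heights]
    print(labels)
    # The same 0.001 ft step leaves the state unchanged (6.001 -> 6.000, both
    # tall) yet also changes it (6.000 -> 5.999, tall -> not tall): the one
    # point where the discrete classifier "perceives" the tiny difference.
    ```

    A hard threshold dissolves the paradox formally but only by positing a sharp boundary that no one seems to perceive, which is exactly what the argument above contests.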
  • bongo fury
    1.6k


    The surprising thing about a sorites puzzle, or indeed any discussion of vagueness, is that you don't need any analog. Analog is sufficient but not necessary for vagueness. You only require a digitally defined series (of discrete but plausibly imperceptibly different values, like your heights to 3 d.p.) and two or three vague adjectives like tall, medium and short.
  • Possibility
    2.8k
    A foggy day's digital picture relies on the eye's inherent limitations to pull off its analog illusion. The eye, unable to make out the discrete pixel data sees the fog as a continuum viz. shades of white. Basically, it's the discrete digital mimicking the continuous analog. What this reveals is that our eyes have a fixed resolution capacity - any difference equal to or smaller than that and we can't tell things apart . Vague concepts are exactly like that - there are certain differences that our senses/minds can't resolve as distinct from each other. Entire chunks of, say, height data elicit the same brain state (here tallness). So, for someone, heights of 6 ft, 6.4 ft and 7ft, all, correspond to the same discrete brain state (tall).TheMadFool

    The surprising thing about a sorites puzzle, or indeed any discussion of vagueness, is you don't need any analog. Analog is sufficient but not necessary for vagueness. You only require a digitally defined series (of discrete but plausibly imperceptibly different values, like your heights to 3 d.p.) and two or three vague adjectives like tall, medium and short.bongo fury

    There is no connection being made here between a digital mind and a digital brain as yet - only between the mind and ‘brain states’ as a concept of mental events. Whether employing a digital or analog perspective of ‘vagueness’, our understanding fails in the ‘conversion process’ between mind and brain. From a digital perspective of mind, there is no continuity - only discrete values and concepts - and so the brain appears relatively analog in nature.

    From a digital or quantum perspective of the brain, however (if we assume the granular nature of time proposed by QFT), then the discrete nature of ‘brain states’ in relation to the ‘vague concepts’ to which they correspond is surprisingly similar to particle-wave duality. I may be completely ignorant of problems with this, but isn’t it possible then to look at ‘brain states’ in the same way that we look at photons?
  • Streetlight
    9.1k
    Quoting from Anthony WIlden's System and Structure:

    "[Von Neumann] points out that what he calls the prima facie digital behavior of the neuron is a simplification. It is true that neurons either fire or do not fire, but this firing may be modified by the recovery time of the neuron. Similarly, a neuron may represent a simple, two-valued logical network: its firing after a combined and/or synchronized stimulation by two connecting synapses represents 'and', and its firing after stimulation from one or the other of two synapses represents 'or'. But most neurons embody synaptic connections with many other neurons. In some cases, several connecting axons or branches (ending in synapses) from one neuron form synapses on the body of another. Moreover, the axons themselves may stimulate or be stimulated by their neighborhood, the 'impulse' then travelling in both directions, towards the neuron and towards the synapse. Thus, quite apart from the estimated 10^12 synaptic connections in the network, and without considering the dendrites or the phenomenon of direct axonal stimulation, the possible patterns of stimulation do not involve only the so-called 'impulse'.

    These patterns probably also include the frequency of the series of impulses in a single axon, the synchronization of impulses from different axons, the number of impulses, and the spatial arrangement of the synapses to which the impulses arrive, as well as the so-called summation time. (This, again, is quite apart from the interrelated physical, mechanical, chemical, and electrical processes in the axon which propagate the message). Some of these aspects, such as frequency, spatial arrangement, and the chemical processes, are analogs.

    Von Neumann also points to the constant switching between the analog and the digital in the behavior of the message systems of the body at another level: a digital command releases a chemical compound which performs some analog function or other, this release or its result is in turn detected by an internal receptor neuron which sends a digital signal to command the process to stop or sets off some other process, and so on... It has been suggested that we think of these processes not in terms of 'impulses', which imply a basically energetic model of what is obviously an information system (which 'triggers' energy in order for 'work' to be done), but rather in terms of logical types and classes. The neuron could be said to fire or not to fire if and only if the requisite analog and digital logical arrangements have been completed".

    That's how.
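    The two-valued logic mentioned in the quote can be illustrated with a toy all-or-none unit: a "neuron" fires iff its summed synaptic input reaches a threshold, so the same unit realizes 'and' with threshold 2 and 'or' with threshold 1. The weights and thresholds here are illustrative, not physiological:

    ```python
    # Toy McCulloch-Pitts-style unit, per the simplification von Neumann
    # discusses: all-or-none firing from summed binary inputs.

    def fires(inputs, threshold: int) -> bool:
        """All-or-none unit: fire iff enough synapses are active."""
        return sum(inputs) >= threshold

    # Two synapses onto one neuron: threshold 2 gives 'and', threshold 1 'or'.
    AND = lambda a, b: fires((a, b), threshold=2)
    OR = lambda a, b: fires((a, b), threshold=1)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, AND(a, b), OR(a, b))
    ```

    The quote's point is that this is where the simplification stops: real neurons add recovery times, firing frequencies, summation times and chemical analogs that this binary sketch leaves out.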
  • TheMadFool
    13.8k
    The surprising thing about a sorites puzzle, or indeed any discussion of vagueness, is you don't need any analog. Analog is sufficient but not necessary for vagueness. You only require a digitally defined series (of discrete but plausibly imperceptibly different values, like your heights to 3 d.p.) and two or three vague adjectives like tall, medium and short.bongo fury

    I did mention that if discrete data are sufficiently close in value then they will appear analog. Taking that to its logical conclusion, we'd have to say that the analog continuous is nothing more than the digital discrete, just with tinier steps in between. However, this would be an error, because analog continuous data can take on any value, even values in between the smallest well-defined interval a discrete digital system can manage.

    The sorites paradox is usually spoken of in terms of the discrete (n, then n + 1, then (n + 1) + 1, and so on) but that's just for convenience; in addition, the paradox becomes even more problematic once we enter the analog domain.

    The problem with vague concepts, with respect to what you said, is not that we're mistaking discrete digital data for continuous analog data. Height is, without question, continuous analog, for it can assume any value; our heights don't increase in discrete intervals. The sorites paradox, then, isn't a pseudo-problem for vague concepts but in fact reveals, quite clearly in my opinion, that if the brain operates in discrete digital brain states, each such state being a different thought, then vagueness should be impossible: it entails that the same tiny difference in height elicits the same discrete brain state (the same thought) AND also that this same tiny difference in height elicits different discrete brain states (different thoughts, borderline cases).
  • I like sushi
    4.8k
    This is a tough post to answer because I’m unsure what you’re asking exactly. I can offer some information though.

    The all-or-nothing nature of firing neurons has underlying analog features - the amount of neurochemicals triggers this effect. Then there is the recycling of neurochemicals from the synaptic clefts. Needless to say, the threshold required to trigger a neuron firing is certainly not dictated purely by the amount of neurochemicals present in the synaptic cleft - Brownian motion plays a part.

    Other than this, if we look at GABA neurons (the most common neural inhibitors), they function by effectively stopping other neurons from firing when they fire. In simplistic terms the cortex acts as an inhibitor to the midbrain (VERY simplistic, but generally what happens). You can think of this as pure animalistic reactions being inhibited by how sensory input is translated in the cortex - some sensory inputs bypass the brain completely (this is why paralysed people put on treadmills can ‘walk’).

    Another thing I’d like to point out is the possible mistake of delineating the ‘brain’ from the ‘body’. The brain is a network that extends to your toes. A ‘brain’ without a body is just pink mush.

    Lastly, the term ‘mind’ is dubious at best. The very term itself is ‘vague’. Without a ‘vague’ concept of something we’d have literally nothing to investigate, and if we had nothing to investigate we wouldn’t be conscious of anything. Essentially, ‘knowing’ is ‘questioning/doubting’ that, for navigational purposes, is framed in the concept of ‘certainty’ in some instances, so we have a sense of orientation in order to explore.
  • bongo fury
    1.6k
    Sure, and many people here in this thread have made the obvious but important point that...

    1. Given the brain has a digital structure (on/off neurons)TheMadFool

    is a contestable premise. But also...

    Yes, let's agree on that, for the sake of argument; or we could (equivalently) discuss the "perceptions" and "thoughts" of the computer.bongo fury

    Because then,

    how is it that it generates vague concepts?TheMadFool

    is a good question, requiring but potentially also stimulating clarification of the terms in play. One can but hope.

    The sorites paradox then [...] reveals, quite clearly in my opinion, that if the brain operates in discrete digital brain states [...] then, vagueness should be impossibleTheMadFool

    ... or will look curious and paradoxical, sure. I recommend it (the paradox) as an anvil on which to refashion your ideas about this topic. Equally,

    How can a digitally produced image show a foggy day?Tim3003

    Either way, try to learn to bash the ideas one at a time.
  • TheMadFool
    13.8k
    The two of you are saying that there are analog processes involved in thinking. However, these analog processes are irrelevant because what counts here are thoughts, and thoughts are discrete combinations of on/off neurons. It may be that this isn't the way the brain works, but to my knowledge thinking is the work of neurons, and their function, the last I heard, consists of either on (firing) or off (not firing).

    @I like sushi: The mind isn't a vague concept, at least not so vague as to be utterly useless for discourse. Here's a working definition from Google:

    Mind (noun): the element of a person that enables them to be aware of the world and their experiences, to think, and to feel; the faculty of consciousness and thought
  • TheMadFool
    13.8k
    :chin: What do you mean?
  • Streetlight
    9.1k
    The two of you are saying that there are analog processes involved in thinking. However, these analog processes are irrelevant because what counts here are thoughts and thoughts are discrete combinations of on/off neurons.TheMadFool

    These two sentences literally contradict each other. Read: sentence two is wrong.
  • I like sushi
    4.8k
    Well, you’re leaving the chaotic environment out. The world is part of how we think. GABA neurons inhibit, but if only some fire then they don’t impact the next neurons. The reasons they fire or not are due to ‘external’ conditions.

    Thinking requires constant input. We’re not brains in vats. Also, what exactly do you mean by ‘thinking’? Some people regard thinking as ‘worded thought’ only. That could also be a confusion.

    When it comes to words like ‘tall’ and ‘fat’ it probably helps if we assess the different types of antonyms involved.

    Note: People can only come to disagree from some conscious point of agreement (eg. That we’re awake and conscious of each other as human beings with different views and beliefs).
  • TheMadFool
    13.8k
    These two sentences literally contradict each other. Read: sentence two is wrong.StreetlightX

    No, they don't. Electric current is analog but a computer is digital. The same may be true of the brain: neurotransmitter concentrations are analog but the neurons are digital (on/off).
  • Streetlight
    9.1k
    The same maybe true of the brainTheMadFool

    It isn't. That's the point.
  • TheMadFool
    13.8k
    Well, you’re missing the chaotic environment out. The world is part of how we think. GABA neurons inhibit, but if only some fire then they don’t impact the next neurons. The reasons they fire or not are due to ‘external’ conditions.

    Thinking requires constant input. We’re not brains in vats. Also, what exactly do you mean by ‘thinking’? Some people regard thinking as ‘worded thought’ only. That could also be a confusion.

    When it comes to words like ‘tall’ and ‘fat’ it probably helps if we assess the different types of antonyms involved.

    Note: People can only come to disagree from some conscious point of agreement (eg. That we’re awake and conscious of each other as human beings with different views and beliefs).
    I like sushi

    I don't see your point at all :chin:
  • TheMadFool
    13.8k
    It isn't. That's the point.StreetlightX

    I could accept that as a possibility.
  • Streetlight
    9.1k
    Reality doesn't give a shit what you accept.