1. If the brain is digital then each perception and each thought corresponds to a specific combination of off/on neurons which I will call brain state. — TheMadFool
Yes, let's agree on that, for the sake of argument; or we could (equivalently) discuss the "perceptions" and "thoughts" of the computer.
A dwarf evokes a brain state and a giant, as of necessity, must elicit a different brain state for they're two different perceptions and also different thoughts; different perceptions, different brain states and different thoughts, different brain states. — TheMadFool

For "as of necessity, must" I would substitute ", for the sake of argument, might".
Tallness is a vague term - it applies not to a specific height but to a range of possible values height can assume. — TheMadFool
That merely makes it a general term, no?
That means heights of 6.1 ft, 6.5 ft, 7 ft are all tall for this person. What is to be noted here is that each of these heights are distinct perceptions and should evoke distinct brain states and each of these brain states should be different thoughts but this isn't the case: all of the heights 6.1 ft, 6.5 ft and 7 ft are matched to not different but the same brain state, the same thought, the thought tall. — TheMadFool
Ditto. A brain event might point its "6.1 ft" state at several different persons, and the same event might also instantiate a "tall" state which it points at these and other persons. (Fans of brain-state talk may prefer "correlate with" to "point at".) Another instantiated state, "John", might be functioning as a singular term, with "6.1 ft" and "tall" both functioning as general terms. Or "6.1 ft" and "6.5 ft" might be singular terms, each pointing at a single thing or value, while "tall" points at both values and doubtless others.
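The many-to-one pattern here can be sketched in a few lines of code. This is only an illustration of a general term covering a range of singular values; the 6 ft threshold is my own assumption for the sketch, not something fixed by the discussion:

```python
def is_tall(height_ft: float, threshold_ft: float = 6.0) -> bool:
    """A general term: 'tall' applies to every height at or above some
    threshold. The threshold of 6.0 ft is an illustrative assumption."""
    return height_ft >= threshold_ft

# Three distinct singular values, one general predicate:
heights = [6.1, 6.5, 7.0]
all_tall = all(is_tall(h) for h in heights)  # True: many values, same term
```

Nothing in the sketch prevents each distinct height from also being represented by its own singular state; the general predicate simply points at all of them at once.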
This shouldn't be possible if each brain state is a different thought, no? — TheMadFool
Even if you insisted (as some might but I certainly wouldn't) that no word token has the same reference as another token of the same word, that still wouldn't prevent each of them from referring generally to a host of things or values. (Likewise for states and unique brain events as for words and unique tokens.)
In other words, a digital brain with thoughts being discrete brain states shouldn't be able to generate/handle vague concepts because if it could do that it implies different brain states are not different but the same thought. — TheMadFool
Again, would you substitute "general" for "vague", here? And if not, why not? Either way, this is a point worth debating, but I think it is about generality not vagueness.
2. Imagine a digital and an analog voltmeter (measures voltage). The analog voltmeter has a dial and is able to read any continuous voltage but the digital voltmeter reads only discrete values such as 0, 1, 2, and so on. Now, the digital voltmeter's measuring involves rounding off voltages and so anything less than 0.5 volts it reads as 0 and anything between 0.5 and 1.5 volts it reads as 1 volt and anything between 1.5 volts and 2.5 volts it reads as 2 volts and so on. The digital voltmeter assigns a range of continuous voltages to a discrete reading that it can display. This is vagueness. — TheMadFool
Yes, it entails vagueness if the discrete values are assumed to represent all possible voltages; but usually no, because margins of error are implied in the digitizing, and these make it feasible to prevent 0 from overlapping with 1, etc. Hence the inherent fidelity of digital reproduction, which amounts to spell-checking.
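The rounding scheme in the quoted voltmeter example can be written out directly. A minimal sketch, using exactly the thresholds TheMadFool gives ([0, 0.5) → 0, [0.5, 1.5) → 1, [1.5, 2.5) → 2, and so on):

```python
def digital_reading(voltage: float) -> int:
    """Map a continuous non-negative voltage to the nearest whole-volt
    reading, per the quoted example's thresholds (round half up)."""
    return int(voltage + 0.5)

# Many distinct continuous inputs collapse onto one discrete reading:
readings = {v: digital_reading(v) for v in (0.4, 0.6, 1.0, 1.4, 1.6, 2.4)}
# 0.4 -> 0; 0.6, 1.0, 1.4 -> 1; 1.6, 2.4 -> 2
```

The many-to-one collapse is real, but note that it is lossy measurement, not vagueness in the reading itself: each discrete output is perfectly crisp, and the margin-of-error point above is what keeps 0 and 1 from overlapping.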
This seems to suggest that vagueness is an aspect of digital systems — TheMadFool
It's always an issue lurking at the borders (margins if you like) of the error-margins, and when considering the whole of the digitizing (or reverse) process.
and so, the brain, understood as functioning in discrete brain states (digitally), should generate vague concepts. — TheMadFool
Probably. It depends on how you clarify that. Read the question, and I also recommend this answer.
1 & 2 seem to contradict each other. — TheMadFool
With appropriate revisions we hope not.
:smile: