It rained in Oxford every day this week: not surprising, very little information
It rained in the Sahara desert: surprising, high information content
Information: Shock/surprise value — TheMadFool
"It rained in Oxford" has the same degree of information as does "it rained in the Sahara". — Metaphysician Undercover
"Dog" is only redundant when "poodle" is related to something else such as a definition — Metaphysician Undercover
But if each word of the message needs to be related to something else, like a definition — Metaphysician Undercover
Then why is it surprising that it rained in the Sahara and not that it rained in Oxford? — TheMadFool
I admit that I'm not sure about the logic behind why the shocking/surprising is treated as having more information, but if I were to hazard a guess, it's got to do with what people refer to as the baseline - the greater the deviation from it, the more bits (units of information) are necessary to code it into a message, and the shocking/surprising certainly are major excursions from the..er...baseline, right? Food for thought: why is news "news"? New, shocking, surprising, out of the ordinary,... — TheMadFool
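TheMadFool's "deviation from the baseline" hunch is close to the textbook notion of surprisal, I(x) = -log2 p(x): the rarer the event, the more bits it takes to report it. A minimal sketch, with made-up rain probabilities purely for illustration:

```python
import math

def surprisal(p: float) -> float:
    """Shannon surprisal in bits: the information carried by an event of probability p."""
    return -math.log2(p)

# Illustrative, invented probabilities of rain on a given day:
print(f"Oxford (p = 0.5):   {surprisal(0.5):.2f} bits")    # 1.00 bit - no surprise
print(f"Sahara (p = 0.001): {surprisal(0.001):.2f} bits")  # ~9.97 bits - big news
```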
Claude Shannon's information theory assumes that we've already passed those waypoints in our quest to understand, quantify, and efficiently transmit information. Shannon's information theory is, whatever else it might be, not a philosophical inquiry into information and so we had best refrain from raising philosophical objections to it - that would be like trying to diagnose a uterine malady in a man. — TheMadFool
As I pointed out, the surprisingness is only related to external information concerning the frequency of rain in these places; it has nothing to do with any supposed information within the message. — Metaphysician Undercover
This is evidence of what I said: the "information", as the word is used here, is not within the message; it is in how the message is related to the "baseline" — Metaphysician Undercover
If the accepted "information theory" represents information in a way other than the way that we normally use the word "information", and cannot account for the existence of information, according to how we normally use the word, as that which is transmitted in a message, then surely we are justified in "raising philosophical objections to it".
What I am saying therefore, is that Shannon's "information theory" does not deal with "information" at all, as we commonly use the word. If we do not recognize this, and the ambiguity which arises, between the common use, and the use within the theory, we might inadvertently equivocate and think that the theory deals with "information" as what is referred to when we commonly use the word to refer to what is inherent within a message. — Metaphysician Undercover
What else could surprising/shocking mean? Also, what do you mean by "it has nothing to do with any supposed information within the message"? How would you come by information without a message and a medium for that message? If for instance, I read about rain in the Sahara, the message is the article on what is indeed a very rare event and the medium is the paper I'm reading. :chin: — TheMadFool
Glad that you figured that out. — TheMadFool
Thanks for your patience. I do agree that Claude Shannon's theory is not the only game in town insofar as information is concerned. I remember reading about another less-popular theory that's also out there. However, in the universe of computers, the world of 1's and 0's, in which it's almost a given that one binary state (1 or 0) should correspond to 1 unit of information, Claude Shannon's conceptualization of information as a process of reducing uncertainty about which alternative is true among [idealized] equiprobable alternatives has a natural and intuitive feel to it. The least amount of uncertainty happens when we have two alternatives (0 or 1), and knowing that 1 or 0 is the case reduces the uncertainty to zero (only 1 of the two alternatives remains); this suggests that, for computers at least, a 1 or a 0 should count as 1 unit of information. If the uncertainty is 4 alternatives, you would need 2 units of information to bring the uncertainty down to zero, and if the uncertainty is 8 alternatives, you'd need 3 units of information to make the uncertainty = 0. That means the information content of a message that whittles down N equiprobable alternatives to 1 is log2(N). This is a perfect fit for what I said a few lines above - that a 1 or a 0 should count as 1 unit of information, as log2(2) = 1. — TheMadFool
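The arithmetic in the post above checks out and generalizes: singling out one of N equiprobable alternatives takes log2(N) bits. A minimal sketch (the function name is mine):

```python
import math

def bits_to_resolve(n_alternatives: int) -> float:
    """Bits of information needed to pick one of n equiprobable alternatives."""
    return math.log2(n_alternatives)

for n in (2, 4, 8, 16):
    print(f"{n} alternatives -> {bits_to_resolve(n):g} bit(s)")
# 2 -> 1, 4 -> 2, 8 -> 3, 16 -> 4
```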
By the way, this just popped into my mind. Information is, in some sense, the opposite of ignorance, and ignorance can be thought of as uncertainty among given alternatives. E.g. if I don't know, i.e. I'm ignorant of (I have no information on), who invented information theory, then this state of not knowing can be expressed as consisting of the following equiprobable alternatives, just as Claude Shannon theorized: Vint Cerf OR Larry Page OR Mark Zuckerberg OR Claude Shannon... — TheMadFool
Figured what out, that Shannon is using "information" in a way which is completely inconsistent with common usage? I said that right from the beginning. The question is have you figured that out yet? — Metaphysician Undercover
The issue now is the relationship between uncertainty and information. In the normal, common expression of "information", some degree of uncertainty is inherent within the information itself, as ambiguity. In the way that you describe Shannon's expression of "information", information is the process which excludes uncertainty. Do you see the difference? Now the problem with Shannon's representation is that it cannot cope with the real uncertainty which is associated with ambiguity. — Metaphysician Undercover
Again, this is not consistent with the common usage of "information". In common usage information is what informs a person, to deliver one from ignorance, and so being informed is the opposite of ignorance, but information, as that which informs, is not itself the opposite of ignorance. So, that information is the opposite to ignorance, is a category mistake relative to the common usage of "information". — Metaphysician Undercover
If the accepted "information theory" represents information in a way other than the way that we normally use the word "information", and cannot account for the existence of information, according to how we normally use the word, as that which is transmitted in a message, then surely we are justified in "raising philosophical objections to it". — Metaphysician Undercover

Charged with maximizing the flow of communication, Shannon was interested in measuring the carrying capacity of the system, not the meaningful content of each message. That's like a shipping company, which is more interested in the potential (carrying capacity) of its empty vessels, while the shippers are interested in the cash-value (meaning) of the actual cargo. — Gnomon
See my reply to above. :smile: What is this "common usage" of "information" that you speak of? — TheMadFool
Sorry, but you seem to be contradicting yourself. Please go over your posts again. — TheMadFool
Are you implying we can cope with uncertainty? — TheMadFool
Uncertainty, ambiguity being one of its causes, comes with the territory and it can't be, to my reckoning, dealt with in a satisfactory manner by any theory of information, whether based on certainty or uncertainty, let alone Claude Shannon's. So, your criticism is more appropriate for language than Shannon's theory. — TheMadFool
What is this "common usage" of "information" that you speak of?
Google gives the following definition of information: facts provided or learned about something or someone — TheMadFool
Once you come by the information that the Eiffel tower is in Paris, the uncertainty becomes 0. — TheMadFool
Shannon's theory is probably just one of many other ways to approach the subject of information but it, for certain, captures the uncertainty aspect. — TheMadFool
Charged with maximizing the flow of communication, Shannon was interested in measuring the carrying capacity of the system, not the meaningful content of each message. That's like a shipping company, which is more interested in the potential (carrying capacity) of its empty vessels, while the shippers are interested in the cash-value (meaning) of the actual cargo. — Gnomon
Toward that end, Shannon focused on the Syntax of Information (structure ; volume) instead of its Semantics (meaning ; content). — Gnomon
By reducing Specificity, it maximizes Potential. Hence, each bit/byte, instead of carrying meaning, is an empty container capable of carrying multiple meanings. That kind of communication is good for computers -- where the translation code-key is built in -- but not for people, who can't handle uncertainty & ambiguity. — Gnomon
I would say that you might have this backward. The computer can't handle uncertainty, that's why there must be a built-in code-key to eliminate any uncertainty. People, having free will choice have no such built-in code-key, and that capacity to choose regardless of uncertainty, allows them to live with and cope with ambiguity. — Metaphysician Undercover

I agree with your version, but what I said was that "by reducing specificity" -- which increases generality -- Shannon's definition of Information "maximizes the Potential" carrying capacity (bandwidth) of a transmission. That was the point of his research. By using only an austere two digit code, instead of noisy redundant human languages, he was able to compress more information into the same pipes. Just as with Morse code though, the specific meaning is restored by translating the abstract code back into a concrete language. Only then, does it become Actual Information -- meaning in a mind; actionable knowledge. — Gnomon
I think you must have misunderstood. If you perceive a contradiction, then point it out to me so I can see what you're talking about, and maybe clarify what I meant. — Metaphysician Undercover
Yes, of course I am saying that, that's what I said in my first post in the thread, uncertainty is a fundamental aspect of language use, and we clearly cope with it. — Metaphysician Undercover
I agree with your version, but what I said was that "by reducing specificity" -- which increases generality -- Shannon's definition of Information "maximizes the Potential" carrying capacity (bandwidth) of a transmission. That was the point of his research. By using only an austere two digit code, instead of noisy redundant human languages, he was able to compress more information into the same pipes. Just as with Morse code though, the specific meaning is restored by translating the abstract code back into a concrete language. Only then, does it become Actual Information -- meaning in a mind; actionable knowledge. — Gnomon
But in order for the code to be meaningful to humans, it must be decompressed and converted back into noisy redundant "natural" language. — Gnomon
If I don't have the information on who invented the internet, does it seem ok to represent my lack of information as: Mark Zuckerberg OR Jeff Bezos OR Vint Cerf OR Bill Gates? — TheMadFool
Remember that a question is defined as a request for information and, as you can see in the above example, a question can be reduced to a list of alternatives - there are 4 for the question Q above and Q can be expressed, without any loss of meaning, as A. — TheMadFool
I don't see how you can describe that as a matter of reducing specificity for an increase in generality. — Metaphysician Undercover

Small point, likely you know this. Shannon made clear that part of efficient encoding was what the prior understandings of the sender and recipient could contribute. As an example - as you-all have observed - there is often redundancy in messages. He argued that the recipient of the message often did not need the redundancy. Taking this to the limit, he wrote that the entropy of the English language was less than two bits(!). I believe an implied qualification was that the message be sufficiently long, but maybe not.
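For context on the figure quoted above: Shannon's own estimates put English at roughly 4 bits per letter when only single-letter frequencies are counted, falling to around 1 bit per letter once a reader's knowledge of context and redundancy is exploited - the "less than two bits" regime. A minimal sketch of the no-context (first-order) estimate; the sample text and function name are mine:

```python
import math
from collections import Counter

def first_order_entropy(text: str) -> float:
    """Per-letter entropy in bits, using single-letter frequencies only (no context)."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    n = len(letters)
    return -sum((k / n) * math.log2(k / n) for k in counts.values())

sample = ("it rained in oxford every day this week "
          "it rained in the sahara desert and that was news")
# Well below log2(26) ~ 4.7 bits, because letters are not equiprobable:
print(f"{first_order_entropy(sample):.2f} bits/letter")
```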
No I don't think that would be correct. If the correct answer might be 1&2&3&4, you cannot represent it properly as 1 or 2 or 3 or 4. You need to have reason to know that it is either/or, which you don't give. — Metaphysician Undercover
The problem with this approach is that you need to know that the correct answer is within the list of options, which will only occur if you already know the correct answer. So such a request for information has an extremely limited applicability, like a multiple choice exam. — Metaphysician Undercover
You might be interested in this academic. He sounds a bit fringe to me, but I have to admit, his electromagnetic theory of consciousness seems plausible (although I must confess to scepticism about anything authored by someone who calls themselves 'Johnjoe'. :worry: ) — Wayfarer

Unfortunately, that's his real name. And he is fringey, in the sense of revolutionary. I have read a Kindle copy of his book, Quantum Evolution, because it seemed to have some parallels to my own edgey Enformationism thesis of how evolution works. He concluded that there seemed to be a "force of will" behind biological evolution. And I have concluded that the Generic Form of Information -- that I call EnFormAction -- is poetically analogous to the Will-of-God in religious myths of creation. So, I find his combination of Quantum Theory and Biology to be interesting -- and provocative, if not provable. But of course, it doesn't fit neatly into the dominant scientific worldview of Materialism. — Gnomon
Shannon might have coined the term 'bit' for 'binary digit' - and transmitting them through a medium. Why it is now taken to have a profound meaning about the nature of reality baffles me a little. — Wayfarer

The profundity of Information Theory is only partly due to its opening the door to the Information Age. But we have, since Shannon's re-definition of Mind Stuff, begun to go far beyond mere artificial computer brains, to glimpse an answer to the "hard question" of natural Consciousness. Shannon's narrow definition of "Information" is blossoming into a whole new worldview. :wink: — Gnomon
I don't see how you can describe that as a matter of reducing specificity for an increase in generality. It's the very opposite of that. — Metaphysician Undercover

Sorry for the confusion. As an amateur philosopher, I'm in over my head. But, if you have any interest in a deeper discussion of what I'm talking about, I can direct you to several books by physicist Paul Davies, and associates, who are exploring the concept of Information far beyond Shannon's novel use of the old word for personal-Knowledge-encoded-in-a-physical-brain to a new application of abstract-Values-encoded-in-the-meaningless-mathematics-of-Probability. :brow: — Gnomon
Thinking that this is an accurate representation of "information", is the problem of representation, or narrative, which Plato warned us about. We have three layers, the real natural thing, the artificial thing which goes by the same name, but is just a shallow reflection of the thing — Metaphysician Undercover

Apparently, I haven't clearly conveyed that my intention is to understand "the real natural thing" instead of "the artificial thing which goes by the same name". Don't worry about the "specificity" and "generality" of information. That's a tricky technical distinction for information specialists to gnaw on. For the rest of us, the important distinction is between statistical Probability and meaningful Aboutness. :cool: — Gnomon
And what I say three times is true. — The Bellman (The Hunting of the Snark)
I'm working under the assumption that only one alternative will be correct, and Shannon's logic works perfectly well in that case. As for the possibility of 1&2&3&4, the question or uncertainty would need to be reframed like so: 1&2&3&4 OR 5 OR 7&8 OR... As you can see, the uncertainty of the answer to any question can be re-expressed as a disjunction. — TheMadFool
Yes but what's wrong with that? Shannon's theory is about messages and their information content - how the message containing the information brings our uncertainty regarding the answer to a question [a request for information] to zero; another way of saying that is eliminating alternative answers to the question until we're left with only one - the correct answer. — TheMadFool
As for your claim that "...such a request for information has an extremely limited applicability..." think of how the very foundation of the modern world - science - operates. A specific set of observations is made and not one but multiple hypotheses are formulated to explain it. One of the stages in the scientific method is the question, "which hypothesis is true/best?" Hypothesis 1 OR Hypothesis 2 OR...Hypothesis N? Shannon's uncertainty shows up again in a vital area of humanity's experience with information. Here too, we must eliminate alternatives until we arrive at the best hypothesis. — TheMadFool
Apparently, I haven't clearly conveyed that my intention is to understand "the real natural thing" instead of "the artificial thing which goes by the same name". — Gnomon
Shannon's context is information transfer: I post - you read. We are used to a faithfully exact transfer; that what you read is exactly what I wrote, complete with typos. We are used in this context, to the elimination of noise. And this is done by the application of Shannon's theory. — unenlightened
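The noise elimination unenlightened describes can be made concrete with the crudest error-correcting code there is - repetition with majority vote (the Bellman's method, one might say). Shannon's theorems cover far more efficient codes; this sketch, with invented names throughout, only shows the principle:

```python
import random

def noisy_channel(bits, flip_prob=0.05):
    """Flip each transmitted bit with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def encode(bits):
    """Say every bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority vote over each group of three."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
print(decode(noisy_channel(encode(message))) == message)  # True, almost always
```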
Raw Energy is first transformed into active Life, and then into sensing Mind, and ultimately into knowing Consciousness. — Gnomon
In an epilogue called “The Return of Meaning,” Gleick argues that to understand how information gives rise to belief and knowledge, we have to renounce Shannon’s “ruthless sacrifice of meaning,” which required jettisoning “the very quality that gives information its value.” But Shannon wasn’t sacrificing meaning so much as ignoring it, in the same way that a traffic engineer doesn’t care what, if anything, the trucks on the highway are carrying. Once you start to think of information as something meaningful, you have to untether it from its mathematical definition, which leaves you with nothing to go on but the word itself. And in its ordinary usage, “information” is a hard word to get a handle on (even after a recent revision, the Oxford English Dictionary still makes a hash of its history). It’s one of those words, like “objectivity” and “literacy,” that enable us to slip from one meaning to the next without letting on, even to ourselves, that we’ve changed the subject.
I think that what happens is that at each distinct level there is an inversion of importance, from the particular to the general, and then back again when you cross the next level. — Metaphysician Undercover

Yeah! That's the ticket: "Inversion" -- a mental flip of the coin. When I said that Shannon's Information substituted "generality" for "specificity", I was referring to the meaning of communication. Shannon's technique was to exchange the specific intended meaning of Words for enigmatic numerical Bytes. Digital information is conveyed in the abstract language of binary numbers that have the potential to encode any meaning. It's a sort of universal language. But Mathematics is divorced from concrete Reality, in that it is universal instead of specific. That's why String Theory makes sense to mathematicians, and not to laymen, but cannot be empirically tested in the real world. — Gnomon
Raw Energy is first transformed into active Life, and then into sensing Mind, and ultimately into knowing Consciousness. — Gnomon

The world-creating Potential of the Big Bang Singularity was transformed (enformed) into Life, the Universe, and Everything by the power of EnFormAction. This is a novel notion, perhaps even radical. But it is being studied by serious scientists -- some of whom even entertain the taboo concept of Deity, or Panpsychism. I have simply translated that unconventional interpretation of Generic Information into a new myth of creation, that I call Enformationism. This is based on Einstein's theory of E = mc^2, and the current understanding of physicists that Information transforms into Energy, which transforms into Matter, and vice versa. See the Hypothesis below for the "how". :nerd: — Gnomon
Transformed by what, and how? — Wayfarer
In Spinoza's philosophy, which I'll take to be paradigmatic for philosophy generally in this case, the only real substance ('substance' being nearer in meaning to 'subject' or to 'being' than the current conception of 'substance') is self-caused, it exists in itself and through itself. In other words, it is not derived from anything, whereas everything else is derived from that. (This is Spinoza's doctrine of God as nature.) — Wayfarer

Yes. My Enformationism thesis can be viewed as an update of Spinoza's worldview, in light of Quantum Physics, bottom-up Evolution, and Information Theory. :smile: — Gnomon