• Wayfarer
    22.3k
    As Claude Shannon's information theory is discussed so much on the Forum, this article might be of interest. It also includes a link to a current documentary on his work.

    https://www.quantamagazine.org/how-claude-shannons-information-theory-invented-the-future-20201222/

    And, Merry Christmas to all the Forum contributors, mods and staff. :sparkle:
  • Kenosha Kid
    3.2k
    Merry Christmas!
  • Metaphysician Undercover
    13.1k
    Nice article, Wayfarer. It describes how what is foundational, or basic, to communication is uncertainty. This is contrary to what many on this forum argue, that certainty is the basis for, and necessary for, communication. It demonstrates clearly what Wittgenstein describes in his "Philosophical Investigations": communication is fundamentally based in uncertainty, and as a practice it is an attempt to reduce such uncertainty to a level which limits the possibility of mistake, contrary to the common notion that communication employs elements of crystalline meaning, ideals of certainty, with a structure built on them.

    Merry Christmas!
  • TheMadFool
    13.8k
    From my notes on Claude Shannon's Information Theory as contained in Richard Dawkins' book A Devil's Chaplain

    Message = Information + Redundancy + Noise

    Information = What is to be conveyed (worth paying for)

    Redundancy = Unnecessary duplication (can be deleted)

    Noise = Random rubbish

    Rover is a poodle dog: Dog redundant

    It rained in Oxford every day this week: not surprising, very little information

    It rained in the Sahara desert: surprising, high information content

    Information: Shock/surprise value

    Ignorance BEFORE receiving message - ignorance AFTER receiving message = Information content = The amount of ignorance reduction.

    If the prior uncertainty can be expressed as a number of equiprobable alternatives N, the information content of a message which narrows those alternatives down to one is log2(N). This might need a little explaining. What's the smallest possible uncertainty? If you have 5 equiprobable alternatives, the uncertainty is 5; if you have 4 equiprobable alternatives, the uncertainty is 4, and so on until you get to 2 alternatives. It can't be 1 alternative, because then there's no uncertainty at all. So the smallest value of uncertainty is when you have 2 alternatives. If you now learn which of the alternatives is the case, the uncertainty reduces to 0: the alternatives have been reduced to only 1 and the uncertainty has dropped to 0. If information is encoded in binary (0 or 1), then knowing whether it's 0 or 1 should result in alternatives = 1 (the 1 or 0 you found out was the case) and uncertainty = 0. In other words, in binary code a 0 or 1 is 1 unit of information. How do we get 1 (unit of information) from 2 (alternatives)? Well, log2(2) = 1.

    Imagine the following scenario: A nurse and a soon-to-be father agree that if the nurse holds up a pink card, the baby is a girl, and if the nurse holds up a blue card, the baby is a boy. The nervous father is outside the delivery room and, after some tense moments, he sees the nurse with a pink card in her hand. The initial uncertainty is 2: the father doesn't know if the baby is a girl or a boy. With the pink card, the uncertainty is halved, as now the father knows it's a girl (1 of the 2 possibilities has actualized). The information content of the colored card = 1 bit. Compare this colored-card system with the nurse saying, "Congratulations sir, you have a beautiful baby girl" [8 words]. Remember, the information content is the same (identification of the gender of the baby).

    In digital code there are 0's and 1's. There are two possibilities (a 0 or a 1), i.e. the uncertainty is two. As soon as we know it's a 1 or a 0, the uncertainty is halved (one of the two possibilities becomes real/true). So, the information content in a digital 0 or 1 is 1 bit [log2(2) = 1].
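    If it helps to see the arithmetic, here's a tiny Python sketch (my own illustration, not something from Dawkins or Shannon): the information carried by a message that singles out one of N equiprobable alternatives is log2(N).

        import math

        def bits(n_alternatives):
            """Bits gained by narrowing n equiprobable alternatives down to one."""
            return math.log2(n_alternatives)

        print(bits(2))  # 1.0 -> the pink/blue card, or a single binary digit
        print(bits(4))  # 2.0
        print(bits(8))  # 3.0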

    A DNA example:

    DNA has a 4-letter nucleotide code [Adenine (A), Thymine (T), Cytosine (C) and Guanine (G)]. The initial uncertainty here is 4, as there are 4 possibilities. The information content of each nucleotide is then log2(4) = 2 bits. This can be understood in terms of what 1 bit of information means. 1 bit of information is that amount of information necessary to halve the uncertainty from 2 to 1. We have 4 possibilities (A/C/T/G). To reduce the number of possibilities to 2, we can ask the question, "does the letter precede D?". If yes, the possibilities are A/C, and if the answer is no, the possibilities become T/G. We have halved the uncertainty (4 possibilities have reduced to 2 possibilities) and that accounts for 1 bit of information. The next question to ask depends on which pair, A/C or G/T, followed from the first question. If it's A/C, the follow-up question should be, "is it A?" If yes then it's A, and if not then it's C. If the possibility that actualized is G/T, the next question should be, "Is it G?" If yes then it's G, and if not then it's T. Either way, the uncertainty is halved (from two possibilities it became one). This is the other 1 bit. Together, we have 2 bits of information in every nucleotide.
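    To make the two-questions idea concrete, here's a rough sketch (the question wording and helper are mine, just to show that each nucleotide costs exactly two yes/no answers):

        def identify_nucleotide(letter):
            """Pin down one of A/C/G/T with two yes/no questions, i.e. 2 bits."""
            questions = 1  # Q1: does the letter precede D in the alphabet?
            if letter in ("A", "C"):                     # yes -> A or C remain
                questions += 1                           # Q2: is it A?
                result = "A" if letter == "A" else "C"
            else:                                        # no -> G or T remain
                questions += 1                           # Q2: is it G?
                result = "G" if letter == "G" else "T"
            return result, questions

        for n in "ACGT":
            print(identify_nucleotide(n))  # every letter takes exactly 2 questions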
  • Metaphysician Undercover
    13.1k
    It rained in Oxford every day this week: not surprising, very little information

    It rained in the Sahara desert: surprising, high information content

    Information: Shock/surprise value
    TheMadFool

    I don't think this is a valid conclusion. "It rained in Oxford" has the same degree of information as does "it rained in the Sahara". The "shock/surprise value" refers only to how the piece of information relates to other information. If you allow that information is both what is intrinsic to the message and how that message relates to other messages externally, then you have ambiguity as to what "information" actually refers to.

    This is very evident in your example of "Rover is a poodle dog". "Dog" is only redundant when "poodle" is related to something else, such as a definition. But if each word of the message needs to be related to something else, like a definition, then there is really no meaning in the message at all, because the meaning is outside of the message, in the act which relates the words to the definitions. The ambiguity is avoided by realizing that there is no information in the words; the information is in the act which relates the words to something else. What the words could mean is anything, therefore random, and there is actually no information within the message. Information would actually be in the relationships established between the message and something else.

    If this is the case, then to talk about there being "information" within the message is false talk. But, if there is no information within the message itself, we deprive ourselves of the capacity for explaining how any information gets from the transmitter to the receiver. There is actually no transmission of information whatsoever, nothing within the message, because all the information is really within the coding and decoding methods. If this is the case, then no information is ever transmitted. Therefore this way of defining "information" is really inadequate. It only appears to be adequate by means of the ambiguity which creates the impression that there is both information within the message and in how the message relates to other things. In reality there is no information within the message in this description.
  • TheMadFool
    13.8k
    It rained in Oxford has the same degree of information as does it rained in the Sahara.Metaphysician Undercover

    Then why is it surprising that it rained in the Sahara and not that it rained in Oxford? I admit that I'm not sure what the logic is behind treating the shocking/surprising as having more information, but if I were to hazard a guess it's got to do with what people refer to as the baseline - the greater the deviation from it, the more bits (units of information) are necessary to code it into a message, and the shocking/surprising certainly are major excursions from the..er...baseline, right? Food for thought: why is news "news"? New, shocking, surprising, out of the ordinary,...
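    For what it's worth, Shannon's own formula captures that "baseline" intuition: the surprisal of an event is -log2(p), so the rarer the event, the more bits it carries. A quick Python sketch (the probabilities below are invented purely for illustration):

        import math

        def surprisal(p):
            """Shannon surprisal in bits: the rarer the event, the more bits it carries."""
            return -math.log2(p)

        # made-up probabilities, just to show the shape of the formula
        print(surprisal(0.9))    # rain in Oxford this week: ~0.15 bits, barely news
        print(surprisal(0.001))  # rain in the Sahara:       ~10 bits, big news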

    "Dog" is only redundant when "poodle" is related to something else such as a definitionMetaphysician Undercover

    That's correct but that goes without saying, right?

    But if each word of the message needs to be related to something else, like a definitionMetaphysician Undercover

    Claude Shannon's information theory assumes that we've already passed those waypoints in our quest to understand, quantify, and efficiently transmit information. Shannon's information theory is, whatever else it might be, not a philosophical inquiry into information, and so we'd best refrain from raising philosophical objections to it - that would be like trying to diagnose a uterine malady in a man.
  • Metaphysician Undercover
    13.1k
    Then why is it surprising that it rained in the Sahara and not that it rained in Oxford?TheMadFool

    As I pointed out, the surprisingness is only related to external information concerning the frequency of rain in these places; it has nothing to do with any supposed information within the message.

    I admit that I'm not sure what the logic behind why the shocking/surprising is treated as having more information but if I were to hazard a guess it's got to do with what people refer to as the baseline - the greater the deviation from it, the more bits (units of information) necessary to code it into a message and the shocking/surprising certainly are major excursions from the..er...baseline, right? Food for thought: why is news "news"? New, shocking, surprising, out of the ordinary,...TheMadFool

    This is evidence of what I said, the "information" as the word is used here, is not within the message, it is in how the message is related to the "baseline".

    Claude Shannon's information theory assumes that we've already passed those waypoints in our quest to understand, quantify, and efficiently transmit information. Shannon's information theory is, whatever else it might be, not a philosophical inquiry into information, and so we'd best refrain from raising philosophical objections to it - that would be like trying to diagnose a uterine malady in a man.TheMadFool

    If the accepted "information theory" represents information in a way other than the way that we normally use the word "information", and cannot account for the existence of information, according to how we normally use the word, as that which is transmitted in a message, then surely we are justified in "raising philosophical objections to it".

    What I am saying therefore, is that Shannon's "information theory" does not deal with "information" at all, as we commonly use the word. If we do not recognize this, and the ambiguity which arises, between the common use, and the use within the theory, we might inadvertently equivocate and think that the theory deals with "information" as what is referred to when we commonly use the word to refer to what is inherent within a message.
  • TheMadFool
    13.8k
    As I pointed out, the surprisingness is only related to external information concerning the frequency of rain in these places, it has nothing to do with any supposed information within the message.Metaphysician Undercover

    What else could surprising/shocking mean? Also, what do you mean by "it has nothing to do with any supposed information within the message"? How would you come by information without a message and a medium for that message? If for instance, I read about rain in the Sahara, the message is the article on what is indeed a very rare event and the medium is the paper I'm reading. :chin:

    This is evidence of what I said, the "information" as the word is used here, is not within the message, it is in how the message is related to the "baseline"Metaphysician Undercover

    Glad that you figured that out.

    If the accepted "information theory" represents information in a way other than the way that we normally use the word "information", and cannot account for the existence of information, according to how we normally use the word, as that which is transmitted in a message, then surely we are justified in "raising philosophical objections to it".

    What I am saying therefore, is that Shannon's "information theory" does not deal with "information" at all, as we commonly use the word. If we do not recognize this, and the ambiguity which arises, between the common use, and the use within the theory, we might inadvertently equivocate and think that the theory deals with "information" as what is referred to when we commonly use the word to refer to what is inherent within a message.
    Metaphysician Undercover

    Thanks for your patience. I do agree that Claude Shannon's theory is not the only game in town insofar as information is concerned. I remember reading about another less-popular theory that's also out there. However, in the universe of computers, the world of 1's and 0's, in which it's almost a given that one binary state (1 or 0) should correspond to 1 unit of information, Claude Shannon's conceptualization of information as a process of reducing uncertainty about which alternative is true among [idealized] equiprobable alternatives has a natural and intuitive feel to it. The least amount of uncertainty happens when we have two alternatives (0 or 1), and knowing that 1 or 0 is the case reduces the uncertainty to zero (only 1 of the two alternatives remains); this suggests that, for computers at least, a 1 or a 0 should count as 1 unit of information. If the uncertainty is 4 alternatives, you would need 2 units of information to bring the uncertainty down to zero, and if the uncertainty is 8 alternatives, you'd need 3 units of information to make the uncertainty = 0, and that means the information content of a message that whittles down N equiprobable alternatives to 1 = log2(N). This is a perfect fit for what I said a few lines above - that a 1 or a 0 should count as 1 unit of information, as log2(2) = 1.

    By the way, this just popped into my mind. Information is, in some sense, the opposite of ignorance, and ignorance can be thought of as uncertainty among given alternatives. E.g. if I don't know, i.e. I'm ignorant of (I have no information on), who invented information theory, then this state of not knowing can be expressed as consisting of the following equiprobable alternatives, just as Claude Shannon theorized: Vint Cerf OR Larry Page OR Mark Zuckerberg OR Claude Shannon...
  • Metaphysician Undercover
    13.1k
    What else could surprising/shocking mean? Also, what do you mean by "it has nothing to do with any supposed information within the message"? How would you come by information without a message and a medium for that message? If for instance, I read about rain in the Sahara, the message is the article on what is indeed a very rare event and the medium is the paper I'm reading. :chin:TheMadFool

    The information is "it rained in the Sahara", just like in the other instance, the information is "it rained in Oxford". How is whether or not this is surprising or shocking, at all relevant to the content of the information?

    Glad that you figured that out.TheMadFool

    Figured what out, that Shannon is using "information" in a way which is completely inconsistent with common usage? I said that right from the beginning. The question is have you figured that out yet?

    Thanks for your patience. I do agree that Claude Shannon's theory is not the only game in town insofar as information is concerned. I remember reading about another less-popular theory that's also out there. However, in the universe of computers, the world of 1's and 0's, in which it's almost a given that one binary state (1 or 0) should correspond to 1 unit of information, Claude Shannon's conceptualization of information as a process of reducing uncertainty about which alternative is true among [idealized] equiprobable alternatives has a natural and intuitive feel to it. The least amount of uncertainty happens when we have two alternatives (0 or 1), and knowing that 1 or 0 is the case reduces the uncertainty to zero (only 1 of the two alternatives remains); this suggests that, for computers at least, a 1 or a 0 should count as 1 unit of information. If the uncertainty is 4 alternatives, you would need 2 units of information to bring the uncertainty down to zero, and if the uncertainty is 8 alternatives, you'd need 3 units of information to make the uncertainty = 0, and that means the information content of a message that whittles down N equiprobable alternatives to 1 = log2(N). This is a perfect fit for what I said a few lines above - that a 1 or a 0 should count as 1 unit of information, as log2(2) = 1.TheMadFool

    The issue now is the relationship between uncertainty and information. In the normal, common expression of "information", some degree of uncertainty is inherent within the information itself, as ambiguity. In the way that you describe Shannon's expression of "information", information is the process which excludes uncertainty. Do you see the difference? Now the problem with Shannon's representation is that it cannot cope with the real uncertainty which is associated with ambiguity.

    By the way, this just popped into my mind. Information is, in some sense, the opposite of ignorance and ignorance can be thought of as uncertainty among given alternatives e.g. If I don't know i.e. I'm ignorant of (I have no information on) who invented the Information theory then this state of not knowing can be expressed as consisting of the following equiprobable alternatives, just as Claude Shannon theorized, Vint Cerf OR Larry Page OR Mark Zuckerberg OR Claude Shannon...TheMadFool

    Again, this is not consistent with the common usage of "information". In common usage information is what informs a person, to deliver one from ignorance, and so being informed is the opposite of ignorance; but information, as that which informs, is not itself the opposite of ignorance. So saying that information is the opposite of ignorance is a category mistake relative to the common usage of "information".
  • TheMadFool
    13.8k
    Figured what out, that Shannon is using "information" in a way which is completely inconsistent with common usage? I said that right from the beginning. The question is have you figured that out yet?Metaphysician Undercover

    Sorry, but you seem to be contradicting yourself. Please go over your posts again.

    The issue now is the relationship between uncertainty and information. In the normal, common expression of "information", some degree of uncertainty is inherent within the information itself, as ambiguity. In the way that you describe Shannon's expression of "information", information is the process which excludes uncertainty. Do you see the difference? Now the problem with Shannon's representation is that it cannot cope with the real uncertainty which is associated with ambiguity.Metaphysician Undercover

    Are you implying we can cope with uncertainty? Uncertainty, ambiguity being one of its causes, comes with the territory and it can't be, to my reckoning, dealt with in a satisfactory manner by any theory of information, whether based on certainty or uncertainty, let alone Claude Shannon's. So, your criticism is more appropriate for language than Shannon's theory.

    Again, this is not consistent with the common usage of "information". In common usage information is what informs a person, to deliver one from ignorance, and so being informed is the opposite of ignorance, but information, as that which informs, is not itself the opposite of ignorance. So, that information is the opposite to ignorance, is a category mistake relative to the common usage of "information".Metaphysician Undercover

    What is this "common usage" of "information" that you speak of?

    Google gives the following definition of information: facts provided or learned about something or someone

    So, what is a fact?

    Here's a fact that we all know, The Eiffel tower is in Paris

    Now, if one is ignorant of this fact i.e. one doesn't know that the Eiffel tower is in Paris, this state of ignorance can be represented with the following possibilities (alternatives) [uncertainty]

    The Eiffel tower is in NY, OR The Eiffel Tower is in Beijing OR The Eiffel tower is in Sydney OR...

    Once you come by the information that the Eiffel tower is in Paris, the uncertainty becomes 0. Isn't that just fantastic? That's what I meant when I said Shannon's theory "...feels natural and intuitive..." Mind you, Shannon's theory is probably just one of many other ways to approach the subject of information but it, for certain, captures the uncertainty aspect.

    Thank you for your discussion.
  • Gnomon
    3.7k
    If the accepted "information theory" represents information in a way other than the way that we normally use the word "information", and cannot account for the existence of information, according to how we normally use the word, as that which is transmitted in a message, then surely we are justified in "raising philosophical objections to it".Metaphysician Undercover
    Charged with maximizing the flow of communication, Shannon was interested in measuring the carrying capacity of the system, not the meaningful content of each message. That's like a shipping company, which is more interested in the potential (carrying capacity) of its empty vessels, while the shippers are interested in the cash-value (meaning) of the actual cargo.

    Toward that end, Shannon focused on the Syntax of Information (structure ; volume) instead of its Semantics (meaning ; content). Ironically, he measured Information capacity in terms of emptiness & negation (Entropy), instead of its fullness & positive aspects (Energy). Even more ironically, scientists have referred to those purposeful features as "negentropy" (negative negation). Likewise, scientists focus on the "uncertainty" of information, rather than its "novelty". But it's the unexpected that is most meaningful to humans. So, I agree that philosophers have good reasons to "raise objections".

    "Information", as Shannon defined it, is akin to Fuzzy Logic, which is ambiguous & uncertain, but -- like the Enigma code -- capable of carrying almost infinite values : between 0 and 100%. By reducing Specificity, it maximizes Potential. Hence, each bit/byte, instead of carrying meaning, is an empty container capable of carrying multiple meanings. That kind of communication is good for computers -- where the translation code-key is built in -- but not for people, who can't handle uncertainty & ambiguity.

    That's why neuroscientist & anthropologist Terrence Deacon said, "this is evidence that we are both woefully ignorant of a fundamental causal principle in the universe and in desperate need of such a theory". The Enformationism thesis is my contribution toward that end. :smile:


    Negentropy : Negentropy is reverse entropy. It means things becoming more in order. By 'order' is meant organisation, structure and function: the opposite of randomness or chaos.
    Note -- I give it a more positive name : "Enformy" -- meaning the power to enform, to create novelty.

    Fuzzy Logic :
    Fuzzy logic is a form of many-valued logic in which the truth values of variables may be any real number between 0 and 1. It is employed to handle the concept of partial truth, where the truth value may range between completely true and completely false.
    https://en.wikipedia.org/wiki/Fuzzy_logic

    Enformy :
    In the Enformationism theory, Enformy is a hypothetical, holistic, metaphysical, natural trend, opposite to that of Entropy & Randomness, to produce Complexity & Progress. It is the mysterious tendency for aimless energy to occasionally create the stable, but temporary, patterns we call Matter, Life, and Mind.
    http://bothandblog2.enformationism.info/page18.html
  • Gnomon
    3.7k
    What is this "common usage" of "information" that you speak of?TheMadFool
    See my reply to above. :smile:

    Information : For humans, Information has the semantic quality of aboutness, that we interpret as meaning. In computer science though, Information is treated as meaningless, which makes its mathematical value more certain. It becomes meaningful only when a sentient Self interprets it as such.
    http://blog-glossary.enformationism.info/page11.html
  • Metaphysician Undercover
    13.1k
    Sorry, but you seem to be contradicting yourself. Please go over your posts again.TheMadFool

    I think you must have misunderstood. If you perceive a contradiction, then point it out to me so I can see what you're talking about, and maybe clarify what I meant.

    Are you implying we can cope with uncertainty?TheMadFool

    Yes, of course I am saying that; that's what I said in my first post in the thread: uncertainty is a fundamental aspect of language use, and we clearly cope with it.

    Uncertainty, ambiguity being one of its causes, comes with the territory and it can't be, to my reckoning, dealt with in a satisfactory manner by any theory of information, whether based on certainty or uncertainty, let alone Claude Shannon's. So, your criticism is more appropriate for language than Shannon's theory.TheMadFool

    Yes, but that's the point, if a theory of information cannot, in a satisfactory manner, deal with the uncertainty which is inherent within language use, then how can it claim to be about "information" as the word is commonly used? The word as it is commonly used includes information exchanged in language use.

    What is this "common usage" of "information" that you speak of?

    Google gives the following definition of information: facts provided or learned about something or someone
    TheMadFool

    Right, so doesn't "information" include what is transferred in language use, as something provided or learned about something?

    Once you come by the information that the Eiffel tower is in Paris, the uncertainty becomes 0.TheMadFool

    This is not true though, because if someone convinces me that "the Eiffel tower is in Paris" is a true piece of information, I still might not have any clue as to what the Eiffel tower is, or what Paris is. So your claim, that uncertainty is at 0 is just an illusion, because I could just claim to be certain that there is something called the Eiffel tower, in some place called Paris, but since I haven't a clue what this thing is, or where this place is, it cannot be said to be any true form of certainty. So Shannon's theory doesn't really deal with the true nature of information at all.



    Shannon's theory is probably just one of many other ways to approach the subject of information but it, for certain, captures the uncertainty aspect.TheMadFool

    No, it really doesn't capture the uncertainty aspect of information, as I've explained. It does recognize that uncertainty is an essential aspect of information, as I described in my first post, but it does not provide any insight into how uncertainty is dealt with by the human mind in natural language use. So it's not really a very good way to approach the subject of information.

    Charged with maximizing the flow of communication, Shannon was interested in measuring the carrying capacity of the system, not the meaningful content of each message. That's like a shipping company, which is more interested in the potential (carrying capacity) of its empty vessels, while the shippers are interested in the cash-value (meaning) of the actual cargo.Gnomon

    Clearly, the content is what is important, and referred to as the information. So if someone figures out a way to put the same information into one package which requires a hundred packages in Shannon's system, then his measurement system is not very good.


    Toward that end, Shannon focused on the Syntax of Information (structure ; volume) instead of its Semantics (meaning ; content).Gnomon

    So, this is what I pointed out to Madfool, what we commonly refer to as "information" is the content, the meaning, not the structure. So Shannon's "theory of information" really doesn't deal with information, as what is referred to when we normally use the word.

    By reducing Specificity, it maximizes Potential. Hence, each bit/byte, instead of carrying meaning, is an empty container capable of carrying multiple meanings. That kind of communication is good for computers -- where the translation code-key is built in -- but not for people, who can't handle uncertainty & ambiguity.Gnomon

    I would say that you might have this backward. The computer can't handle uncertainty; that's why there must be a built-in code-key to eliminate any uncertainty. People, having free will, have no such built-in code-key, and that capacity to choose regardless of uncertainty allows them to live with and cope with ambiguity.
  • Gnomon
    3.7k
    I would say that you might have this backward. The computer can't handle uncertainty, that's why there must be a built-in code-key to eliminate any uncertainty. People, having free will choice have no such built-in code-key, and that capacity to choose regardless of uncertainty, allows them to live with and cope with ambiguity.Metaphysician Undercover
    I agree with your version, but what I said was that "by reducing specificity" -- which increases generality -- Shannon's definition of Information "maximizes the Potential" carrying capacity (bandwidth) of a transmission. That was the point of his research. By using only an austere two digit code, instead of noisy redundant human languages, he was able to compress more information into the same pipes. Just as with Morse code though, the specific meaning is restored by translating the abstract code back into a concrete language. Only then, does it become Actual Information -- meaning in a mind; actionable knowledge.

    In the shipping analogy, Shannon didn't make the ships bigger, he made the cargo smaller -- by reducing redundancy, as noted by TMF -- thus increasing the carrying capacity at no extra cost to the shippers. But, in this thread, that's a minor point. What really matters is that by using an abstract code -- stripped of meaning -- he overcame a major technical hurdle : bandwidth. But in order for the code to be meaningful to humans, it must be decompressed and converted back into noisy redundant "natural" language. Unfortunately, his new terminology, equating "Information" with destructive Entropy, diverted attention away from the original constructive essence of Information : aboutness -- the relation between subject & object. :smile:

    Natural Language : In neuropsychology, linguistics, and the philosophy of language, a natural language or ordinary language is any language that has evolved naturally in humans through use and repetition without conscious planning or premeditation. Natural languages can take different forms, such as speech or signing.

    Aboutness : https://press.princeton.edu/books/hardcover/9780691144955/aboutness
  • Wayfarer
    22.3k
    You might be interested in this academic. He sounds a bit fringe to me, but I have to admit, his electromagnetic theory of consciousness seems plausible (although I must confess to scepticism about anything authored by someone who calls themselves 'Johnjoe'. :worry: )

    About Shannon's theory - I can't help but feel too much is being read into it. Shannon was a communications engineer, first and foremost, and the problem he set out to solve had some very specific boundary conditions. It was a theory about converting words and images into binary digits - as the article notes, Shannon might have coined the term 'bit' for 'binary digit' - and transmitting them through a medium. Why it is now taken to have a profound meaning about the nature of reality baffles me a little.
  • TheMadFool
    13.8k
    I think you must have misunderstood. If you perceive a contradiction, then point it out to me so I can see what you're talking about, and maybe clarify what I meant.Metaphysician Undercover

    You can ignore my post.

    Yes, of course I am saying that, that's what I said in my first post in the thread, uncertainty is a fundamental aspect of language use, and we clearly cope with it.Metaphysician Undercover

    Thanks for bringing up the issue of ambiguity because it lies at the heart of Shannon's theory on information. Thanks. Have a G'day.

    I'd like to pick your brains about something though.

    If I don't have the information on who invented the internet, does it seem ok to represent my lack of information as: Mark Zuckerberg OR Jeff Bezos OR Vint Cerf OR Bill Gates?
  • Wayfarer
    22.3k
    It grew out of ARPANET, which was a US Defence Dept initiative launched in the 1960s. The problem that needed to be solved was in response to the Soviet nuclear threat - what would happen if they nuked the Pentagon’s switchboard, say? There had to be a way of setting up a comms system that didn’t rely on a central switching mechanism, because it could always be blown up. Hence the idea of breaking communications up into packets of data - each of which was like an individual letter that could navigate from sender to receiver without relying on a central comms system. That was TCP/IP - Transmission Control Protocol/Internet Protocol, which this whole shebang runs on. And that honour goes to Vint Cerf and Bob Kahn, who were awarded a medal for it.

    [image: Vint Cerf receiving the medal]

    (No, W. is not garotting him.)

    Gates of course played his part, but the other big name is Tim Berners-Lee (now Sir), who built on hypertext to invent the WWW around 1990 or 1991 (can't remember the exact date).

    Oh - but TCP/IP is an absolutely amazing invention, one of the all-time great inventions, in my humble opinion. It has a lot of security vulnerability issues, although as Cerf often says, it was never designed for the amazing lengths that it’s been extended to.

    Have a read of Everything is Connected. It was an output from my first tech writing contract, which was for a company that helped introduce broadband modems into the Australian market. It has a good little primer on the invention of TCP/IP, page 13.
  • TheMadFool
    13.8k
    :up:

    I remember starting a thread in the old forum which has now sadly become defunct about how questions could be reformulated as statements using the logical OR operator.

    So, the question, Q (question) = "Who started the thread titled QUANTA article on Claude Shannon?" could be rewritten with no loss of meaning as A (alternatives) = TheMadFool OR Metaphysician Undercover OR Gnomon OR Wayfarer

    Remember that a question is defined as a request for information, and as you can see in the above example a question can be reduced to a list of alternatives - there are 4 for the question Q above, and Q can be expressed, without any loss of meaning, as A. At this point uncertainty enters the picture - we're not sure, or we're uncertain, as to which of the 4 possibilities is the correct one. If I now send you the message "Wayfarer", the uncertainty disappears and I have the information the question Q is asking for, which means I've whittled down the possibilities (alternatives) from 4 (TheMadFool, Metaphysician Undercover, Gnomon or Wayfarer) in the beginning to 1 (Wayfarer) at the end, and according to Claude Shannon the message "Wayfarer" consists of log2(4) = 2 bits of information.

    Note that since computer language consists of 1's and 0's, the message "Wayfarer" must be decomposed into binary, e.g. yes/no answers. The first information we need is "Does the name of the person who started this thread start with a letter that comes before the letter "P"?" The answer would be "no". The possibilities now narrow down to Wayfarer and TheMadFool. That accounts for 1 bit of information. The next question is, "Does the name of the person who started this thread begin with the letter "W"?" The answer is "yes" and we've got the information. That's the other 1 bit of information. We've homed in on the information viz. Wayfarer using 2 bits.
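    Here's a rough Python sketch of that guessing procedure (the helper and the alphabetical split are mine, just to show that halving a sorted list of N alternatives takes log2(N) yes/no answers when N is a power of two):

        import math

        def ask_until_found(alternatives, truth):
            """Halve a sorted list of alternatives with yes/no questions until one remains."""
            candidates = sorted(alternatives)
            questions = 0
            while len(candidates) > 1:
                mid = len(candidates) // 2
                questions += 1
                # "Does the answer come alphabetically before candidates[mid]?"
                if truth < candidates[mid]:
                    candidates = candidates[:mid]
                else:
                    candidates = candidates[mid:]
            return candidates[0], questions

        names = ["TheMadFool", "Metaphysician Undercover", "Gnomon", "Wayfarer"]
        print(ask_until_found(names, "Wayfarer"))  # ('Wayfarer', 2)
        print(math.log2(len(names)))               # 2.0 bits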
  • Wayfarer
    22.3k
    Well explained.

    (There’s that archetypically American series of sounds - ‘dah dah d dah dah - dah dah!’ I once heard the derivation was from GIs pulling up outside hairdressers in the Philippines, after they had beaten the Japanese, and blowing the sequence on their Jeep horn - ‘shave and a haircut - TWO BITS’. Although that, of course, is analog.)
  • Metaphysician Undercover
    13.1k
    I agree with your version, but what I said was that "by reducing specificity" -- which increases generality -- Shannon's definition of Information "maximizes the Potential" carrying capacity (bandwidth) of a transmission. That was the point of his research. By using only an austere two digit code, instead of noisy redundant human languages, he was able to compress more information into the same pipes. Just as with Morse code though, the specific meaning is restored by translating the abstract code back into a concrete language. Only then, does it become Actual Information -- meaning in a mind; actionable knowledge.Gnomon

    I don't see how you can describe that as a matter of reducing specificity for an increase in generality. It's the very opposite of that. Reducing the number of possible characters to two, clearly reduces possibility, through an act of increased specificity. This is a reduction in generality. What it does is allow for a simpler coding and decoding process, with higher certainty, but at the cost of greatly reducing the possibilities for the information which can be transmitted.

    But in order for the code to be meaningful to humans, it must be decompressed and converted back into noisy redundant "natural" language.Gnomon

    Yes, this is exactly the evidence that what is called "information" in "information theory" is one step removed from what we actually call "information" in common language use. Thinking that this is an accurate representation of "information" is the problem of representation, or narrative, which Plato warned us about. We have three layers: the real natural thing, the artificial thing which goes by the same name but is just a shallow reflection of the thing, and then our understanding of the thing. If we look at the artificial thing, the shallow reflection, as if it is the real thing, we have a misunderstanding.

    The problem is widespread in our society. We look at the wavefunction for example as if it is the real thing, rather than a simple representation. It becomes extremely evident in issues of morality. We judge people according to their actions, but the actions are just a reflection of the person's psyche. To study human actions therefore, does not give us what is required to understand morality, we must study the human psyche directly.

    If I don't have the information on who invented the internet, does it seem ok to represent my lack of information as: Mark Zuckerberg OR Jeff Bezos OR Vint Cerf OR Bill Gates?TheMadFool

    No I don't think that would be correct. If the correct answer might be 1&2&3&4, you cannot represent it properly as 1 or 2 or 3 or 4. You need to have reason to know that it is either/or, which you don't give.

    Remember that a question is defined as a request for information and and as you can see in the above example a question can be reduced to a list of alternatives - there are 4 for the question Q above and Q can be expressed, without any loss of meaning, as A.TheMadFool

    The problem with this approach is that you need to know that the correct answer is within the list of options, which will only occur if you already know the correct answer. So such a request for information has an extremely limited applicability, like a multiple choice exam.
  • tim wood
    9.2k
    I don't see how you can describe that as a matter of reducing specificity for an increase in generality.Metaphysician Undercover
    Small point, likely you know this. Shannon made clear that part of efficient encoding was what the prior understandings of the sender and recipient could contribute. As example - as you-all have observed - there is often redundancy in messages. He argued that the recipient of the message often did not need the redundancy. Taking this to the limit, he wrote that the entropy of the English language was less than two bits per character(!). I believe an implied qualification was that the message be sufficiently long, but maybe not.

    Twenty-six characters ordinarily would require five bits for 2^5=32 options. Somewhat wasteful. But q doesn't really need its u. And so forth, down to less than two bits. And I find this online.

    " Shannon (1950) estimated the entropy of written English to be between 0.6 and 1.3 bits per character (bpc), based on the ability of human subjects to guess successive characters in text. Simulations to determine the empirical relationship between the provable bounds and the known entropies of various models suggest that the actual value is 1.1 bpc or less." http://mattmahoney.net/dc/entropy1.html
  • TheMadFool
    13.8k
    No I don't think that would be correct. If the correct answer might be 1&2&3&4, you cannot represent it properly as 1 or 2 or 3 or 4. You need to have reason to know that it is either/or, which you don't giveMetaphysician Undercover

    I'm working under the assumption that only one alternative will be correct, and Shannon's logic works perfectly well in that case. As for the possibility of 1&2&3&4, the question or uncertainty would need to be reframed like so: 1&2&3&4 OR 5 OR 7&8 OR... As you can see, for any question the uncertainty of the answer can be re-expressed as a disjunction.

    The problem with this approach is that you need to know that the correct answer is within the list of options, which will only occur if you already know the correct answer. So such a request for information has an extremely limited applicability, like a multiple choice exam.Metaphysician Undercover

    Yes but what's wrong with that? Shannon's theory is about messages and their information content - how the message containing the information brings our uncertainty regarding the answer to a question [a request for information] to zero; another way of saying that is eliminating alternative answers to the question until we're left with only one - the correct answer.

    As for your claim that "...such a request for information has an extremely limited applicability..." think of how the very foundation of the modern world - science - operates. A specific set of observations are made and not one but multiple hypotheses are formulated as an explanation. One of the stages in the scientific method is the question, "which hypothesis is true/best?" Hypothesis 1 OR Hypothesis 2 OR...Hypothesis N? Shannon's uncertainty shows up again, in a vital area of humanity's experience with information. Here too, we must eliminate alternatives until we arrive at the best hypothesis.

    In fact, this very discussion that we're having is based on the uncertainty of whether what I'm saying is correct or not (1/0). I feel it makes sense but you think otherwise, precisely because we're unable to get our hands on the information that would settle the matter once and for all.
  • Gnomon
    3.7k
    You might be interested in this academic. He sounds a bit fringe to me, but I have to admit, his electromagnetic theory of consciousness seems plausible (although I must confess to scepticism about anything authored by someone who calls themselves 'Johnjoe'. :worry: )Wayfarer
    Unfortunately, that's his real name. And he is fringey, in the sense of revolutionary. I have read a Kindle copy of his book, Quantum Evolution, because it seemed to have some parallels to my own edgy Enformationism thesis of how evolution works. He concluded that there seemed to be a "force of will" behind biological evolution. And I have concluded that the Generic Form of Information -- that I call EnFormAction -- is poetically analogous to the Will-of-God in religious myths of creation. So, I find his combination of Quantum Theory and Biology to be interesting -- and provocative, if not provable. But of course, it doesn't fit neatly into the dominant scientific worldview of Materialism.

    McFadden's new theory of Electromagnetic Consciousness may also parallel some of my ideas of how Consciousness emerges from a biological brain. He "posits that consciousness is in fact the brain’s energy field". But I would go a step farther, to posit that Consciousness is an emergent quality of universal Information : a MindField, if you will. Physical Energy is merely a causal form of Generic Information. And the human Mind is a metaphysical effect, a Function, of information processing in the brain. By that, I mean Raw Energy is first transformed into active Life, and then into sensing Mind, and ultimately into knowing Consciousness. :smile:

    Quantum Evolution presents a revolutionary new scientific theory by asking: is there a force of will behind evolution?
    https://www.amazon.com/Quantum-Evolution-Multiverse-Johnjoe-McFadden/dp/0006551289/ref=sr_1_1?dchild=1&keywords=quantum+evolution&link_code=qs&qid=1609092555&sourceid=Mozilla-search&sr=8-1&tag=mozilla-20

    Generic Information : Information is Generic in the sense of generating all forms from a formless pool of possibility -- the Platonic Forms.
    http://bothandblog2.enformationism.info/page29.html
    Note -- this use of "Generic" is not based on the common dictionary definition, but on the root meaning : "to generate novelty" or "to produce offspring".

    Shannon might have coined the term 'bit' for 'binary digit' - and transmitting them through a medium. Why it is now taken to have a profound meaning about the nature of reality baffles me a little.Wayfarer
    The profundity of Information Theory is only partly due to its opening the door to the Information Age. But we have, since Shannon's re-definition of Mind Stuff, begun to go far beyond mere artificial computer brains, to glimpse an answer to the "hard question" of natural Consciousness. Shannon's narrow definition of "Information" is blossoming into a whole new worldview. :wink:


    We live in the information age, which according to Wikipedia is a period in human history characterized by the shift from industrial production to one based on information and computerization. . . . So it is not entirely crazy to speculate about what might lie beyond the information age.
    https://www.wired.com/insights/2014/06/beyond-information-age/
  • Gnomon
    3.7k
    I don't see how you can describe that as a matter of reducing specificity for an increase in generality. It's the very opposite of that.Metaphysician Undercover
    Sorry for the confusion. As an amateur philosopher, I'm in over my head. But, if you have any interest in a deeper discussion of what I'm talking about, I can direct you to several books by physicist Paul Davies, and associates, who are exploring the concept of Information far beyond Shannon's novel use of the old word for personal-Knowledge-encoded-in-a-physical-brain to a new application of abstract-Values-encoded-in-the-meaningless-mathematics-of-Probability. :brow:

    Paul Davies : https://www.amazon.com/s?k=paul+davies&link_code=qs&sourceid=Mozilla-search&tag=mozilla-20

    Thinking that this is an accurate representation of "information", is the problem of representation, or narrative, which Plato warned us about. We have three layers, the real natural thing, the artificial thing which goes by the same name, but is just a shallow reflection of the thingMetaphysician Undercover
    Apparently, I haven't clearly conveyed that my intention is to understand "the real natural thing" instead of "the artificial thing which goes by the same name". Don't worry about the "specificity" and "generality" of information. That's a tricky technical distinction for information specialists to gnaw on. For the rest of us, the important distinction is between statistical Probability and meaningful Aboutness. :cool:



    Information :
    * Claude Shannon quantified Information not as useful ideas, but as a mathematical ratio between meaningful order (1) and meaningless disorder (0); between knowledge (1) and ignorance (0). So, that meaningful mind-stuff exists in the limbo-land of statistics, producing effects on reality while having no sensory physical properties. We know it exists ideally, only by detecting its effects in the real world.
    * For humans, Information has the semantic quality of aboutness, that we interpret as meaning. In computer science though, Information is treated as meaningless, which makes its mathematical value more certain. It becomes meaningful only when a sentient Self interprets it as such.
    * When spelled with an “I”, Information is a noun, referring to data & things. When spelled with an “E”, Enformation is a verb, referring to energy and processes.

    http://blog-glossary.enformationism.info/page11.html
  • unenlightened
    9.2k
    What I tell you three times is true. — The Bellman
    (The Hunting of the Snark)

    Context approximately equals language game, approximately equals background knowledge/prior agreement/protocol/ etc etc.

    Shannon's context is information transfer: I post - you read. We are used to a faithfully exact transfer: what you read is exactly what I wrote, complete with typos. We are used, in this context, to the elimination of noise. And this is done by the application of Shannon's theory.

    But if one downloads a large app, one generally 'verifies' it, because there is always some noise and thus the possibility of a 'wrong' bit. Verification uses redundancy to (hopefully) eliminate noise. And the Bellman does the same thing. Saying everything three times is very redundant, but reduces 'noise' if one compares the three versions and takes the average. A checksum does some of the job at much less informational cost, in the sense that if the checksum matches, the probability of error is vanishingly small; but if it fails to match, there is no way to correct, and one must begin again.
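    A toy Python sketch of that last point (the strings and the SHA-256 choice are mine, purely for illustration): a short digest is cheap redundancy that detects corruption but cannot say where it happened.

        import hashlib

        def checksum(data: bytes) -> str:
            """A short fingerprint: cheap redundancy that detects, but cannot correct, errors."""
            return hashlib.sha256(data).hexdigest()

        original = b"what I tell you three times is true"
        received = b"what I tell you three times is trve"  # one corrupted byte

        print(checksum(original) == checksum(original))  # True: digests match, message almost certainly intact
        print(checksum(original) == checksum(received))  # False: mismatch, but no way to tell which bit flipped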

    Good writing somewhat tends to follow the Bellman's method; the introduction sets out what is to be said, the body of the piece says it, and the conclusion says what has been said. Redundancy is somewhat misnamed, because it helps reduce misunderstanding.

    So, now imagine, as an analogy to the rain in the Sahara, an array of 100 * 100 pixels - black or white, 0 or 1.

    There are 2^10,000 possible pictures. That is a large number. But now consider the Saharan version, where we know that almost all the pixels are black, or 0. Obviously, one does not bother to specify all the dry days in the Sahara, one gives the date of the exceptional wet day, and says, "else dry".
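    A toy Python sketch of that "else dry" encoding (the 10 x 10 grid is mine, shrunk down from the 100 x 100 example): store only the exceptional pixels and a default for everything else.

        def compress_sparse(bitmap):
            """Encode a mostly-0 bitmap as 'everything is 0, except at these coordinates'."""
            exceptions = [(r, c) for r, row in enumerate(bitmap) for c, v in enumerate(row) if v]
            return {"default": 0, "ones": exceptions}

        # a 10 x 10 'Sahara': one wet day, everything else dry
        bitmap = [[0] * 10 for _ in range(10)]
        bitmap[3][7] = 1

        print(compress_sparse(bitmap))  # {'default': 0, 'ones': [(3, 7)]} -- far shorter than listing every pixel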

    In the same way, any regularity that might predominate, stripes, chequerboards or whatever, can be specified and then any deviations noted. This is called compression, and is the elimination of redundancy. The elimination of redundancy is equivalent to the elimination of order. Maximum compression results in a message that is maximally disordered and thus looks exactly like noise.

    This then explains the rather counter-intuitive finding that disordered systems contain more information than ordered systems and thus that entropy reduces available energy but increases information. One proceeds from the simple 'sun hot, everything else cold', to a much more complex, but essentially dull 'lukewarm heat death of everything'.
  • Metaphysician Undercover
    13.1k
    I'm working under the assumption that only one alternative will be correct, and Shannon's logic works perfectly well in that case. As for the possibility of 1&2&3&4, the question or uncertainty would need to be reframed like so: 1&2&3&4 OR 5 OR 7&8 OR... As you can see, for any question the uncertainty of the answer can be re-expressed as a disjunction.TheMadFool

    The problem is that you must be sure that the correct answer is amongst the options, or else the question is not necessarily a valid representation.

    Yes but what's wrong with that? Shannon's theory is about messages and their information content - how the message containing the information brings our uncertainty regarding the answer to a question [a request for information] to zero; another way of saying that is eliminating alternative answers to the question until we're left with only one - the correct answer.TheMadFool

    If you already know that the correct answer is amongst the options, then the uncertainty is not a real uncertainty. Therefore this system is based in a false representation of uncertainty.

    As for your claim that "...such a request for information has an extremely limited applicability..." think of how the very foundation of the modern world - science - operates. A specific set of observations are made and not one but multiple hypotheses are formulated as an explanation. One of the stages in the scientific method is the question, "which hypothesis is true/best?" Hypothesis 1 OR Hypothesis 2 OR...Hypothesis N? Shannon's uncertainty shows up again and in vital area of humanity's experience with information. Here too, we must eliminate alternatives until we arrive at the best hypothesis.TheMadFool

    This is completely different, because there is no guarantee that one of the hypotheses is the correct one. So concluding that one is "better" than the others does not guarantee that it is correct. And when we make such a judgement according to what is better, then it is implied that there is a specific purpose for the sake of which it is better. Therefore we forfeit "true" for "better". So now you've corrupted your principles from being a question of what is correct, 1 or 2 or 3 or 4, implying that one of them must be correct, to a question of which is best, 1 or 2 or 3 or 4, implying that there is no necessity of one being correct, but one might be better than the others, depending on what the hypothesis would be used for.

    Apparently, I haven't clearly conveyed that my intention is to understand "the real natural thing" instead of "the artificial thing which goes by the same name".Gnomon

    No, no, I recognize that you're beyond this, and I was agreeing with you on this point, offering Plato as an example to support what you were saying. What is "information" to a machine is completely different from what is "information" to a human being, because, as you say, the machine information still must be translated into human language in order to really make sense. So there is an extra level of separation. I think that what happens is that at each distinct level there is an inversion of importance, from the particular to the general, and then back again when you cross the next level.
  • Wayfarer
    22.3k
    Shannon's context is information transfer: I post - you read. We are used to a faithfully exact transfer; that what you read is exactly what I wrote, complete with typos. We are used in this context, to the elimination of noise. And this is done by the application of Shannon's theory.unenlightened

    Right - in my view, it does a lot, but it doesn't account for Life, the Universe, and Everything.

    Raw Energy is first transformed into active Life, and then into sensing Mind, and ultimately into knowing Consciousness.Gnomon

    Transformed by what, and how?

    In Spinoza's philosophy, which I'll take to be paradigmatic for philosophy generally in this case, the only real substance ('substance' being nearer in meaning to 'subject' or to 'being' than the current conception of 'substance') is self-caused, it exists in itself and through itself. In other words, it is not derived from anything, whereas everything else is derived from that. (This is Spinoza's doctrine of God as nature.)

    Quite what the 'uncaused cause' is, is forever a mystery - even to Himself, according to tradition. So, how do we know that? In a way, we can never know - except for revelation, which I understand you've already rejected (//or by divine illumination, or mystical union//). So, instead, take a representative sample of philosophies which point to 'that', and say that 'that' must be conceived in terms of 'information'.

    I've been perusing some articles on Shannon and found some reviews of, and excerpts from, James Gleick's book The Information (an excerpt here), which seems like one of the ur-texts for this set of ideas.

    I note one of the reviewers says of his book:

    In an epilogue called “The Return of Meaning,” Gleick argues that to understand how information gives rise to belief and knowledge, we have to renounce Shannon’s “ruthless sacrifice of meaning,” which required jettisoning “the very quality that gives information its value.” But Shannon wasn’t sacrificing meaning so much as ignoring it, in the same way that a traffic engineer doesn’t care what, if anything, the trucks on the highway are carrying. Once you start to think of information as something meaningful, you have to untether it from its mathematical definition, which leaves you with nothing to go on but the word itself. And in its ordinary usage, “information” is a hard word to get a handle on (even after a recent revision, the Oxford English Dictionary still makes a hash of its history). It’s one of those words, like “objectivity” and “literacy,” that enable us to slip from one meaning to the next without letting on, even to ourselves, that we’ve changed the subject.

    :up:
  • Gnomon
    3.7k
    I think that what happens is that at each distinct level there is an inversion of importance, from the particular to the general, and then back again when you cross the next level.Metaphysician Undercover
    Yeah! That's the ticket : "Inversion" -- a mental flip of the coin. When I said that Shannon's Information substituted "generality" for "specificity", I was referring to the meaning of communication. Shannon's technique was to eliminate the specific intended meaning of Words for enigmatic numerical Bytes. Digital information is conveyed in the abstract language of binary numbers that have the potential to encode any meaning. It's a sort of universal language. But Mathematics is divorced from concrete Reality, in that it is universal instead of specific. That's why String Theory makes sense to mathematicians, and not to laymen, but cannot be empirically tested in the real world.

    Therefore, in order to be meaningful to non-computers, that general (one size fits all) language must be translated (inverted) back into a single human language with a narrowly-defined (specified) range of meanings for each word. In its encoded form, the message is scrambled into apparently random noise, that could mean anything (1) or nothing (0). Ironically though, even chaotic randomness contains some orderly potential. And Shannon found the key to unlock that hidden Meaning in Boolean Algebra, which boils Significance down to its essence : 1 = True (meaningful) or 0 = False (meaningless).

    So, as you said, Shannon "inverted the importance" of Meaning in order to compress it down to its minimum volume. But the communication is not complete until it is converted back into verbose messy, often ambiguous, human language. Individually, the ones & zeros mean nothing more complex than the simple dichotomy of Either/Or. And that is also the ultimate goal of objective reductive physical Science. But subjective holistic metaphysical Philosophy's goal is to restore the personal meaning of knowledge.

    Shannon's reductive method : By focusing relentlessly on the essential feature of a problem while ignoring all other aspects.
    https://www.quantamagazine.org/how-claude-shannons-information-theory-invented-the-future-20201222/

    Physics & Metaphysics :
    Two sides of the same coin we call Reality. When we look for matters of fact, we see physics. But when we search for meaning, we find meta-physics. A mental flip is required to view the other side. And imagination is necessary to see both at the same time.
    http://blog-glossary.enformationism.info/page14.html

    Universal Language : https://www.bhp.com/sustainability/community/community-news/2017/07/learning-the-universal-language/
  • Gnomon
    3.7k
    Raw Energy is first transformed into active Life, and then into sensing Mind, and ultimately into knowing Consciousness. — Gnomon
    Transformed by what, and how?
    Wayfarer
    The world-creating Potential of the Big Bang Singularity was transformed (enformed) into Life, the Universe, and Everything by the power of EnFormAction. This is a novel notion, perhaps even radical. But it is being studied by serious scientists -- some of whom even entertain the taboo concept of Deity, or Panpsychism. I have simply translated that unconventional interpretation of Generic Information into a new myth of creation, that I call Enformationism. This is based on Einstein's theory of E = mc^2, and the current understanding of physicists that Information transforms into Energy, which transforms into Matter, and vice versa. See the Hypothesis below for the "how". :nerd:

    The mass-energy-information equivalence principle :
    https://aip.scitation.org/doi/10.1063/1.5123794

    Generic Information : Information is Generic in the sense of generating all forms from a formless pool of possibility -- the Platonic Forms.
    https://enformationism.info/phpBB3/viewtopic.php?f=3&p=837#p837

    EnFormAction :
    Ententional Causation. A proposed metaphysical law of the universe that causes random interactions between forces and particles to produce novel & stable arrangements of matter & energy. It’s the creative force of the axiomatic eternal deity that, for unknown reasons, programmed a Singularity to suddenly burst into our reality from an infinite source of possibility. AKA : The creative power of Evolution; the power to enform; Logos; Change.
    http://blog-glossary.enformationism.info/page8.html

    The EnFormAction Hypothesis : http://bothandblog3.enformationism.info/page23.html
  • Gnomon
    3.7k
    In Spinoza's philosophy, which I'll take to be paradigmatic for philosophy generally in this case, the only real substance ('substance' being nearer in meaning to 'subject' or to 'being' than the current conception of 'substance') is self-caused, it exists in itself and through itself. In other words, it is not derived from anything, whereas everything else is derived from that. (This is Spinoza's doctrine of God as nature.)Wayfarer
    Yes. My Enformationism thesis can be viewed as an update of Spinoza's worldview, in light of Quantum Physics, bottom-up Evolution, and Information Theory. :smile:

    Spinoza's Universal Substance : Like Energy, Information is the universal active agent of the cosmos. Like Spinoza's God, Information appears to be the single substance of the whole World.
    http://bothandblog2.enformationism.info/page29.html