• Benj96
    2.3k
    The greatest degree of information is found in the most random or irrational sequences.

    I find this strange and counterintuitive.

    Randomness seems aimless, useless. Entropy is disorder. And this seems so unpredictable and chaotic that it couldn't possibly hold much information compared to systems like life with organised sequences of DNA that confer complexity, and the ability to process information in order to survive etc.

    Yet, sequences that are predictable can be compressed to a simple formula which takes up little storage.

    Irrational numbers cannot. And they're infinite.
    So of the two, there's more information in entropy than there is in order.

    If you take pi or the golden ratio or Euler's number, for example, eventually it will detail your entire genetic sequence from start to finish. Statistically, given its randomness and infinity, it must contain this information at some point in its course.

    That gives me goosebumps.
  • Patterner
    1k
    I have read a few descriptions of Shannon's work, and cannot get the gist of things. (Sadly, same with Gödel. Very unhappy at my apparent inability.) But I believe the idea that you bring up is that the most random sequences CAN produce the most information. If you know a source can produce only strings of Xs, you will get no information. You will never get anything other than what you expect, which is the only thing you CAN get.

    But if the source can produce ANYTHING, then you can never know what you will receive, and what you receive will be full of information.
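    A quick way to see that numerically (a minimal Python sketch of my own; the two sources are just illustrative):

    ```python
    from math import log2

    def shannon_entropy(probs):
        # Shannon entropy in bits: sum of -p * log2(p) over the possible outcomes.
        return sum(-p * log2(p) for p in probs if p > 0)

    # A source that can only ever send "X": one outcome, probability 1.
    print(shannon_entropy([1.0]))          # 0.0 bits -- no surprise, nothing learned

    # A source that can send any of 26 letters, all equally likely.
    print(shannon_entropy([1/26] * 26))    # about 4.70 bits per symbol
    ```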
  • Wayfarer
    22.6k
    The greatest degree of information is found in the most random or irrational sequences.Benj96

    That's a misleading way of putting it. Random strings of characters are impossible to compress because they contain no order. An ordered string, say a sentence, follows rules, which enables you to compress it: the rules can be used to eliminate redundancies and repetition. Compression relies on patterns, repetition, and predictability, so the more predictable or ordered a sequence is, the more it can be compressed, reducing its entropy. Whereas a string of several thousand random characters can't be compressed at all - but, so what? That doesn't mean it contains 'more information'. It means it's harder to compress.
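    To make the compression point concrete, here's a rough Python sketch (my own toy example, nothing Shannon-specific): a rule-following, repetitive string collapses to almost nothing under a general-purpose compressor, while random bytes barely shrink at all.

    ```python
    import os
    import zlib

    ordered = b"the cat sat on the mat. " * 1000   # 24,000 bytes of repetitive, rule-following text
    noise = os.urandom(24000)                      # 24,000 bytes from the operating system's random source

    print(len(zlib.compress(ordered)))   # typically a few hundred bytes at most: the pattern is exploited
    print(len(zlib.compress(noise)))     # roughly 24,000 bytes or a little more: nothing to exploit
    ```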

    I think you might be applying Shannon's theory beyond its intended scope. Shannon's theory is fundamentally a mathematical framework designed to optimize the encoding and transmission of data over communication channels. It deals with the quantity of information, measuring how much information is present or how efficiently it can be transmitted and stored, without regard to the content's meaning or significance. It's tremendously fashionable to try and extract a metaphysic from it, but it really isn't one.
  • ssu
    8.6k
    The greatest degree of information is found in the most random or irrational sequences.Benj96
    ?

    If you take pi or the golden ratio or Euler's number, for example, eventually it will detail your entire genetic sequence from start to finish. Statistically, given its randomness and infinity, it must contain this information at some point in its course.Benj96
    Ah, I think this is the finding that infinite random strings, being infinite, also contain the text of Tolstoi's "War and Peace" written in binary code... because they're infinite.

    But this is a statistical probability. And notice when you have something infinite, then you have a problem with statistical probabilities.

    Yet this "information" in a random sequence is simply useless, because of the simple fact that you cannot handle infinite sequences. You can handle only finite parts of any random sequence. Thus the probability of finding your genetic sequence, or Tolstoi's "War and Peace" in binary form, or both, in some finite part of a random sequence is one we can round to 0, even if it's a very small positive number, and even if those finite parts can be quite big. Now, finding an exact sequence in very large finite strings is an interesting discussion in itself...
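    To put a rough number on "round that probability to 0" (a back-of-the-envelope Python sketch; the lengths are made up): the chance that a specific k-digit target appears somewhere in n random decimal digits is at most about n x (1/10)^k.

    ```python
    # Crude union-bound estimate for decimal digits: P(a fixed k-digit target appears
    # somewhere in n random digits) <= (n - k + 1) * (1/10)**k. Lengths are illustrative.
    def upper_bound(n_digits, k):
        return (n_digits - k + 1) * 0.1 ** k

    # Even a trillion random digits almost certainly miss one specific 100-digit target,
    # never mind a genome-sized one.
    print(upper_bound(10**12, 100))   # on the order of 1e-88
    ```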

    And can there be a random sequence that doesn't have your genetic sequence and Tolstoi's "War and Peace" in binary form? I guess so. Can I prove it? Absolutely not! With randomness you slip easily into the part of mathematics that is unprovable, uncountable, etc. Yet it's still math.

    Hence this is a bit of an illusion, I would say.
  • flannel jesus
    1.8k
    I am not sure a random number contains "information" necessarily just because some of its random sequence matches something else. Information is only useful if it's accessible - if pi contains your genetic sequence, that's not usable information because you have no idea where in pi to look for it. Can it be classed as "information" within pi if there's no plausible way for someone who wants that information to access it?

    Pi contains the information about the ratio of diameter to circumference - I'm not convinced of the other type of information.
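    You can poke at this directly, for what it's worth. A small sketch using the mpmath library (the digit count and the target string are arbitrary choices of mine): even a specific 10-digit string usually needs far more digits of pi than you'd ever bother computing before it shows up, which is the sense in which the "information" isn't really accessible.

    ```python
    from mpmath import mp

    mp.dps = 100_000              # work with roughly 100,000 digits of pi
    digits = str(mp.pi)[2:]       # drop the leading "3." to keep just the decimals

    target = "1234567890"         # an arbitrary 10-digit target
    print(digits.find(target))    # very likely -1: it probably doesn't appear this early
    ```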
  • ssu
    8.6k
    Pi contains the information about the ratio of diameter to circumference - I'm not convinced of the other type of information.flannel jesus
    At least, the fact that pi is a transcendental number means you cannot square the circle, as would otherwise be possible if it were a rational number (or a real algebraic number). So there's that information (if I got it right).

    Here perhaps the separation of 'information' and simple raw 'data' might help.

    A random string might contain, by chance (how else?), a stretch of data that looks like 'information'. For example, the binary string "01000001" is the digital code for "A". And we might find "01000001" in some random sequence and say "Hey! This random sequence has 'A' in it."
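    In Python that little decode is one line (just to underline that the "A" comes entirely from the ASCII convention we bring to the bits):

    ```python
    bits = "01000001"
    print(chr(int(bits, 2)))   # 'A' -- but only because we chose to read the bits as ASCII
    ```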
  • flannel jesus
    1.8k
    Yeah, that's kinda what I mean by usable. You can find the information in there, *if you already know exactly what the information you're looking for looks like*, but if you already know that, you don't need to look at pi to get it. You already have it.

    Usable information shouldn't require you to already know the information you're looking for in order to find it, right? It would be like a dictionary that requires you to know the definition of a word in order to find the definition of a word. That wouldn't be a useful dictionary at all.
  • Patterner
    1k
    Also, pi isn't random.
  • flannel jesus
    1.8k
    they're random in one sense and not in another. They're not RANDOM random, but they're distributed as if they were random, and they look as unpredictable ahead of time as they would if they really were random.

    Pi is effectively a seeded random number generator. A deterministic-yet-chaotic generator of digits.
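    The seeded-generator comparison can be made literal (a tiny sketch; the seed value is arbitrary): the output is fixed in advance by the seed, yet it looks statistically random, which is the sense in which pi's digits are "effectively random".

    ```python
    import random

    gen1 = random.Random(42)
    gen2 = random.Random(42)

    # Two generators started from the same seed emit exactly the same "random-looking"
    # digits, completely determined in advance -- much like the digits of pi.
    print([gen1.randint(0, 9) for _ in range(20)])
    print([gen2.randint(0, 9) for _ in range(20)])   # identical list
    ```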
  • Patterner
    1k

    We call it random, even though, no matter the size of the circle, we don't have to predict; we absolutely know what any digit will be?
  • flannel jesus
    1.8k
    there's more than one flavour of random. That's why I'm comparing it to a seeded random generator, and bringing up chaos - chaotic deterministic systems can be "effectively random" even if they're not actually genuinely random
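    The textbook toy example of that is the logistic map (a small sketch, starting values picked arbitrarily): every step is fully determined by the previous one, yet the output looks noise-like and two nearly identical starting points diverge completely.

    ```python
    def logistic(x):
        # One step of the logistic map at r = 4: deterministic, but chaotic.
        return 4.0 * x * (1.0 - x)

    x, y = 0.2, 0.2000001        # two almost-identical starting points
    for _ in range(30):
        x, y = logistic(x), logistic(y)

    print(x, y)                  # after 30 steps the two trajectories bear no resemblance
    ```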
  • Apustimelogist
    584
    It depends what you mean by information. The word I think is used so loosely that you may be able to solve this issue purely by working out your semantics.

    I think Shannon entropy can be described in terms of either an observer's uncertainty about the outcome they will get out of some system/random variable, or in terms of the message-generating capacity of that system (the more messages it can produce, the greater the entropy).

    I guess the last description could reasonably be a way of describing how we think of information, but information as we semantically use it is also about the notion of reducing uncertainty. The more information you have gained from an observation, the more uncertainty you have reduced. So in that sense it could also be conceptualized as almost the inverse of entropy.
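    As a toy illustration of "information as reduced uncertainty" (my own made-up numbers, nothing canonical): the bits gained from an observation can be read as entropy-before minus entropy-after.

    ```python
    from math import log2

    def entropy(probs):
        # Shannon entropy in bits of a discrete distribution.
        return sum(-p * log2(p) for p in probs if p > 0)

    prior = [0.25, 0.25, 0.25, 0.25]    # four equally likely hypotheses: 2 bits of uncertainty
    posterior = [0.5, 0.5, 0.0, 0.0]    # an observation rules out two of them: 1 bit remains

    print(entropy(prior) - entropy(posterior))   # 1.0 bit of information gained
    ```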

    Either way, I guess a key point is that when hearing people talk about information with regard to entropy, one should interpret them as talking just about the mathematical meaning of entropy first and foremost in order to understand what they are saying, rather than paying attention to the word 'information' which is often not being used in any specific way other than to refer to the mathematical usage. Care needs to be taken moving from the mathematical notion to the casual semantics of information which may be very different.

    (Edited: just clarified some bits in the last section, no pun intended)
  • Count Timothy von Icarus
    2.8k
    There seem to be two different conceptions of information being mixed together here.

    The shortest possible way to write a program that produces a given string is called its Kolmogorov Complexity or algorithmic entropy. Shannon Entropy, by contrast, takes a given string and measures the amount of surprise in it, based on how likely the string is relative to some background distribution.
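    The contrast is easy to see in miniature (a sketch of mine, not a definition; real Kolmogorov complexity is uncomputable): one million-character string is generated by a tiny expression, while a random one essentially has no description shorter than itself.

    ```python
    import random

    # Low algorithmic complexity: a million-character string produced by a ~15-character expression.
    ordered = "01" * 500_000

    # High algorithmic complexity (with overwhelming probability): no program much shorter
    # than the string itself will reproduce this exact sequence of coin flips.
    noisy = "".join(random.choice("01") for _ in range(1_000_000))

    print(len(ordered), len(noisy))   # both 1,000,000 characters; their shortest descriptions differ wildly
    ```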

    Questions about how to view statistics (frequentism vs propensity vs Bayesianism vs logicalism, etc.) affect how we interpret information.

    From the perspective of Shannon Entropy, you might say that a computation that outputs pi up to some very high number of decimals produces zero information. Even though there is a very large number of digits, they all occur where they occur with a probability equal to 100% given the program input. This ties into the "scandal of deduction," the finding that deterministic computation/deduction produces no new information.

    It's also worth noting that the program that outputs pi doesn't really contain your genome. Information is necessarily relational. We could map your genome onto any string with a sufficient amount of variance, but such information doesn't exist "in itself."

    A simple random bit generator produces all possible finite strings given enough time, but that doesn't mean that the Kolmogorov Complexity of all strings is equal to that of the simplest random string generator. You need a program that will output some string, e.g. your genome, and JUST that thing. So, while a program that outputs pi might output very many possible encodings of your genome, the program still needs some way to recognize that encoding, halt, and output it. So the information in your genome isn't really "in" the program that outputs pi, any more than a random string generator "contains" the information for all possible programs/strings.


    I wrote an article on this a while back for 1,000 Word Philosophy, although they weren't interested in the topic.

    https://medium.com/@tkbrown413/introducing-the-scandal-of-deduction-7ea893757f09

    And a deeper dive:

    https://medium.com/@tkbrown413/does-this-post-contain-any-information-3374612c1feb




    I think there is a ton of relevance to metaphysics, it's just that bad inferences are sometimes made. Paul Davies has a great anthology called "Information and the Nature of Reality," with entries by Seth Lloyd, Terrence Deacon, and others, including philosophers and theologians, that is quite good.

    Information theory has allowed for a unification of disparate fields, from physics to biology to economics to cognitive science. This alone makes it of philosophical merit, a set of general principles that has explanatory and predictive power across the social, life, and physical sciences.
  • Patterner
    1k
    I wrote an article on this a while back for 1,000 Word Philosophy, although they weren't interested in the topic.

    https://medium.com/@tkbrown413/introducing-the-scandal-of-deduction-7ea893757f09
    Count Timothy von Icarus

    Hey! Your last name isn't von Icarus!!


    :lol:
  • Wayfarer
    22.6k
    Paul Davies has a great anthology called "Information and the Nature of Reality," with entries by Seth Lloyd, Terrence Deacon, and others, including philosophers and theologians, that is quite good.Count Timothy von Icarus

    I'm reading his Demon in the Machine at the moment, and I've been reading Deacon. But I'm still dubious that 'information' has fundamental explanatory power - because it's not a metaphysical simple. The term 'information' is polysemic - it has many meanings, depending on its context.

    I have a suspicion that a famous Norbert Wiener quote, in Cybernetics, is behind a lot of this theorisation. He said “Information is information, not matter or energy. No materialism which does not admit this can survive at the present day." So what do we do? Admit it, and start developing a metaphysics, so-called, which accommodates it. And just as 'the machine' became the prevailing metaphor during the industrial era, so 'information' becomes the prevailing metaphor during the information age. But it's still entertained within a generally physicalist framework, at least for a lot of people (although maybe the times they really are a'changin'.)

    I guess a key point is that when hearing people talk about information with regard to entropy, one should interpret them as talking just about the mathematical meaning of entropy first and foremost in order to understand what they are saying, rather than paying attention to the word 'information' which is often not being used in any specific way other than to refer to the mathematical usage.Apustimelogist

    There's an amusing anecdote I often re-tell in this context about the connection between information and entropy. A famous conversation is said to have occurred around 1941, during Shannon’s postdoctoral research fellowship year at the Institute for Advanced Study, Princeton, New Jersey, in a discussion between Claude Shannon and John von Neumann. Von Neumann was one of the main faculty members, and during this period Shannon was wavering on whether to call his new logarithmic statistical formulation of data in signal transmission ‘information’ or ‘uncertainty’ (of the Heisenberg type). Von Neumann suggested that Shannon rather use the ‘entropy’ of thermodynamics, because (a) the statistical-mechanics version of the entropy equations has the same mathematical form, and (b) nobody really knows what entropy really is, so he would have the advantage in winning any arguments that might occur (source).

    The concept of entropy also became combined with Erwin Schrödinger's 'negentropy' from his What is Life? lecture series. It's become part of the conceptual architecture of bio-informatic philosophy.
  • Patterner
    1k
    But I'm still dubious that 'information' has fundamental explanatory powerWayfarer
    I've been thinking about something lately. I assume it's part of some field of study or other, but I don't know enough to know which. I only thought along these lines after learning about Barbieri when I came here, so I guess it's a part of bio-semiotics.

    Some information is what I might call passive. Books are a good example. Books are filled with information. We know what the squiggles on the page represent, because we invented the information systems of language and writing. So when we read a book, we can take that information in, and learn many things.

    But that can always be, and often is, the end of it. We can entirely ignore and disregard the information we've gained. We can think about it, but still choose to not do anything related to it.

    DNA is a different kind of information. It represents chains of amino acids and proteins. It seems to compel action. Those amino acids and proteins are manufactured. The information is not interpreted by thinking beings like us, although we have come to be able to interpret this information system that we did not create. The things that interpret the information encoded in the base pairs don't seem to have a choice about whether or not to act on it. The information compels action. Instead of passive information, it is... what... Compulsory information? Whatever it's called, isn't that some explanatory power?
  • Wayfarer
    22.6k
    Whatever it's called, isn't that some explanatory power?Patterner

    100%. I guess I didn't explain myself very well. What I was getting at is the issue of treating information as if it is a reduction base, in the way that matter was supposed to have been by materialism. But of course DNA encodes biological information and that information is causative, there's no disputing that. But biological information is very different from the information encoded in binary on a computer, or written content.
  • Patterner
    1k
    I guess I didn't explain myself very well.Wayfarer
    Your audience, in this case, is more likely the problem. :lol:


    But biological information is very different from the information encoded in binary on a computer, or written content.Wayfarer
    Yes, very different. Our information is not compulsory. (I think I like active information better. But what is it actually called?) Have we created active information systems? That might help with making artificial consciousness.
  • L'éléphant
    1.6k
    I don't know about this OP. It is uncharacteristic of @Benj96 topics.


    The greatest degree of information is found in the most random or irrational sequences.Benj96
    But no support for it was provided.
  • Gnomon
    3.8k
    The greatest degree of information is found in the most random or irrational sequences.
    I find this strange and counter intuitive.
    Benj96
    That common misunderstanding of Information theory is indeed counterintuitive, because we know from experience that randomness is the antithesis of meaning-bearing Information. But Shannon was not claiming that random sequences are inherently meaningful. Instead, he compared mental Information to physical Entropy, and noted that it is "surprising" to find meaningful information in random patterns*1. That eye-opening distinction of meaning from background noise is what semiotician & cyberneticist Bateson called "the difference that makes a difference"*2. The first "difference" is the Surprise, and the second is the Meaning.

    According to the second law of thermodynamics, all order ultimately decays into disorder. And yet, here we stand on a tiny exception to that rule in the vast universe : the pocket of organization we call home. As far as we know, this is the only instance of Life & Mind in the universe. Ironically, some thinkers miss the exceptional nature of Information, and are still looking for communications from little green men, or the advanced race of San-Ti, out there in the near infinite crucible of random accidents. Information is not accidental. :nerd:

    *1. Information is the surprising exception to common randomness :
    The core idea of information theory is that the "informational value" of a communicated message depends on the degree to which the content of the message is surprising.
    https://en.wikipedia.org/wiki/Entropy_(information_theory)

    *2. Information as a difference :
    We propose a difference theory of information that extends Gregory Bateson’s definition that information is any difference that makes a difference.
    https://www.tandfonline.com/doi/full/10.1080/0960085X.2019.1581441

    CAN YOU SEE THE DIFFERENCE ?
    [image: static.png - Surprising Signal within Meaningless Noise]
  • Wayfarer
    22.6k
    Instead, he compared mental Information to physical Entropy.Gnomon

    Did Shannon ever write or say anything about 'mental information'? And have you read the origin of how Shannon came to start using the term 'entropy' in relation to information, on the prompting of Von Neumann, who was a peer, and who said one of the advantages of using the term is because he would always win in arguments if he used it, because 'nobody knows what it means'?
  • jgill
    3.9k
    If you take pi or the golden ratio or Euler's number, for example, eventually it will detail your entire genetic sequence from start to finishBenj96

    Reference?
  • Wayfarer
    22.6k
    it’s being typed out by a million monkeys even as we speak ;-)
  • ssu
    8.6k
    I wrote an article on this a while back for 1,000 Word Philosophy, although they weren't interested in the topic.Count Timothy von Icarus

    Interesting read. I think there's one absolutely fundamental issue at play in this discussion: as you point out in Introducing the Scandal of Deduction, conclusions have to be implicit in the premises. It's all quite deductive and, as you say, "no new information is generated by deduction".

    Randomness cannot be so. Something random creates that "information" all the time, as long as the random process/string is continued. In fact I'm coming to the conclusion that something random has to follow a certain rule in order to achieve Kolmogorov-Chaitin randomness/algorithmic entropy. That rule/algorithm is also used in the Turing Machine example and other undecidability results: negative self-reference. This can be seen from the complexity of the strings.

    A rational number like 4/11 in decimal form is 0.3636363636... and it has a very low complexity. To write its decimal form you have the rule "After zero and the decimal point, write 36 and repeat it forever". Another rational number can also have low complexity without being as easy to write as 4/11: say the rule for its decimal representation is "After zero and the decimal point, write all the telephone numbers in the 1973 Seattle telephone book in the order they are listed and repeat it forever". Again a rational number with low complexity, as that rule wasn't long to write.
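    The short rule can be written out literally (a toy sketch using the 4/11 example above): the two-character rule and the actual long division generate exactly the same digits.

    ```python
    # The "rule" for 4/11's decimals: after "0.", write 36 and repeat.
    rule_digits = ("36" * 10)[:20]

    # Cross-check by actually doing the long division.
    division_digits, remainder = [], 4
    for _ in range(20):
        remainder *= 10
        division_digits.append(str(remainder // 11))
        remainder %= 11

    print("0." + rule_digits)                  # 0.36363636363636363636
    print("0." + "".join(division_digits))     # identical: the short rule captures the whole number
    ```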

    For a random string there cannot be any algorithm that gives all the information of the number with less complexity. A random string can be presented only by itself, in its entirety, to give all the information. So how can that be done? Only a sequence that doesn't repeat itself can be random. Hence it has to refer to itself in the negative: if you start from an arbitrary point, then what came before cannot be repeated later or define what comes later. Hence the negative self-reference.

    So basically the question is: what are those Gödel numbers that are undecidable? At least some of them are the numbers that are random and quite unique to themselves.

    This is my brainstorming, and the above can easily be wrong or hard to follow. I do hope for some remarks, though.

    As you said in the piece, "Likewise, our knowledge of mathematics comes from experience. Axioms are experimental truths; generalizations from observation.", this is the real question here. Are our axioms correct? Or are we missing some "self-evident" things in mathematics? In my view we obviously are: there simply is a realm of mathematics that is uncomputable, where easy deductive proofs aren't possible. Would there thus be an axiom for this?

    I think there could be one.
  • Gnomon
    3.8k
    Did Shannon ever write or say anything about 'mental information'? And have you read the origin of how Shannon came to start using the term 'entropy' in relation to information, on the prompting of Von Neumann, who was a peer, and who said one of the advantages of using the term is because he would always win in arguments if he used it, because 'nobody knows what it means'?Wayfarer
    No, Shannon was not concerned with the metaphysical aspects of Information. He was focused on physically communicated Data, not metaphysically (semiotics, metaphors, analogies) communicated Meaning. In my post I used the "mental" term to distinguish Metaphysical from Physical forms of information. In my Enformationism thesis, I refer to Generic Information (universal power to transform) as a "Shapeshifter", taking-on many physical and metaphysical forms in the world. That notion is based, in part, on Einstein's E=mc^2 formula. In my view, Causal Energy is merely one of many forms of Generic Information (EnFormAction).

    Yes, I'm aware of the "entropy" backstory. That abstruse term may be responsible for the common misunderstanding that Information is essentially random. It is, instead, the Order within Chaos ; the Surprise within Randomness ; the Relevant bits within the mass of Irrelevant bytes. :smile:


    von Neumann Versus Shannon Entropy : What is the difference between entropy and Shannon entropy?
    The intuition for entropy is that it is the average number of bits required to represent or transmit an event drawn from the probability distribution for the random variable. … the Shannon entropy of a distribution is the expected amount of information in an event drawn from that distribution.
    https://machinelearningmastery.com/what-is-information-entropy/
  • Gnomon
    3.8k
    I'm reading his Demon in the Machine at the moment, and I've been reading Deacon. But I'm still dubious that 'information' has fundamental explanatory power - because it's not a metaphysical simple. The term 'information' is polysemic - it has many meanings, depending on its context.Wayfarer
    Your ambiguity (uncertainty) about Deacon's novel notion of Information as Absence is understandable, because Shannon defined his kind of Information*1 as the presence of detectable data. The essence of his statistical definition is the Probability ratio of 0% (nothing) to 100% (something) : 0/1 or 1/0, and everything in between. So, information is like a quantum particle in that a Bit exists only as a Possibility until measured (understood). Moreover, several quantum theorists concluded that Probability (not yet real) was converted into Certainty by the evaluation of an inquiring mind. That sounds magical & mystical, but the scientists' intentions were pragmatic*2.

    Deacon saw what others missed in the statistical nature of Information : Probability is nothing until Actualized somehow. But that nothingness (absence) is the metaphysical power behind all Change (causation) in the world. He referred to that invisible power as "Constitutive Absence"*3, which is the capability (force + control) to construct something from scratch. The most familiar causal power in science is labeled "Energy", and defined as the ability to do useful work. Yet the substance of that power is left undefined, because it is not a material object, but more like a metaphysical force, which is knowable only after it has done its work and moved on.

    In my personal information-based thesis, I merged several of those polysemantic applications of "Information" into a single "metaphysical simple"*4 : the power to transform. Which I labeled EnFormAction to make it signify Energy + Form + Action. I'm sure that Deacon has never heard of that term, and he may not think of his Constitutive Absence as a metaphysical concept. But I think it can be used to narrow down the various meanings of Information to something like a philosophical Atom, brimming with potential causal power. :smile:


    *1. What is Shannon information? :
    Although the use of the word ‘information’, with different meanings, can be traced back to antique and medieval texts (see Adriaans 2013), it is only in the 20th century that the term begins to acquire the present-day sense. Nevertheless, the pervasiveness of the notion of information both in our everyday life and in our scientific practice does not imply the agreement about the content of the concept. As Luciano Floridi (2010, 2011) stresses, it is a polysemantic concept associated with different phenomena, such as communication, knowledge, reference, meaning, truth, etc.

    https://philsci-archive.pitt.edu/10911/1/What_is_Shannon_Information.pdf

    *2. Information, What is It? :
    Deacon addresses many of those self-referencing feedback-loop mind-bogglers in his book. But perhaps the most fundamental enigma is the ultimate “nature” of Information itself. The original usage of the term was primarily Functional, as the content of memory & meaning. Then Shannon turned his attention to the Physical aspects of data transmission. Now, Deacon has returned to the most puzzling aspect of mental function : Intentions & Actions. For example : a> how one person’s mind can convey meaning & intentions to another mind; b> how a subjective intention (Will) can result in physical changes to the objective world? How can invisible intangible immaterial (absent) ideas cause physical things to move & transform? Occultists have imagined Mind as a kind of mystical energy or life-force (Chi; psychokinesis) that can be directed outward into the world, like a laser beam, to affect people and objects. But Deacon is not interested in such fictional fantasies. Instead, he tries to walk a fine line between pragmatics & magic, or physics & metaphysics.
    http://bothandblog4.enformationism.info/page26.html {click here}
    Note --- Jesus told his disciples that Faith can move mountains. But his brother James explained “Show me your faith without the works, and I will show you my faith by my works.” Today, if you want to move a mountain, it helps to make use of dynamite and earth-moving equipment : pragmatic faith.


    *3. Constitutive : having the power to establish or give organized existence to something.
    ___Oxford Languages

    *4. Metaphysical Simple : an immaterial atom ; a non-physical element ; the fundamental constituent of a complex structure
  • Wayfarer
    22.6k
    Hey, I've quoted your response to me in the Abiogenesis thread to here, as it's more relevant to this topic, and as we're discussing Terrence Deacon in both threads.

    It occurs to me that maybe you could say that Deacon is trying to establish the linkage between physical and logical causation. Ran it by ChatGPT, it says that it's feasible.
    — Wayfarer

    I had to Google "logical causation". What I found was not very enlightening*1.

    Apparently, Logical Causation is what Hume said was "unprovable"*2, perhaps in the sense that a logical relationship (this ergo that) is not as objectively true as an empirical (this always follows that) demonstration. Logic can imply causation in an ideal (subjective) sense, but only physics can prove it in a real (objective) sense.

    Of course, even physical "proof" is derived from limited examples. So any generalization of the proven "fact" is a logical extrapolation (subjective) from Few to All, that may or may not be true in ultimate reality. I suppose it comes down to the definitional difference between Ideal (what ought to be) and Real (what is) causation. How is linking the two realms (subjective logic and objective science) "feasible"? Isn't that where skeptics confidently challenge presumably rational conclusions with "show me the evidence"?

    Do you think Deacon's "constitutive absence"*3 is the missing link between Logical truth and Empirical fact*4 regarding Abiogenesis? I'm afraid that proving a definite connection is above my pay grade, as an untrained amateur philosopher. What is ChatGPT's philosophical qualification? :grin:

    *1. What is the difference between logical implication and physical causality? :
    Logical implication refers to the relationship between two statements where the truth of one statement guarantees the truth of the other. Physical causality, on the other hand, refers to the relationship between events where one event is the direct cause of another event.
    https://www.physicsforums.com/threads/logical-implication-vs-physical-causality.1015629/

    *2. Hume Causation :
    Hume saw causation as a relationship between two impressions or ideas in the mind. He argued that because causation is defined by experience, any cause-and-effect relationship could be incorrect because thoughts are subjective and therefore causality cannot be proven.
    https://study.com/academy/lesson/the-metaphysics-of-causation-humes-theory.html

    *3. Causation by Constitutive Absence :
    According to Deacon, the defining property of every living or psychic system is that its causes are conspicuously absent from the system
    https://footnotes2plato.com/2012/05/23/reading-incomplete-nature-by-terrence-deacon/

    *4. Causal and Constitutive explanation :
    It is quite natural to explain differences or changes in causal capacities by referring to an absence of certain components or to their malfunction. . . . .
    most philosophers of explanation recognize that there is an important class of non-causal explanations, although it has received much less attention. These explanations are conventionally called constitutive explanations
    file:///C:/Users/johne/Downloads/Causal_and_constitutive_explanation_comp.pdf
    Gnomon
  • Wayfarer
    22.6k
    Do you think Deacon's "constitutive absence"*3 is the missing link between Logical truth and Empirical fact*4 regarding Abiogenesis?Gnomon

    The relationship of logical necessity and physical causation is a deep topic and one of interest to me. It is fundamental to Hume’s ‘Treatise’ where he argues that deductive proofs are true as a matter of definition, whilst facts derived from observation have no such necessity. It’s the distinction between observed facts (a posteriori, known by experience) and deductions (a priori, matters of definition), pretty much in line with your point 2 above.

    But then, Kant believed that his ‘answer to Hume’ dealt with the issue through the concept of facts that are 'synthetic a priori' (definition).

    I suppose it comes down to the definitional difference between Ideal (what ought to be) and Real (what is) causation.Gnomon

    I think where it shows up is in Wigner's unreasonable effectiveness of mathematics in the natural sciences. Look at the way that physics applies mathematical logic to physical objects and forces. It is fundamental to modern scientific method. This is, of course, a dense metaphysical question, and generally speaking modern philosophy is averse to metaphysics.

    See my earlier thread Logical Necessity and Physical Causation for a discussion.

    My point regarding Terrence Deacon in particular is that it appears to me that he's attempting to bridge this gap by explaining how it is that symbolic logic can arise as both a consequence and cause in the physical order of things.
  • Gnomon
    3.8k
    The relationship of logical necessity and physical causation is a deep topic and one of interest to me. . . . .
    I think where it shows up is in Wigner's unreasonable effectiveness of mathematics in the natural sciences.
    Wayfarer
    I have come to think of human-constructed Mathematics as our synthetic imitation of the natural Logic of the universe. By that I mean, chemistry/physics is an expression of fundamental Logic in the substance of Matter (functional organization) and the action of Energy ("physical causation"). Another way to put it is that "all Math is Geometry", where we can extend the geo-centric label to include all causal & formal inter-relationships in the entire Cosmos.

    If so, then the "effectiveness" of Math, in scientific endeavors, is an indication that we humans have correctly interpreted the Logic of the universe --- "constraining affordances" --- as natural laws and mathematical ratios. Hence, our artificial "designs" (e.g. computers) are workable as information processors, even though they may not yet be literally "semantic engines" (apologies to ChatGPT). :smile:


    The Logic of Information: A Theory of Philosophy as Conceptual Design
    by Luciano Floridi
    This is a book on the logic of design and hence on how we make, transform, refine, and improve the objects of our knowledge. The starting point is that reality provides the data, to be understood as constraining affordances, and we transform them into information, like semantic engines.
    https://academic.oup.com/book/27824

    Randomness & Information : inverse logical/mathematical relationship
    In a statistical mechanics book, it is stated that "randomness and information are essentially the same thing," which results from the fact that a random process requires high information. . . . .
    But, later it says that entropy and information are inversely related since disorder decreases with knowledge. But, this does not make sense to me. I always thought that entropy and randomness in a system were the same thing.

    https://physics.stackexchange.com/questions/447465/understanding-entropy-information-and-randomness
    Entropy & randomness are directly related; but randomness & semantics (meaning ; useful information) are inversely related.