• 180 Proof
    15.3k
    I see no reason to do that. Either succinctly express your disagreement with something I have written that you wish for me to further elaborate on or agree to disagree and leave it there.
  • ucarr
    1.5k


    If you're refusing to read Tarskian's posts linking to:

    "Entropy, heat, and Gödel incompleteness", 2014, by Karl-Georg Schlesinger,Tarskian

    it's not obvious to me why you see no reason to refute Schlesinger. The title of his paper makes it clear he's worked on the question of a causal link connecting entropy and Gödel incompleteness, the very focus of my question to you.

    ...is there a logically sound argument claiming there is a causal relationship between entropy and incompleteness?ucarr

    No.180 Proof

    ...succinctly express your disagreement with something I have written that you wish for me to further elaborate on...180 Proof

    How does this differ from what I've asked of you?
  • Tarskian
    658
    Interesting observation. I'm not sure it takes G-incompleteness to reach this point.jgill

    I think that the connection Schlesinger sees is tied more directly to Chaitin's incompleteness theorem than to Gödel's theorem. (But then again, Gödel is provable from Chaitin.)

    We can view Peano arithmetic theory as a compressed image of arithmetical reality. Proving from PA amounts to decompressing some information about PA's reality out of its theory.

    But then again, there is a serious mismatch in the amount of information between the compressed and uncompressed states of PA's reality.

    The compressed state (its theory) contains only a small fraction of the total amount of information in the uncompressed state (its reality).

    Our problem, however, is that we cannot see directly the decompressed state of PA, i.e. its reality. The only way to see some of it is by decompressing it.

    This decompression mechanism can easily mislead us.

    That is why, until Gödel's 1931 publication, the positivists even insisted that we could decompress the totality of arithmetical reality from its compressed state (its theory). They really believed it.

    Positivism (and scientism) therefore amounts to the misguided belief that PA (or science) is a lossless compression of arithmetical (or physical) reality.

    I think Schlesinger makes sense.

    If the forward direction of a phenomenon incorporates information that cannot be decompressed from its theory, then it will also be impossible to decompress the information needed to reverse it, rendering the phenomenon irreversible.
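
    As a toy illustration of this (my own sketch, not Schlesinger's formalism): a lossy compressor maps many distinct states onto the same compressed form, so the compressed form alone cannot tell us which state to reverse back to.

        # Toy sketch: a lossy compressor maps many distinct "realities"
        # onto one compressed "theory"; the theory alone cannot single out
        # the reality it came from, so the step cannot be undone.

        def compress(reality: list[float]) -> list[int]:
            """Lossy step: keep only the integer part of each value."""
            return [int(x) for x in reality]

        reality_a = [1.7, 2.2, 3.9]
        reality_b = [1.1, 2.8, 3.4]

        # Both realities compress to the same theory...
        assert compress(reality_a) == compress(reality_b) == [1, 2, 3]

        # ...so decompression is underdetermined: [1, 2, 3] has infinitely
        # many candidate preimages, and no algorithm can pick the right one.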

    The only problem I have is that this view makes the details of the compression algorithm (the underlying theory) a bit too fundamental for my taste.
  • 180 Proof
    15.3k
    You have not given me any reason to read someone else's thoughts on the matter. Make your philosophical case, ucarr, and I will respond.
  • apokrisis
    7.3k
    If the forward direction of a phenomenon incorporates information that cannot be decompressed from its theory, then it will also be impossible to decompress the information needed to reverse it, rendering the phenomenon irreversible.

    The only problem I have is that this view makes the details of the compression algorithm (the underlying theory) a bit too fundamental for my taste.
    Tarskian

    This is why natural philosophy also recognises accidents or spontaneity in its metaphysics.

    The ball perfectly poised on Norton’s dome can never start to roll down the slope if we were to believe only in Newton’s algorithmic description.

    But the dynamicist will say that in a poised system, any fluctuation at all is going to break the symmetry spontaneously. There is always going to be some vibration. Any vibration. We can call that a determining factor but really it is just the inevitability of there being an accident. The accidental can't in fact be removed from the world, even if axiomatic determinism would have us believe otherwise.
  • Tarskian
    658
    Is there any literature that examines questions about the relationship between Heisenberg Uncertainty and Gödel Incompleteness?ucarr

    Yes.

    Calude & Stay, 2004, "From Heisenberg to Gödel via Chaitin."

    https://link.springer.com/article/10.1007/s10773-006-9296-8#preview

    In 1927 Heisenberg discovered that the “more precisely the position is determined, the less precisely the momentum is known in this instant, and vice versa.” Four years later Gödel showed that a finitely specified, consistent formal system which is large enough to include arithmetic is incomplete. As both results express some kind of impossibility it is natural to ask whether there is any relation between them, and, indeed, this question has been repeatedly asked for a long time. The main interest seems to have been in possible implications of incompleteness to physics. In this note we will take interest in the converse implication and will offer a positive answer to the question: Does uncertainty imply incompleteness? We will show that algorithmic randomness is equivalent to a “formal uncertainty principle” which implies Chaitin’s information-theoretic incompleteness. We also show that the derived uncertainty relation, for many computers, is physical. In fact, the formal uncertainty principle applies to all systems governed by the wave equation, not just quantum waves. This fact supports the conjecture that uncertainty implies algorithmic randomness not only in mathematics, but also in physics.

    Just like in Schlesinger's paper, Calude & Stay switched from Gödel's incompleteness to Chaitin's incompleteness. Gödel is provable from Chaitin. However, Chaitin's theorem seems to possess better explanatory power when dealing with entropy or fundamental uncertainty.

    So, if a theory is the compressed image of a particular (uncompressed) reality, there is potentially a mismatch between the amount of information contained in the theory and the amount of information contained in its reality. A theory (capable of arithmetic) contains substantially less information than the uncompressed reality that it describes. Such a theory is necessarily a lossy compression.

    Proving from theory is equivalent to decompressing some information about its reality out of its compressed theory.

    Heisenberg discovered that it is not possible to simultaneously "decompress" precise position and precise momentum information for a particle. So, it looks like Chaitin's incompleteness all over again.

    I do not see this phenomenon as "randomness", though.

    There is absolutely nothing random about arithmetic theory or arithmetical reality. Natural-number arithmetic is a completely deterministic system. Its arithmetical reality is indeed largely unpredictable but it is not random at all.
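
    A concrete example of determinism without predictability (my illustration, not taken from the papers): the Collatz rule. Every step is fixed by a trivial formula, yet whether every starting number eventually reaches 1 is still an open problem.

        # The Collatz rule: completely deterministic, yet its long-run
        # behavior is famously resistant to prediction or proof.

        def collatz_steps(n: int) -> int:
            """Count applications of the fixed rule until n reaches 1."""
            steps = 0
            while n != 1:
                n = 3 * n + 1 if n % 2 else n // 2
                steps += 1
            return steps

        print([collatz_steps(n) for n in range(1, 11)])
        # [0, 1, 7, 2, 5, 8, 16, 3, 19, 6] -- the step counts jump around
        # erratically even though nothing random happens anywhere.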

    Similarly, there is absolutely no need for the physical universe to be random, for it to be largely unpredictable. It could be, but it does not have to be.
  • Tarskian
    658
    The accidental can’t be in fact removed from the world, even if that is not what axiomatic determinism wants us to believe.apokrisis

    Positivism and scientism incorrectly claim this.

    Axiomatic determinism does not claim this.

    Not at all, in fact.

    On the contrary, axiomatic determinism fully acknowledges that an axiomatic system (capable of arithmetic) is at best a lossy compression of the reality that it describes. This lossy compression necessarily forgets most of the information contained in the uncompressed reality.

    It is simply not possible to decompress and reconstruct the totality of all the information about reality out of an axiomatic system that describes it (if this axiomatic system is capable of arithmetic). That is exactly what Chaitin (and Gödel) prove about such systems.

    But then again, it also does not mean that the information forgotten in the compression is "accidental" or "random". It does not even need to be. There is nothing random about arithmetical reality, while it is still full of unpredictable facts.

    Randomness is not a necessary requirement for unpredictability. Incompleteness alone is already sufficient. A completely deterministic system can still be mostly unpredictable.
  • apokrisis
    7.3k
    Similarly, there is absolutely no need for the physical universe to be random, for it to be largely unpredictable. It could be, but it does not have to be.Tarskian

    It makes more sense to see randomness and determinism as the complementary limits on being. Each limit can be pushed to its extreme, but only in an effective sense, not in an absolutist sense.

    Incompleteness raises much angst in the determinist. But it only takes an infinitesimal grain of chance to complete things.
  • apokrisis
    7.3k
    But then again, it also does not mean that the information forgotten in the compression is "accidental" or "random". It does not even need to be. There is nothing random about arithmetical reality, while it is still full of unpredictable facts.Tarskian

    The picture I have in mind goes beyond just a lossy compression - although that is a way to view it. In the hierarchy theory view, the determined and the random become the global constraints and the local freedoms. The point of this difference is that the freedoms rebuild the constraints. They are the two sides of the one whole and hence have a holistic completeness.

    Steven Frank wrote this nice paper which indeed argues your point that it doesn’t matter if the fine grain is considered to be deterministic or random. What matters is that microstates can be described by macroscopic constraints as they are freedoms that can’t help but rebuild their global equilibrium.
  • Tarskian
    658
    But it only takes an infinitesimal grain of chance to complete things.apokrisis

    When a compression algorithm forgets particular facts, it says much more about this algorithm than about these facts. Another algorithm may even include them. The term "chance" points to facts forgotten by the compression algorithm. I do not see that qualification as particularly fundamental.
  • apokrisis
    7.3k
    When a compression algorithm forgets particular facts,Tarskian

    Now you are making points about variety in types of compression algorithms, not about general principles.
  • Tarskian
    658
    Now you are making points about variety in types of compression algorithms, not about general principles.apokrisis

    That is actually also what Steven Frank does in his paper "The common patterns of nature". He argues that the algorithm -- in this case, the statistical distribution -- that maximizes entropy will dominate particular situations.

    A statistical distribution compresses the information about a sample into just a few parameters. It is lossy. It will generally not succeed in decompressing these few parameters back into the full sample.
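
    A small numerical sketch of that lossiness (my toy setup, not Frank's): fit a normal distribution to 10,000 skewed data points, then try to "decompress" the two fitted parameters back into data.

        import numpy as np

        rng = np.random.default_rng(0)
        sample = rng.exponential(scale=2.0, size=10_000)     # the "reality"

        mu, sigma = sample.mean(), sample.std()              # compress: 2 numbers
        reconstruction = rng.normal(mu, sigma, size=10_000)  # decompress

        # The two parameters forget, e.g., that the data were never negative:
        print((sample < 0).sum(), (reconstruction < 0).sum())  # 0 vs. ~1,600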

    About the general principles, he writes:

    https://arxiv.org/pdf/0906.3507

    In particular, I use Jaynes’ maximum entropy approach to unify the relations between aggregation and pattern (Jaynes, 2003). Information plays the key role. In each problem, ultimate pattern arises from the particular information preserved in the face of the combined fluctuations in aggregates that decay all non-preserved aspects of pattern toward maximum entropy or maximum randomness.

    Axiomatic theories do something similar.

    The few rules in the axiomatic theory will not succeed in decompressing themselves back into the full reality. Which facts of the full reality they fail to incorporate does not say much about those facts (deemed "chance", "random", ...). Rather, it says something about the compression technique being used, which is the principle that decides which facts will be deemed predictable and which will be deemed mere "chance".
  • I like sushi
    4.8k
    @ucarr Was this correct:

    I am starting to believe that what you are really getting at behind the curtains here is that science and art share common features.I like sushi

    Followed by the possibility of uniting/transcending the differences held by many?

    A simple yes/no would suffice. If it is a bit more than this, then a sketchy - yet straightforward - outline would be all I need.

    Thanks
  • ucarr
    1.5k


    You have not given me any reason to read someone else's thoughts on the matter. Make your philosophical case, ucarr, and I will respond.180 Proof

    ...is there a logically sound argument claiming there is a causal relationship between entropy and incompleteness?ucarr

    No.180 Proof

    The issue I want you to focus upon is this: for any system that does work, the work builds up complexity of detail as the process goes forward. This building up of complexity can be observed in two modes: phenomenal (entropy) and epistemic (logic).

    Gödel and Chaitin have shown in the epistemic mode that the full scope of the evolving complexity cannot be formally tied to the ground from which it emerges. This leads to the conclusion that axiomatic systems are a form of compression of complexity and that the increase of complexity is an irreversible process.

    If the forward direction of a phenomenon incorporates information that cannot be decompressed from its theory, then it will also be impossible to decompress the information needed to reverse it, rendering the phenomenon irreversible.Tarskian

    Here's a critical question: Is it true that the extrapolation from an axiomatic system to complexity irreversible to the axiomatic system cannot be certified, and thus axiomatic systems are both incomplete and uncertain?

    If you think the answer to this question is "no," can you succinctly demonstrate your refutation?
  • ssu
    8.5k
    Let me attempt to clarify: I'm attempting to say I can't enact the negation of what I'm doing. →

    Anything I write will not be something I do not write.
    ucarr
    You understand it perfectly. The basic issue here is negative self-reference. And that issue is similar in Gödel's incompleteness theorem and Turing's result (on the Entscheidungsproblem).

    Please note that what I'm referring to is that this doesn't mean there are sentences that you cannot write; say, that you cannot quote something from "War and Peace". You can naturally quote some text from "War and Peace". There's no limitation on just what you will write next. Yet there are always those sentences you don't write; it's simply not fixed what these sentences are. Naturally, everything that you write also defines all the sentences that you don't write. Hope you get my point.

    This sounds perhaps trivial, but I think some people don't understand what it means when we say that a Turing Machine cannot compute something. They immediately start assuming that some "Oracle Machine" or "Busy Beaver" could overcome the "limitation" of a Turing Machine, and then they make further assumptions about what this would imply, simply assuming that the limitation is somehow overcome.
  • 180 Proof
    15.3k
    ... for any system that does work, the work builds up complexity of detail as the process goes forward. This building up of complexity can be observed in two modes: phenomenal (entropy) and epistemic (logic).ucarr
    Stop. This confuses empiricism with formalism – nonsense (i.e. logic is not "doing work").

    This leads to the conclusion that axiomatic systems are a form of compression of complexity and that the increase of complexity is an irreversible process.
    More nonsense. Formalisms (axiomatic or otherwise) are abstract and therefore do not refer beyond themselves to concrete matters of fact (e.g. entropy); rather, they are used as syntax for methods of precisely measuring / describing the regularities of nature. That such syntax is fundamentally incomplete / undecidable (re: Gödel / Chaitin) says nothing about nature, only about the (apparent) limits of (our) rationality. In other words, that physical laws are computable does not entail that the physical universe is a computer.
  • apokrisis
    7.3k
    The few rules in the axiomatic theory will not succeed in decompressing themselves back into the full reality. Which facts of the full reality they fail to incorporate does not say much about those facts (deemed "chance", "random", ...). Rather, it says something about the compression technique being used, which is the principle that decides which facts will be deemed predictable and which will be deemed mere "chance".Tarskian

    I was targeting a deeper point about the reversibility of mechanics and the irreversibility of nature.

    Mechanics seeks time-reversible descriptions of nature. It seems to succeed, which then makes the thermodynamic arrow of time a fundamental problem. So how to fix that?

    The point I would make is that lossy compression is just a mechanical sieving that involves literally throwing information away. So the claim is that the information did exist; it has merely been discarded, and that is how any irreversibility arises.

    But the other approach says that rather than actuality being discarded, the story is about possibilities getting created. As the past is being fixed as what is now actual, future possibilities explode in number.

    This is what chaos theory gets at. Standard three-body problem stuff. The current state of the system can only give you so much concrete information to make your future forecast. Time symmetry is broken by indeterminacy at its start rather than by information discard at its end.

    Basically your efforts at future prediction execute in polynomial time but your errors at each step accumulate in exponential time.
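
    The standard toy demonstration of this point (my choice of example): iterate the logistic map from two starting values that differ by one part in a trillion and watch the gap grow roughly exponentially.

        # Two logistic-map trajectories, initially 1e-12 apart, diverge
        # exponentially: each extra step of forecast costs more precision.

        def logistic(x: float, r: float = 4.0) -> float:
            return r * x * (1.0 - x)

        x, y = 0.3, 0.3 + 1e-12
        for step in range(1, 51):
            x, y = logistic(x), logistic(y)
            if step % 10 == 0:
                print(step, abs(x - y))
        # The gap climbs from ~1e-12 to order 1 within roughly 40 steps:
        # polynomial effort in, exponentially amplified error out.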

    Aaronson did a nice article – Why Philosophers Should Care About Computational Complexity

    Quanta also – Complexity Theory’s 50-Year Journey to the Limits of Knowledge

    But then after arriving at a proper model of chaos, one can continue on to a larger story of order out of chaos – or the topological order of dissipative structure. The idea of the mechanical sieve and its lossy compression comes back in over that foundational chaos in the form of evolution or a Darwinian selection filter.

    If a system has some kind of memory, this starts to select for possibilities that coordinate. Sand being blown in the Saharan wind can start to accumulate into the larger structure of slowly shifting dunes. A lid gets put on random variety, and so out of smaller-scale chaos, or degrees of freedom, grows larger-scale order, or a context of variety-taming constraints.

    So the holistic picture speaks to irreversible mechanics as something rather like ... exploding quantum wavefunction indeterminacy and constraining quantum thermal decoherence.

    You get the complete causal story by being able to point to the fundamentally random, and even chaotic, scale of being that then got topologically tamed by its own higher scale dynamics. A lossy algorithm is what developed over time due to natural selection. A mechanics is what emerged.
  • Tarskian
    658
    Is it true that the extrapolation from an axiomatic system to complexity irreversible to the axiomatic system cannot be certified, and thus axiomatic systems are both incomplete and uncertain?ucarr

    Uncertainty is a precision problem.

    More precision means more information.

    According to Chaitin's incompleteness, sufficiently high precision will indeed at some point exceed the amount of information that the system can decompress.

    According to the literature on the subject, both incompleteness and imprecision ("uncertainty") can be explained by the principle of lossy compression that results in a particular maximum amount of information that could ever be decompressed out of the system.

    This problem becomes very apparent when trying to reverse a particular physical process. The amount of information required to do that may simply not be available, leading to the process becoming irreversible.

    For example, an explosion. Can it be reversed? The problem is that the process would need to store an inordinate amount of information to even attempt that. Storing this information is a necessary (but not sufficient) condition for the process to be reversible. Since even just the information needed to reverse the process is not remembered anywhere by the process, any attempt at reversing it would simply fail.
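
    A toy way to put numbers on that (my construction, not from the literature): let a process remember only an aggregate of its microstate, then count how many distinct pasts are compatible with the one remembered present.

        from itertools import product

        def forward(microstate: tuple[int, ...]) -> int:
            """The process remembers only an aggregate, not the configuration."""
            return sum(microstate)

        # How many 8-cell microstates (each cell 0..3) share the sum 12?
        preimages = sum(1 for m in product(range(4), repeat=8)
                        if forward(m) == 12)
        print(preimages)  # 8092

        # Reversing the process would need ~log2(8092), about 13 extra bits
        # per such step -- information the process never stored anywhere.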
  • ucarr
    1.5k


    It is simply not possible to decompress and reconstruct the totality of all the information about reality out of an axiomatic system that describes it (if this axiomatic system is capable of arithmetic).Tarskian

    But then again, it also does not mean that the information forgotten in the compression is "accidental" or "random".Tarskian

    Randomness is not a necessary requirement for unpredictability. Incompleteness alone is already sufficient. A completely deterministic system can still be mostly unpredictable.Tarskian

    From your sequence of quotes here, I understand that, just as you say, "Incompleteness alone is already sufficient" [to cause unpredictability].

    Can we generalize to the following claim: our material creation, as we currently understand it, supports the determinism of axiomatic systems, the incompleteness of irreversible complexity, and the uncertainty of evolving dynamical systems; and, moreover, this triad of attributes is fundamental, not conditional?
  • ucarr
    1.5k


    In each problem, ultimate pattern arises from the particular information preserved in the face of the combined fluctuations in aggregates that decay all non-preserved aspects of pattern toward maximum entropy or maximum randomnessTarskian

    Axiomatic theories do something similar.Tarskian

    The few rules in the axiomatic theory will not succeed in decompressing themselves back into the full reality. Which facts of the full reality they fail to incorporate does not say much about those facts (deemed "chance", "random", ...). Rather, it says something about the compression technique being used, which is the principle that decides which facts will be deemed predictable and which will be deemed mere "chance".Tarskian

    As I understand it, an axiomatic system is a compressor. The algorithm that generates the axiomatic system has a focal point that excludes info inconsequential to the outcome the axiomatic system tries to predict.

    Does a lossy axiomatic system also necessarily omit consequential facts because of measurement limitations described by Heisenberg Uncertainty?
  • Tarskian
    658
    As I understand it, an axiomatic system is a compressor.ucarr

    The axiomatic system is indeed a compression (Chaitin), but we just axiomatize it without using any known compression algorithm.

    The axiomatization is actually discovered by human ingenuity without any further justification.

    By proving from it, however, we decompress information out of the axiomatic system. So, it is the proving from it that constitutes the decompression algorithm.

    According to the Curry-Howard correspondence, a proof is indeed a program, and therefore an algorithm. Because every proof is potentially different, the decompression algorithm is actually a collection of algorithms, usually each painstakingly discovered.

    So, the compression algorithm is unknown but some part of the decompression algorithm is discovered each time we successfully prove from the system.
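
    To make the Curry-Howard point concrete, here is a minimal Lean sketch (my example, purely illustrative): the proof of a simple proposition literally is a program, a function that consumes evidence and produces evidence.

        -- A proof term is a program: given evidence for P and a function
        -- turning P-evidence into Q-evidence, run the function on the evidence.
        theorem modus_ponens (P Q : Prop) : P → (P → Q) → Q :=
          fun hp hpq => hpq hp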

    The algorithm that generates the axiomatic system has a focal point that excludes info inconsequential to the outcome the axiomatic system tries to predict.ucarr

    Yes, the compression result excludes information. This excluded information may be inconsequential but it may also lead to a substantial reduction in desired predictive power. It forgets facts in the reality that it describes. This may or may not be a problem for the application at hand.

    The axiomatic system is the result of a compression but we don't know what algorithm led to this result. It is discovered simply by human ingenuity.

    Does a lossy axiomatic system also necessarily omit consequential facts because of measurement limitations described by Heisenberg Uncertainty?ucarr

    Yes. Technically, the resulting imprecision is due to the fundamental properties of wave functions.

    However, the paper mentioned, Calude & Stay, 2004, "From Heisenberg to Gödel via Chaitin," connects uncertainty to Chaitin's incompleteness:

    In fact, the formal uncertainty principle applies to all systems governed by the wave equation, not just quantum waves. This fact supports the conjecture that uncertainty implies algorithmic randomness not only in mathematics, but also in physics.

    They conclude that it is not possible to decompress more precise information out of an axiomatic system than the maximum precision imposed by the fundamental properties of wave functions.
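
    That precision floor can be checked numerically (a rough sketch of the standard Fourier fact behind it; the grid and widths are my choices): narrowing a Gaussian packet in x widens its spectrum in k, and the product of the two spreads never drops below about 1/2.

        import numpy as np

        N = 4096
        x = np.linspace(-40.0, 40.0, N)
        dx_grid = x[1] - x[0]
        k = 2 * np.pi * np.fft.fftfreq(N, d=dx_grid)

        for width in (0.5, 1.0, 2.0):
            psi = np.exp(-x**2 / (4 * width**2))   # Gaussian wave packet
            prob_x = np.abs(psi)**2
            prob_x /= prob_x.sum()
            phi = np.fft.fft(psi)
            prob_k = np.abs(phi)**2
            prob_k /= prob_k.sum()
            dx = np.sqrt(np.sum(prob_x * x**2))    # spread in position
            dk = np.sqrt(np.sum(prob_k * k**2))    # spread in wavenumber
            print(width, dx * dk)                  # ~0.5 every time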
  • ucarr
    1.5k


    Was this correct:

    I am starting to believe that what you are really getting at behind the curtains here is that science and art share common features.
    I like sushi

    Followed by the possibility of uniting/transcending the differences held by many?I like sushi

    A simple yes/no would suffice. If it is a bit more than this, then a sketchy - yet straightforward - outline would be all I need.I like sushi

    We don't live within a universe; instead, we live within a vital approach to a universe strategically forestalled by entropy_uncertainty_incompleteness. Science and Humanities are the two great modes of experiencing the uncontainable vitality.ucarr

    This is how I talk about science and humanities in broad generality. The link below will take you to the post for additional context.

    ucarr post

    That they overlap in ways complex and nuanced, I acknowledge. Their common ground has not been my focus in this conversation. What I haven't seen (I'm not implying such literature doesn't exist) is a general description of how they differ. Through both lenses, similarity and difference, the view of the comparison is complex and nuanced.

    I feel a measure of satisfaction with my "What" vs. "How" binary. Again, this binary entails a complex and nuanced interweave of both "What" and "How." A loose translation into English might be: What is meets What it's like to experience what is.

    This language points toward The Hard Problem. Looking objectively at subjectivity is hard to do. Is consciousness purely subjective? "Not exactly," says science when it attempts to detach the observer from the observed. QM tells us there is no purity of observational detachment.

    QM entanglement tells us something about consciousness: it interweaves the objective and subjective. Does this dovetail with the holism you see?
  • Tarskian
    658
    Can we generalize to the following claim: our material creation, as we currently understand it, supports the determinism of axiomatic systems, the incompleteness of irreversible complexity, and the uncertainty of evolving dynamical systems; and, moreover, this triad of attributes is fundamental, not conditional?ucarr

    Yes, informational incompleteness (Chaitin) and uncertainty (Heisenberg) are deemed directly related (Calude & Stay, 2004).

    The irreversibility of particular physical processes (entropy) is also deemed directly related (Schlesinger 2014) to informational incompleteness (Chaitin).

    Furthermore, arithmetical incompleteness (Gödel) is provable (Zisselman 2023) from informational incompleteness (Chaitin).

    However, only some part of all the above is effectively provable.
  • ucarr
    1.5k


    ... for any system that does work, the work builds up complexity of detail as the process goes forward. This building up of complexity can be observed in two modes: phenomenal (entropy) and epistemic (logic).ucarr

    ...logic is not "doing work"180 Proof

    What is symbolic logic without the reader? It's marks on paper. No work. Really, it's non-existent without the writer. Logic that has meaning and works always assumes the interaction between a human and the marks on the paper: Aristotle's intelligent agent meets intelligibility. Work.

    This leads to the conclusion that axiomatic systems are a form of compression of complexity and that the increase of complexity is an irreversible process.

    More nonsense. Formalisms (axiomatic or otherwise) are abstract and therefore do not refer beyond themselves to concrete matters of fact (e.g. entropy); rather, they are used as syntax for methods of precisely measuring / describing the regularities of nature.180 Proof

    "Formalisms that do not refer beyond themselves to concrete matters of fact (e.g. entropy)..." no work.

    However, as above: Formalisms that have meaning and work always assume the interaction between a human and the marks on the paper: Aristotle's intelligent agent meets intelligibility. Work.

    Formalisms are not abstract because, in their description of nature, they express the state of being (in this case: thinking) of human individuals who are, indeed, a part of nature, and therefore, human expression is nature expressing herself. There is no discrete bifurcation separating "abstract" human thought from nature.

    "... rather they are used as syntax for methods of precisely measuring / describing the regularities of nature."

    Why bother with measurement and description if there's no existential connection between abstract thought and nature? If formalisms are hermetically sealed off from nature, then, willy-nilly you can assign whatever meaning you like to whatever marks on paper you make, for all the value that has.

    Both life and science are interesting. This is because, within the realm of human thinking -- just another natural thing -- it's possible to be either right or wrong. Right and wrong draw their force and value from the interweave of nature as thinking and nature as object of thinking, i.e., nature looking at herself.

    The test of syntax comes down to: can you speak the words trippingly from the tongue? The test of science comes down to: can you observe the predictions in nature? These tests bespeak the interweave between natural thinking and nature thinking about herself.

    We will show that algorithmic randomness is equivalent to a “formal uncertainty principle” which implies Chaitin’s information-theoretic incompleteness. We also show that the derived uncertainty relation, for many computers, is physical. In fact, the formal uncertainty principle applies to all systems governed by the wave equation, not just quantum waves. This fact supports the conjecture that uncertainty implies algorithmic randomness not only in mathematics, but also in physics.Tarskian quoting Calude and Stay

    Calude & Stay, 2004, "From Heisenberg to Gödel via Chaitin."

    The above quoted theories are interesting because they could either be right or wrong. There's something at stake. That wouldn't be the case if human thinking weren't existentially connected to the natural world surrounding it.
  • ucarr
    1.5k


    Does a lossy axiomatic system also necessarily omit consequential facts because of measurement limitations described by Heisenberg Uncertainty?ucarr

    Yes. Technically, the resulting imprecision is due to the fundamental properties of wave functions.Tarskian

    Do you have any interest in the Bekenstein bound, from the Holographic Principle (Gerard 't Hooft)? It describes a limit to the amount of information that can be stored within a region of spacetime at the Planck scale. Among other things, this limit establishes the physical nature of information. There's a formula for computing the Bekenstein bound: it's a fraction of the area of the event horizon of a black hole.
  • Tarskian
    658
    Do you have any interest in the Bekenstein bound, from the Holographic Principle (Gerard 't Hooft)? It describes a limit to the amount of information that can be stored within a region of spacetime at the Planck scale. Among other things, this limit establishes the physical nature of information. There's a formula for computing the Bekenstein bound: it's a fraction of the area of the event horizon of a black hole.ucarr

    Interesting.

    https://en.wikipedia.org/wiki/Bekenstein_bound

    In physics, the Bekenstein bound (named after Jacob Bekenstein) is an upper limit on the thermodynamic entropy S, or Shannon entropy H, that can be contained within a given finite region of space which has a finite amount of energy—or conversely, the maximum amount of information required to perfectly describe a given physical system down to the quantum level.
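
    For a feel of the magnitude: the bound has the closed form S ≤ 2πkER/(ħc). A quick back-of-the-envelope computation in Python (the 1 kg mass and 10 cm radius are my arbitrary inputs):

        import math

        hbar = 1.054_571_817e-34   # J*s
        c = 2.997_924_58e8         # m/s
        E = 1.0 * c**2             # rest energy of 1 kg, in joules
        R = 0.1                    # radius of the region, in metres

        bits = 2 * math.pi * E * R / (hbar * c * math.log(2))
        print(f"{bits:.2e}")       # ~2.6e42 bits: the ceiling on information
                                   # that region can hold.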
  • 180 Proof
    15.3k
    Logic that has meaning and works always assumes the interaction between a human and the marks on the paper:ucarr
    So what? In the context of my replies to your last few posts, that's another non sequitur.

    :roll: Bad physics breeds bad philosophy.
  • ucarr
    1.5k


    Formalisms (axiomatic or otherwise) are abstract and therefore do not refer beyond themselves to concrete matters of fact (e.g. entropy)...180 Proof

    ...rather they are...measuring / describing the regularities of nature.180 Proof

    Why are these two statements not a contradiction?

    Why are "regularities of nature" not concrete matters of fact?

    How are "matters of fact" concrete but not empirical?

    If self-descriptions ("formalisms...do not refer beyond themselves") have nothing to do with the world (nature), instead being interested only in themselves, how are they meaningful and useful?
  • ucarr
    1.5k


    Uncertainty is a precision problem.

    More precision means more information.

    According to Chaitin's incompleteness, sufficiently high precision will indeed at some point exceed the amount of information that the system can decompress.

    According to the literature on the subject, both incompleteness and imprecision ("uncertainty") can be explained by the principle of lossy compression that results in a particular maximum amount of information that could ever be decompressed out of the system.
    Tarskian

    Does a lossy axiomatic system also necessarily omit consequential facts because of measurement limitations described by Heisenberg Uncertainty?ucarr

    Yes. Technically, the resulting imprecision is due to the fundamental properties of wave functions.

    However, the paper mentioned, Calude & Stay, 2004, "From Heisenberg to Gödel via Chaitin," connects uncertainty to Chaitin's incompleteness:

    In fact, the formal uncertainty principle applies to all systems governed by the wave equation, not just quantum waves. This fact supports the conjecture that uncertainty implies algorithmic randomness not only in mathematics, but also in physics.

    They conclude that it is not possible to decompress more precise information out of an axiomatic system than the maximum precision imposed by the fundamental properties of wave functions.
    Tarskian

    precision | prēˈsiZH(ə)n |
    noun
    technical refinement in a measurement, calculation, or specification, especially as represented by the number of digits given: a precision of six decimal figures
    The Apple Dictionary

    When we look at the triad of entropy_uncertainty_incompleteness through the lens of imprecision, which is about exactness, we see that the informational dimension of nature is not fully containable within human observation, whether of the scientific type, or of the humanities type.

    Does this tell us something about the incompleteness of nature, or does it tell us something about the incompleteness of human cognition?

    Given the limits of measurement and decompression, does 180 Proof have a cogent point?

    ...that physical laws are computable does not entail that the physical universe is a computer.180 Proof

    180 Proof

    Does this argument cast doubt on whether we can know reality beyond its human translation?

    Are the disciplines of epistemology and ontology merely products of human translations?

    Is Platonic Realism correct: humans dwell within a (cognitive) dark cave, sealed off from direct and complete experience of reality? Plato, however, thought he saw a way out of shadowy perception by means of reasoning beyond appearances.

    Can we hope to eventually reason beyond the current state-of-the-art observations limited by imprecision of measurement and incompleteness of decompression? Or is it the case that the limited measurements of the wave function and the limited decompression of axiomatic systems reflect existential limitations embedded in nature?

    Now perhaps we come to a crux of the faceoff between the sciences and the humanities. If the observer is always entangled with the observed, does that mean the two great modalities of discovery: the what and the what it’s like of the what are linked by the biconditional operator?

    The biconditional linking sciences and humanities writ large is the biconditional linking nature and sentience.

    Option 1 – If humans can see nature beyond measurement and decompression limitations, then sentience is inevitable because its seeds are embedded existentially.

    Option 2 – If sentience and nature are creatively and strategically incomplete, without biconditional linkage, then existential limitations of knowing and being are always in effect. There's a gap separating the two; however, the knowing of being, and the being of knowing of being, make a close approach to each other. This close approach, always incomplete, keeps the game of sentience going creatively because the two infinities, although incommensurable, are entangled in an evolving, inexhaustible complexity.
Get involved in philosophical discussions about knowledge, truth, language, consciousness, science, politics, religion, logic and mathematics, art, history, and lots more. No ads, no clutter, and very little agreement — just fascinating conversations.