• wonderer1
    1.8k
    But the notion of natural selection suggests some kind of universal teleological agency... — Gnomon

    Only if one elects to remain ignorant as to what biologists mean by natural selection.
  • Gnomon
    3.6k
    In a separate post, I'll add some comments of my own. — Gnomon

    Personal comments on topics of the OP :

    1. Fitness : Darwin's biological criterion for Fitness was limited to living creatures. But this Information-based “fitness” includes all elements of the physical world. And could be extended to cover the meta-physical (Life & Mind) aspects of the world. For living organisms, fitness is Health (literally Wholeness). A broader definition of Fitness is Wholeness or Integrity or Functional Organization.

    2. Selection : To select or choose functional outcomes is contrasted with random or accidental change. A selection is motivated by a future-directed input of Causation.
    To select =
    a> carefully choose as being the best or most suitable.
    b> (of a group of people or things) carefully chosen from a larger number as being the best or most valuable.

    3. Function : A process directed toward some defined goal or purpose. A functional relation is a meaningful connection between information inputs and outputs. What is the function of a non-living or non-thinking thing? This could only apply to evolution if the process is directional, not random.
    Examples : Mind is the function of Brain. Organization & Complexity are functions of Evolution.

    4. Relation : A functional interconnection or bond or alliance with other entities. A complex organism is bonded or merged into an interrelated system by mutual purpose : correlation of direction toward a final state. Single elements are inert, and have no purpose, only action & reaction. Organisms share energy inputs to redirect reactions toward fitness of the system. A whole or integrated or interrelated System has multiple parts that work together toward some goal, beginning with continuation of the system over time.

    5. Information : To Inform is to provide an essential or formative principle or quality to something. ___Oxford Languages

    Formative Principle :
    a> of or relating to formation, development, or growth
    b> the active, determining principle or law defining a thing.


    Qualia : instances of subjective, conscious experience.
  • Gnomon
    3.6k
    But the notion of natural selection suggests some kind of universal teleological agency... — Gnomon
    Only if one elects to remain ignorant as to what biologists mean by natural selection.
    wonderer1
    I'm not a Biologist. And not "ignorant" of the official biological application of "Selection". For philosophical purposes, I'm not bound to that physically focused meaning. See my post above for an alternative philosophical/metaphysical definition of cosmic selection that is not limited to living creatures, as mentioned in the OP articles. :smile:

    Quote from OP :
    # "The team's notion of fitness beyond biology is "really subtle, complex and wonderful," Stuart Kauffman adds".
  • wonderer1
    1.8k
    For philosophical purposes, I'm not bound to that physically focused meaning. — Gnomon

    Then to be clear you ought to use distinct terminology. (I suggest "gnatural selection".) You wouldn't want anyone to get the impression that you are talking about the same thing as scientifically informed people are talking about.
  • Wayfarer
    21k
    I'm afraid it all looks like physics envy, allied to loose use of metaphor. — unenlightened

    Speaking of metaphor.....

    Only if one elects to remain ignorant as to what biologists mean by natural selection. — wonderer1

    Interesting that one of the first mentions of the term in the linked article encloses it in quotations:

    The likelihood of these traits being 'selected' and passed down are determined by many factors.

    In the Origin of Species, Darwin wrote:

    It may be metaphorically said that Natural Selection is daily and hourly scrutinizing, throughout the world, the slightest variations; rejecting those that are bad, and adding up all that are good; silently and insensibly working whenever and wherever opportunity offers, at the improvement of each organic being. (1876 ed., 68-69)

    (Emphasis added). It's a metaphor, yet at the same time central to the theory. I think this lives on in the popular mind where we speak of the 'wonders' of evolution, as if evolution itself were an agent, when in reality, the only agents in the frame are organisms themselves.
  • wonderer1
    1.8k
    It's a metaphor, yet at the same time central to the theory. I think this lives on in the popular mind where we speak of the 'wonders' of evolution, as if evolution itself were an agent, when in reality, the only agents in the frame are organisms themselves. — Wayfarer

    As the Wikipedia I linked says: [My emphasis.]

    Even if the reproductive advantage is very slight, over many generations any advantageous heritable trait becomes dominant in the population. In this way the natural environment of an organism "selects for" traits that confer a reproductive advantage, causing evolutionary change, as Darwin described.[58] This gives the appearance of purpose, but in natural selection there is no intentional choice.[a] Artificial selection is purposive where natural selection is not, though biologists often use teleological language to describe it.

    It is mostly a matter of teleological language being more expedient for conveying things evolutionary. It is much more linguistically cumbersome to discuss natural selection ateleologically. Teleology isn't central to the theory, as the scare quotes around "selects for" indicate.
  • Gnomon
    3.6k
    Then to be clear you ought to use distinct terminology. (I suggest "gnatural selection".) You wouldn't want anyone to get the impression that you are talking about the same thing as scientifically informed people are talking about. — wonderer1
    I assume it was Gnomon who you were categorizing as "ignorant" of Science. And the imputation of Agency as non-scientific. Please note that Darwin's Artificial Selection required intentional agents (humans) to make the teleological (I want more of this good stuff) choices that resulted in today's artificial soft sweet corn, instead of the natural hard-kernel starchy maize.

    Also note that the authors of the article this thread is based on are professional scientists*1, so I must assume that they are "scientifically informed people". But they didn't bother to redefine the word "selection"*2 ; they only broadened its application from the Biology-of-living-things to everything else in the natural world, from Cosmology to Mineralogy. Maybe even Psychology was a product of natural selection. Or do you think Mind Functions were a cosmic accident?

    However, since you feel the need for a new name for the cosmic process, by which Nature evolves novel Functions from older Forms, how about "Universal Selection", or "General Selection", or "Post-Darwinian Natural Selection"*3, as opposed to the "Special Selection" of Darwin's biological application? Are those Universal Laws too philosophical for you?*4 Are "functions" (how things work) too immaterial for your materialistically "informed" taste? :smile:

    PS___The article did not imply an unconventional meaning of the verb "to select". They merely noted that the object of selection was not mere matter, but operational Functions of the various forms of matter.

    FUNCTION Meaning: "one's proper work or purpose; power of acting in a specific proper way," https://www.etymonline.com/word/function

    Artificial selection is an evolutionary process in which humans consciously select for or against particular features in organisms – for example, by choosing which individuals to save seeds from or breed from one generation to the next.
    https://evolution.berkeley.edu/lines-of-evidence/artificial-selection/

    What is agency in biology?
    Agency is defined by Webster's dictionary as “the capacity to act or exert power”, and in robotics and AI research a system that can act in any way in response to environmental stimuli is sometimes considered agential. But in biology, typically something more is demanded.
    https://www.templeton.org/discoveries/agency-in-biology



    Quotes from OP :

    *1. # The research team behind the law, which included philosophers, astrobiologists, a theoretical physicist, a mineralogist and a data scientist, have called it "the law of increasing functional information."

    *2. # The law also says these configurations are selected based on function, and only a few survive.

    *3. # And, some say evolution is strictly about Darwinian natural selection and common descent, Hazen says. But, "I'm talking about diversification and patterning through time" from one stage to the next,

    *4. # "You have a universe that keeps mixing things up and then trying out new possibilities," Hazen says, adding that it encompasses biological evolution, too. Things that work are selected for, he adds. "That works on nonliving worlds, and it works on living worlds. It's just a natural process that seems to be universal."

    MODERN SWEET CORN artificially evolved to suit human taste
  • unenlightened
    8.8k
    I think the term 'natural selection' was 'selected' in the first instance to provide a believable alternative to "Chosen by god" or "Designed by God" to indicate that natural processes might function over time to produce results that appear 'intelligently designed'. Ironic that it now offends the religion of science.

    You have a universe that keeps mixing things up and then trying out new possibilities," Hazen says, adding that it encompasses biological evolution, too. Things that work are selected for, he adds. "That works on nonliving worlds, and it works on living worlds. It's just a natural process that seems to be universal." — op

    What I fail to understand at bottom is how this new principle or law or whatever it is, is something other than the law of entropy. Energy dissipates, disorder/information increases. This allows that life, or a hurricane, can produce temporary order that functions to increase total entropy.

    The confusion I think arises for many through a misunderstanding of information and its relation to complexity. Maximal order is minimal total information. Everything starts simple and gets more complex and order is simplicity. "functional information" ( as opposed to "total information") is just that temporary ordering of the entropic flow that spontaneously arises by chance, because it 'eases' the flow.

    Thought experiment.
    Imagine a flask of gas, an isolated system: an equal mix of CO2 and O2 separated by an impermeable but insubstantial barrier, the whole at standard temperature and pressure. The total information of the system includes the position, velocity and identity of each molecule, but the identity information is highly ordered and compressible to "all the molecules on this side are oxygen, and all the molecules on that side are CO2". The magic barrier functions to maintain this order, so in effect there are two isolated systems at energy equilibrium.
    Now remove the magic barrier without disturbing the gases. They will start to diffuse into each other by the random movement of the molecules, until they are completely randomly positioned. This will be the new equilibrium of the now single system, and the total information required will have increased, because each molecule will have to be identified individually. Total information increases as disorder increases; the information is trivial and meaningless; for human purposes, "the gases are mixed" is all we care about.

    Edit: Functional information, which is information we care about (aka a difference that makes a difference {to someone}) is information about order, which is to say about disequilibrium and therefore exploitable energy. The details of a state of equilibrium are unexploitable and therefore useless.
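The flask can be sketched numerically: treat each molecule's identity as one symbol and compare how well the separated and mixed identity-sequences compress. A minimal illustration using Python's zlib (the molecule count and the symbols are arbitrary choices for illustration, not part of the thought experiment itself):

```python
import random
import zlib

N = 5000  # molecules of each gas; an arbitrary illustrative count

# Separated state: all O2 on one side of the barrier, all CO2 on the other.
# The identity sequence is maximally ordered.
separated = b"O" * N + b"C" * N

# Mixed state: the same molecules after diffusion, randomly interleaved.
mixed_list = list(separated)
random.shuffle(mixed_list)
mixed = bytes(mixed_list)

# The ordered sequence compresses to a handful of bytes ("N of each,
# split down the middle"); the shuffled one needs roughly one bit per molecule.
print(len(zlib.compress(separated)))
print(len(zlib.compress(mixed)))
```

The compressed size of the mixed sequence comes out far larger, mirroring the claim that total information increases as the identities become disordered.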
  • Hanover
    12.2k
    What increases with complexity is the amount of knowledge we have, because we can learn from observable changes as opposed to having to rely upon theoretical extrapolations. Information, though, does not increase with complexity. All the information within the system was there during the state of equilibrium. The chaos that resulted from the interaction did not increase the information. It simply revealed to us knowledge the system already contained prior to the interaction.

    An omniscient being would gain no information from the removal of the barrier because he would know from the layout of the molecules exactly what would occur once the barrier is removed. The information contained in the divided state would therefore be no different from the mixed state because the expected result of the mixture would inform from the divided state.

    We learn from dividing an atom the explosion that would follow, but we can also be said to have known something of the result prior to the division. This would seem to be the crux of the intelligent designer's position, which is that impregnated into every simple system is that complexity will emerge, leading some to the conclusion that the result of the interaction was knowable, predictable, and therefore (and this is the questionable part) planned.

    Even assuming indeterminism, I think you're still left with the idea that prior to a chaotic state you have the same complexity as a controlled state, simply because we don't challenge that State A (equilibrium) caused State B (chaos), even in an indeterminate way.

    I just see State A as describing a predictable pattern of variable interaction and State B as not, but both have just as much information.
  • unenlightened
    8.8k
    An omniscient being would gain no information — Hanover

    Oxymoronic warning. Please don't make stuff up about entropy (or God) unless you really do understand it.

    Knowing Euclid's definitions and axioms does not entail knowing Pythagoras' theorem even though it 'follows' from them. The information of the theorem has to be 'unfolded' from the axioms by a particular series of steps that are not specified by the axioms themselves. Similarly, the unfolding of physical processes in time produces new information even if that information is predetermined. If you like, existence is the unfolding of God's omniscience.
  • Gnomon
    3.6k
    What I fail to understand at bottom is how this new principle or law or whatever it is, is something other than the law of entropy. Energy dissipates, disorder/information increases. This allows that life, or a hurricane, can produce temporary order that functions to increase total entropy. — unenlightened
    The OP articles didn't mention Entropy specifically, but you may have a good point : to include "energy dissipation" as a necessary investment in evolutionary progress. That's how the "new law" works to transform an older adequate configuration into a novel & durable functional form*1. The "temporary disorder" is the price Nature pays for a step-up in functional value. For example, the disorderly Plasma of the Big Bang was essentially formless, and good for nothing but raw material for conversion into stable particles of elementary matter.

    Energy per se causes Change (novelty), but Change/Novelty per se is not informative unless it produces a persistent function or meaning. Dissipation is not a good thing unless it leaves behind a stable form of organization, which is the payoff for the expenditure of Energy. As Entropy of a system temporarily increases, local organization may permanently increase, but only if the novel form fills a functional need for the environment of the system. Otherwise, the energy would be wasted. :smile:

    # “The Universe generates novel combinations of atoms, molecules, cells, etc. Those combinations that are stable and can go on to engender even more novelty will continue to evolve. This is what makes life the most striking example of evolution, but evolution is everywhere.”

    *1. Entropy (information theory) :
    In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes.
    https://en.wikipedia.org/wiki/Entropy_(information_theory)
    Note --- what's "surprising" about the transformation of a prior orderly state to a later organized state is the novelty that we define as meaningful or transformational Information : the difference that makes a familiar thing into an unfamiliar thing, or forces the observer to see an old thing from a new perspective.


    Edit: Functional information, which is information we care about (aka a difference that makes a difference {to someone}) is information about order, which is to say about disequilibrium and therefore exploitable energy. The details of a state of equilibrium are unexploitable and therefore useless. — unenlightened
    Yes! "Functional Information" may make a difference to the material Universe, in terms of advancing the physical evolution of the Cosmic System. But, in order to be meaningful (something we "care about"), the change must have some positive effect on a human mind. We upright apes have learned to "exploit" the available Energy of our local physical system to serve our own physical fortunes (e.g. technology), and in hopes of advancing our metaphysical interests.

    For living creatures, "equilibrium" is not a good thing. Like sharks, who are said to "swim or die", we humans must evolve or die-out. As a species, we survive by learning better ways to exploit the resources of the world, and to avoid repeating past mistakes, such as incomplete carbon exploitation that leaves behind toxic substances. Human technology has accelerated the progress & stability of evolution, but also the digress & instability --- hence we progress or die. In terms of "things we care about" it's metaphysical evolution, the subject matter of Philosophy. :nerd:

    Quote from OP :
    # the new ‘law of increasing functional information’ states that complex natural systems evolve to states of greater patterning, diversity, and complexity
  • Hanover
    12.2k
    Knowing Euclid's definitions and axioms does not entail knowing Pythagoras' theorem even though it 'follows' from them. The information of the theorem has to be 'unfolded' from the axioms by a particular series of steps that are not specified by the axioms themselves. Similarly, the unfolding of physical processes in time produces new information even if that information is predetermined. If you like, existence is the unfolding of God's omniscience. — unenlightened

    Interesting. Would Mary learn something new when she saw red if Mary were God?

    I'll have to think about that one. Omniscience entails knowledge of everything and if certain knowledge is only knowable through experience, then we'd have to say that omniscience entails omni-experience, meaning you'd have had to experience everything to know everything, but it seems a limitation on God to require he do something to know something.

    As to the logical implications entailed by certain axioms, I do think they'd be immediately known to an omniscient being. I also don't see that as an example of an unfolding because it's just the drawing out of logical deductions, not the revelation through empirical means.

    Everything starts simple and gets more complex and order is simplicity. — unenlightened

    The crux of my disagreement is that you make order synonymous with simplicity and complexity synonymous with chaos. Under this view, the primordial pre-bang mass would be the most perfect example of order and what followed the big bang would be that of increasing chaos. My position is that order is not simple, but within it all possibilities are contained.
  • Wayfarer
    21k
    Energy dissipates, disorder/information increases — unenlightened

    Maximal order is minimal total information — unenlightened


    How does that follow? Information is ordered, isn't it?
  • unenlightened
    8.8k
    Yes, It seems like a necessary truth that omniscient beings cannot learn.

    The crux of my disagreement is that you make order synonymous with simplicity — Hanover

    It's not my theory. It's Shannon's.

    Information is order — Wayfarer

    I understand this claim; it seems intuitively true, but it isn't: the opposite is true, and the thought experiment above was intended to convince you. I'll have another go.

    Consider a computer image, let's say 100 by 100 pixels, black and white, so 10,000 bits. There are a lot of different ways this can be ordered; I will consider one simple one, where the bottom half is a repeat of the top half. It is surely immediately clear that the information content is halved?

    And any order involves repetition, with or without reflection, inversion, negation, etc., and any repetition is a reduction in the information. This is the principle that allows data compression. The information that we are interested in is almost always ordered and structured, and to the extent it is ordered, it can be compressed. The result of compression is a smaller, less ordered amount of information that decompresses to the original file (we hope, barring copy errors).
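The halving claim is easy to check with a general-purpose compressor. A minimal sketch in Python, packing the hypothetical 100 by 100 one-bit image into 1,250 bytes (the use of zlib and of random pixel data are my assumptions for illustration):

```python
import os
import zlib

# A 100 x 100 black-and-white image is 10,000 bits = 1,250 bytes.
top_half = os.urandom(625)           # 5,000 random bits
fully_random = os.urandom(1250)      # no repetition anywhere
half_repeated = top_half + top_half  # bottom half repeats the top half

# The compressor finds the repeat, so the duplicated image costs
# roughly half the bits of the fully random one.
print(len(zlib.compress(fully_random)))
print(len(zlib.compress(half_repeated)))
```

The fully random image does not compress at all (it comes out slightly larger than 1,250 bytes), while the half-repeated one shrinks to a little over 625 bytes.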
  • Wayfarer
    21k
    Consider a computer image, let's say 100 by 100 pixels, black and white, so 10,000 bits. There are a lot of different ways this can be ordered; I will consider one simple one, where the bottom half is a repeat of the top half. It is surely immediately clear that the information content is halved? — unenlightened

    If the same image is repeated twice, then I suppose the same information is presented twice. But the correct comparison is between any image and a random array of pixels with no image. Isn't it the case that the latter contains, and therefore conveys, no information?
  • unenlightened
    8.8k
    Isn't it the case that the latter contains, and therefore conveys, no information? — Wayfarer

    No it isn't the case. What is the case is that you are not interested in the information, but in the patterns and orders. But if you happened to be a computer, you would read that random pattern like a QR code.
  • Wayfarer
    21k
    What is the case is that you are not interested in the information, but in the patterns and orders. But if you happened to be a computer, you would read that random pattern like a QR code. — unenlightened

    Can't see it. If I zero out a hard drive, then it's physically the same as a hard drive with a thousand gigabytes of information. But there is no order. A 'random pattern' is oxymoronic, as patterns are by definition not random, but ordered. A pattern is:

    1. a repeated design; 'decorate with a repeated design'.
    "he was sitting on a soft carpet patterned in rich colours"

    2. give a regular or intelligible form to.
    "the brain not only receives information, but interprets and patterns it"
  • unenlightened
    8.8k
    Can't see it. If I zero out a hard drive, then it's physically the same as a hard drive with a thousand gigabytes of information. But there is no order. — Wayfarer

    No, there is perfect order, like a blank piece of paper. Perfect order, and no information. The singularity before the bang. Information is written onto a piece of paper or onto a hard drive, bringing disorder to perfect symmetry. I sense your shock and dismay, but it is just a change of view.
  • unenlightened
    8.8k
    A 'random pattern' is oxymoronic, as patterns are by definition not random, but ordered. — Wayfarer

    You're right, but wrong about this. The order is pre-specified as "a square of pixels 100 by 100, each either black or white". So any combination whatsoever that you see as 'random' can be read as a unique meaningful image by a computer - a language of 2 raised to the power of 10,000 words. That's a huge number.
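The size of that 'language' is easy to confirm: with 10,000 independently black-or-white pixels there are 2^10,000 distinct images, a number of just over three thousand decimal digits.

```python
# Each of the 10,000 pixels in a 100 x 100 black-and-white grid
# is independently one of two values, so there are 2**10000 images.
n_images = 2 ** 10_000
print(len(str(n_images)))  # 3011 decimal digits
```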
  • Janus
    15.7k
    Yes, you are right, and @Wayfarer is wrong here: it has to do with how much information would be needed to specify the positions of all the particles in a completely random arrangement of particles as opposed to a strictly geometric configuration.
  • unenlightened
    8.8k
    Thanks. I'm sympathetic to @Wayfarer's difficulty though, it has taken me years to really reconcile Shannon information theory with the entropy of physics in my own mind, and I have yet to come across a really clear and concise exposition.
  • Wayfarer
    21k
    So any combination whatsoever that you see as 'random' can be read as a unique meaningful image by a computer — unenlightened

    Nothing is meaningful to a computer, though. You could program the computer to register any combination of pixels as a representation, but that would require an intentional act on the part of the programmer.
  • Janus
    15.7k
    Right it is a difficult thing to grasp and I certainly cannot claim anything more than a very basic understanding of entropy.

    It might help to make it clearer if you substitute 'arrangement' or 'configuration' for 'pattern'. The amount of information required to specify the positions of and relations between an ordered arrangement is obviously less than the information required to specify a random arrangement.

    In cosmology the idea is that the microwave background state, which is believed to have been almost entirely uniform, can be described much more simply than the subsequent states of galaxy and star formation.

    So, the information has obviously increased in the latter case due to the initially minor variations becoming magnified over time. It is only energy that allows the formation of "islands" of relative order or negentropy, such as galaxies, stars, solar systems, planets and, of course, organisms. The theory is that disorder will increase as energy is dissipated and the entire universe has pulled apart and reached thermal equilibrium, a state in which particles will be scattered in a disorderly way everywhere.
  • Wayfarer
    21k
    it has taken me years to really reconcile Shannon information theory with the entropy of physics in my own mind, and I have yet to come across a really clear and concise exposition. — unenlightened

    You know the famous anecdote of how the relationship between entropy and information theory was drawn, right? John von Neumann was a professional colleague of Shannon.

    What’s in a name? In the case of Shannon’s measure the naming was not accidental. In 1961 one of us (Tribus) asked Shannon what he had thought about when he had finally confirmed his famous measure. Shannon replied: ‘My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage.”

    From here.

    Unfortunately knowledge of that anecdote tends to undermine the advantage!

    In any case, Shannon’s famous information theory is specific to a context, namely, the transmission of information via a medium. It was his work, as you said, which made data compression possible. But I’m failing to see the relevance that has to the general relationship between order and entropy.

    As I understand it, entropy is a measure of the degree of disorder or randomness in a system. An increase in entropy typically corresponds to a decrease in order: as a system becomes more disordered, its entropy increases. Conversely, when a system is more ordered, its entropy is lower. This concept is a cornerstone of the second law of thermodynamics, which states that the total entropy of an isolated system can never decrease over time.

    So far so good?

    In information theory, the information content of a message is related to its unpredictability or its entropy (per Shannon). A completely random set of symbols contains a lot of potential information but it does not contain meaningful structure or order.

    For example, if rocks are arranged to spell out a message, the specific arrangement reduces the entropy of the system in terms of its information content because it is now a highly ordered state compared to a random pile. The ordered arrangement conveys meaning (information) to someone who can interpret the pattern of rocks as letters and understands the language. So in this context, the order is directly related to the ability to convey information.

    In summary, while high entropy (disorder) suggests high information content in terms of unpredictability, meaningful information is actually conveyed through ordered and structured arrangements that have lower entropy compared to a random state. So, greater information content corresponds with lower entropy.
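The per-symbol measure behind these claims is Shannon's H = -Σ p_i log2(p_i). A minimal sketch (the example strings are my own illustrations, not from the thread):

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average bits per symbol: H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A repetitive (ordered) message vs. one spread evenly over ten symbols.
low = shannon_entropy("aaaaaaaabb")   # p = {0.8, 0.2} -> about 0.72 bits/symbol
high = shannon_entropy("abcdefghij")  # uniform over 10 -> log2(10), about 3.32

print(round(low, 2), round(high, 2))
```

The uniform, unpredictable message carries more Shannon information per symbol, while the ordered one is the kind that compresses well; this is the sense in which the two uses of 'information' pull in opposite directions.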
  • Wayfarer
    21k
    The amount of information required to specify the positions of and relations between an ordered arrangement is obviously less than the information required to specify a random arrangement. — Janus

    Again, this is against the background of information transmission and error correction.

    Aha, now I’m getting it:

    Maximal order is minimal total information — unenlightened

    That would be minimum total information required to encode and transmit that information.

    Like, if you wanted to encode and transmit white noise, you couldn’t do it, because it would be computationally irreducible, as there is no pattern or repetition to capture. You would need to send a 1:1 reproduction of that exact white noise. But, given written information, then you can reduce a very large body of text, because you’re able to assume grammar, vocabulary and syntax which can be used to represent that text. That’s what makes it compressible. That’s what Shannon’s law enabled and why it was fundamental to what came to be known as data compression (which we’re all using every second on these devices).
  • Janus
    15.7k
    Yes, there is a kind of paradox there, because we find that ordered phenomena have much more significant information to impart to us than random phenomena. (I believe @unenlightened already made this point). It is more meaningful to us because we are part of the ordered phenomena. Total randomness or noise is useless to us. And it is only highly ordered phenomena like the higher animals who can "decode" complex information. I find this relationship between entropy, negentropy and information is the most fascinating area to investigate. Not that I've gotten far yet...
  • Wayfarer
    21k
    Right. And a major part of the point of this paper is that these increases in order and information density are the subject of a natural law not implied by the previously-known laws of physics. I mean, I still don’t know if they’re right about that, but it seems a significant theory to me.
  • Janus
    15.7k
    I'll have to try to find the time to read the paper...but I agree with unenlightened in highly recommending Bateson.
  • Wayfarer
    21k
    Yes I’ve encountered Steps to an Ecology of Mind before but didn’t read much of it.

    Have a look at @Gnomon’s OP, he does a good job of capturing some of the key points. Also that post from @Joshs about Kauffman. (I bought Kauffman’s book At Home in the Universe in the 1990’s but it contained too much heavy-duty biological science for my limited education ;-) )
  • Wayfarer
    21k
    All that said, I looked at the video summary (crappy because of the low-quality computer-generated voice and editing glitches) and I can really see how eyes might roll at this presentation - particularly its bankrolling by the Templeton Foundation. They’ve got wads of cash to give away to those who produce material favourable to their syncretist worldview. (Not that they’re all wrong on that account, past winners include Paul Davies, who’s a favourite of mine, and they do produce some interesting ideas.)

    As anyone will know, I’m fiercely opposed to physicalist reductionism, but for some reason this paper doesn’t ring true for me. I suppose, from a philosophical perspective, a critique of reductionism needs to be much simpler than trying to prove a kind of ‘law of increasing complexity’ operating throughout the Universe. Can’t quite put my finger on why yet, but that’s my feeling at this point.