• fdrake
    6.5k


    Entropy is absolutely well defined. It's just defined in different ways. There are multiple entropies. They mean different things. They have different formulas. They can relate. The way you use entropy probably isn't well defined yet, it has shades of all of the ones I detailed in both posts, and to speak univocally about entropy as you do is to blur out the specificities required in each application. The same goes for degrees of freedom.
  • apokrisis
    7.3k
    Entropy is absolutely well defined.fdrake

    What's your single sentence definition then? I mean, just for fun.
  • fdrake
    6.5k


    The point of that post was to highlight that there isn't a univocal sense of entropy, yet.
  • apokrisis
    7.3k
    Yeah. But just have a go. Let's see what you could come up with. It truly might help to make sense of your attacks on mine.

    If instead you really want to say that entropy is simply whatever act of measurement we care to construct as its instrumental definition - that there is no common thread of thought which justifies the construct - then how could you even begin to have an intelligent discussion with me here?
  • AngleWyrm
    65
    Entropy
    What's your single sentence definition then? I mean, just for fun.apokrisis

    Entropy is the complementary antithesis of order, a synonym to disorder and disarray. A snowflake melting, exothermic release of energy, a battery resolving to no charge, water settling at sea level, a magnet losing its cohesion.
  • fdrake
    6.5k


    It's actually a lot of work just to research the background to what you're saying. So I think I have to break up my responses into a series.

    A univocal sense of entropy would require the instrumental validity of its applications. This is similar to saying that a measurement of how depressed someone is has to take into account the multiple dimensions of depression - the affective, behavioural and psychomotor components. Validity has three aspects: construct validity, content validity and criterion validity. And these have a few sub-aspects.

    Construct validity: the variable varies with what it's supposed to. If you asked a depressed person about how much they liked football, and used that as a measurement of how depressed they are, this measurement would have low construct validity. Construct validity splits into two forms, discriminant validity and convergent validity. Discriminant validity is the property that the variable does not vary with what it's not supposed to - the football depression scale I alluded to above has low discriminant validity since it would be sensitive to the wrong things. It also has low convergent validity, since if its correlation with a real measure of depression was computed, it would be very low. Convergent validity is then the property that a measure varies with what it's supposed to vary. I think that convergent validity of a group of measures (say measures for depression absence, happiness, contentment) consists in the claim that each can be considered as a monotonic (not necessarily linear as in correlation) function of the other.

    Content validity: the variable varies in a way which captures all aspects of a phenomenon. The football scale of depression has essentially no content validity, a scale of depression when the psychomotor effects of depression are not taken into account has more, a scale of depression which attempts to quantify all effects of depression and does it well has high content validity.

    Criterion validity: the variable varies with outcomes that can be predicted with it. Imagine if someone has taken a test of depression on the football scale and a good scale. Now we administer a test of 'life contentment', high scores on the good depression scale would generally occur with low scores on the life contentment scale. Scores on the football scale will have little or no relationship to the life contentment scale measurements.

    So you asked me if I can provide a univocal definition of entropy. I can't, nevertheless I insist that specific measures of entropy are well defined. Why?

    Measures of entropy are likely to have high construct validity; they measure what they're supposed to. Let's take two examples - ascendency and Shannon biodiversity:

    The ascendency is a property of a weighted directional graph. The nodes on such a graph are relevant ecological units - such as species groups in a community. The weights in the graph are measurements of the transfer between two nodes. Let's take an example of wolves, rabbits and grass and construct a food web; assuming a single species for each, no bacteria etc...

    Wolves: have an input from rabbits.
    Rabbits: have an input from grass and an output to wolves.
    Grass: has an input from the sun and an output to rabbits.

    Assume for a moment that the energy transfer is proportional to the biomass transfer. Also assume this ecosystem is evaluated over a single day. Also assume that the wolves extract half as much biomass from the rabbits as the rabbits do from the grass, and the rabbits extract half the energy from the grass that the grass does from the sun; and that grass extracts '1' unit from the sun (normalising the chain).

    Then:

    Transfer(Sun,Grass)=1
    Transfer(Grass,Rabbits)=0.5
    Transfer(Rabbits,Wolves)=0.25

    Denote transfer as T. The ascendency requires the computation of the total throughput - the sum of all weights, here 1.75. We then need the average mutual information. This is defined as:

    AMI = \sum_{i,j} \frac{T_{ij}}{T_{..}} \log\left( \frac{T_{ij}\, T_{..}}{T_{i.}\, T_{.j}} \right)

    Where T_{..} is the total throughput, T_{i.} is the total of the flows going from i to others, and the reversed index T_{.j} is the flows going from others to j. Which I'm not going to compute since the actual value won't provide anything enlightening here - since it won't help elucidate the meaning of the terms. The average mutual information, roughly, is a measure of the connectivity of the graph but weighted so that 'strong' connections have more influence on MI than 'weak' ones.

    The ascendency is then defined as:

    A = T_{..} \times AMI = \sum_{i,j} T_{ij} \log\left( \frac{T_{ij}\, T_{..}}{T_{i.}\, T_{.j}} \right)
    What does this measure? The diversity of flows within a network. How? It looks at the proportion of each flow in the total, then computes a quantification of how that particular flow incorporates information from other flows - then scales back to the total flow in the system. It means that the diversity is influenced not just by the number of flows, but their relative strength. For example, a network that consists of 1 huge flow while the rest are negligible would get an ascendency much closer to that of a single-flow network than another measure would suggest - incorporating an idea of functional diversity as well as numerical biodiversity. Having 1 incredibly dominating flow means 0 functional diversity.

    The ascendency can also be exponentiated to produce a measure of the degrees of freedom of the network. Having 1 incredibly dominating flow means 0 MI, so 0 ascendency, so the exponential of the ascendency is:

    e^{0} = 1
    IE one 'effective degree of freedom'. Ulanowicz has related this explicitly to the connectivity of digraphs in 'Quantifying the Complexity of Flow Networks: How many roles are there?'. It's behind a paywall unfortunately. If an ecological network has flows equidistributed on the organisms - each receiving an equal portion of the total flow - then it will have the same effective degrees of freedom as the number of nodes (number of organism types) in the network. When an unequal portion of total flow is allocated to each, it will diverge from the number of nodes - decreasing, since there's more functional concentration in terms of flow in the system.
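
    To make the toy numbers concrete, here is a minimal sketch (Python, natural logarithms) of the chain of definitions above - TST, average mutual information, ascendency, and the exponentiated 'effective roles' reading of degrees of freedom - using the hypothetical wolves/rabbits/grass flows assumed earlier. It is only an illustration of the formulas, not Ulanowicz's own procedure:

        import math

        # Hypothetical flows from the toy chain above: (source, sink) -> transfer
        flows = {("Sun", "Grass"): 1.0,
                 ("Grass", "Rabbits"): 0.5,
                 ("Rabbits", "Wolves"): 0.25}

        tst = sum(flows.values())  # total system throughput, 1.75 here

        def out_total(node):       # T_i. : total flow leaving node i
            return sum(t for (i, j), t in flows.items() if i == node)

        def in_total(node):        # T_.j : total flow entering node j
            return sum(t for (i, j), t in flows.items() if j == node)

        # Average mutual information: connectivity weighted by flow strength
        ami = sum((t / tst) * math.log((t * tst) / (out_total(i) * in_total(j)))
                  for (i, j), t in flows.items())

        ascendency = tst * ami     # scale the AMI back up by the total flow

        # One reading of the exponentiated degrees-of-freedom measure: a single
        # dominating flow gives AMI = 0, hence exp(0) = 1 'effective role'
        effective_roles = math.exp(ami)

        print(tst, ami, ascendency, effective_roles)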

    Numerically, this would be equal to the exponentiated Shannon Biodiversity index in an ecosystem when the species are present in equal numbers. To see this, the Shannon Biodiversity is defined as:

    H = -\sum_{i} p_i \log(p_i)
    Where every p_i is the proportion of the i-th species of the total. This is a numerical comparison of the relative abundance of each species present in the ecosystem. This obtains a maximum value when each species has equal relative abundance, and its exponential is then equal to the number of species in the ecosystem. Look at the case with 2 species each having 2 animals. p_i is constant along i, being 0.5, then the Shannon Biodiversity is -2*0.5*log(0.5) = log2, so its exponential is 2.
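
    As a quick check of the arithmetic, a minimal sketch of that same two-species calculation (Python, natural logarithms, toy abundances as assumed above):

        import math

        abundances = [2, 2]                            # two species, two animals each
        total = sum(abundances)
        p = [n / total for n in abundances]            # relative abundances, 0.5 each

        shannon = -sum(pi * math.log(pi) for pi in p)  # -2*0.5*log(0.5) = log 2
        effective_species = math.exp(shannon)          # exponential: 2 'effective species'

        print(shannon, effective_species)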

    Critically this 2 means something completely different from the effective degrees of freedom derived from the flow entropy. Specifically, this is because there are equal relative abundances of each species rather than equal distribution of flow around the network. The math makes them both produce this value since they are both configurational (Shannon) entropies - and that's literally how they were designed to work.

    If we were to take both of these measures individually and assess them for content validity - they'd probably be pretty damn good. This is because they are derived in different constrained situations to be sensitive to different concepts. They adequately measure flow diversity and relative abundance biodiversity. If you take them together - you can see they will only necessarily agree when both the flows and the numbers are distributed equally among all species in the network. This means low construct validity on a sense of entropy attempting to subsume both. It just won't capture the variability in both of them. I'm being reasonably generous here: when the degrees-of-freedom notion from ascendency theory was applied across a eutrophication gradient (which I assume you will allow as an entropic gradient), the ascendency degrees of freedom varied in an upside-down U shape from low eutrophication to high eutrophication - so it doesn't have to agree with other (more empirically derived) concepts of 'flow concentration' (more nutrients to go to plants, less water oxygen, a possible drop in diversification). IE, the middle ground between low and high eutrophication had the highest ascendency degrees of freedom, not either extreme.

    I think this is actually fine, as we already know that 'intermediates' are likely to be closer to equidistribution of flows than extremes so long as they contain the same species. The paper puts it this way:

    In the light of these results, the network definition of eutrophication (Ulanowicz, 1986) does not appear to accord with the gradient in eutrophication in the Mondego estuarine ecosystem. Rather, it would seem more accurate to describe the effects of eutrophication process in this ecosystem in terms of a disturbance to system ascendency caused by an intermittent supply of excess nutrients that, when coupled with a combination of physical factors (e.g. salinity, precipitation, etc), causes both a decrease in system activity and a drop in the mutual information of the flow structure. Even though a significant rise in the total system throughput does occur during the period of the algal bloom and does at that time give rise to a strong increase of the system ascendency, the longer-term, annual picture suggests instead that the non-bloom components of the intermediate and strongly eutrophic communities were unable to accommodate the pulse in production. The overall result was a decrease in the annual value of the system TST and, as a consequence, of the annual ascendency as well.


    Of course, if you've read this far, you will say 'the middle state is the one furthest from order so of course it has the highest degrees of freedom', which suggests the opposite intuition from removal of dominant energy flows 'raining degrees of freedom' down onto the system. This just supports the idea that your notion of entropy has poor construct validity.

    Your notion of entropy has very good content validity: since you will take any manifestation of entropy as data for your theory of entropy, it of necessity involves all of them. However, since we've seen that the construct validity when comparing two different but related entropic measures of ecosystem properties is pretty low, your definition of entropy has to be able to be devolved to capture each of them. And since they measure different things, this would have to be a very deep generalisation.

    The criterion validity of your notion of entropy is probably quite low, since your analogies disagree with the numerical quantity you were inspired by.

    This is just two notions of entropy which have a theoretical link and guaranteed numerical equality on some values, and you expect me to believe that it's fruitful to think of entropy univocally when two similar measures of it disagree conceptually and numerically so much? No, Apo. There are lots of different entropies, each of them is well defined, and it isn't so useful to analogise all of them without looking at the specifics.

    Edit: If you want me to define entropy univocally, it's not a game I want to play. I hope the post made it clear that I don't think it's useful to have a general theory of entropy which provides no clarity upon instantiation into a context.

    So about the only thing I can say is that:

    Entropy = something that looks like Shannon Diversity.
  • fdrake
    6.5k


    If instead you really want to say that entropy is simply whatever act of measurement we care to construct as its instrumental definition - that there is no common thread of thought which justifies the construct - then how could you even begin to have an intelligent discussion with me here?

    Funnily enough, it's precisely the common thread between different notions of entropy that makes me resist trying to come up with a catch-all definition of it. This is that entropy is a parametrised concept when it has any meaning at all. What does a parametrisation mean?

    Will begin with a series of examples, then an empirical challenge based on literature review. Shannon entropy, Boltzmann entropy, ascendency, mutual information - these are all functions from some subset of n-dimensional real space to real-space. What does this mean? Whenever you find an entropic concept, it requires a measure. There's a need to be able to speak about low and high entropy arrangements for whatever phenomenon is being studied.

    So - find me an example of an entropy in science that isn't parametrised. I don't think there are any.

    Examples - ascendency as an entropy is a mapping from n-dimensional real space to the real line where n is the number of nodes in the ecosystem network. Shannon Diversity is a mapping from n-length sequences of natural numbers to the real line, where n is the number of species in an ecosystem. Gibbs entropy is the same in this sense as Shannon Diversity. From thermodynamics to ecological infodynamics, entropy is always something which is spoken about in degrees, and when qualitative distinctions arise from it - they are a matter of being emergent from differences in degree. Differences in values of the entropy.
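
    Written out in the notation already introduced, the difference in the 'what' shows up at the level of the domains themselves (a rough sketch, with N_i the abundance counts and T_ij the flow weights):

    \text{Shannon Diversity}:\;(N_1,\dots,N_n)\ \mapsto\ -\sum_i p_i \log p_i,\quad p_i = N_i \Big/ \sum_k N_k

    \text{Ascendency}:\;(T_{ij})\ \mapsto\ \sum_{i,j} T_{ij}\,\log\frac{T_{ij}\,T_{..}}{T_{i.}\,T_{.j}}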

    You said you didn't mind if I believed your descriptions of entropy are purely qualitative - the problem is that they are not purely qualitative. You speak about entropic gradients, negentropy, entropy maximisation without ever specifying the entropy of what and how the entropy is quantified - or even what an entropy gradient is. Nevermind what 'entropification' is, but more on that later... Anyway, back to the commonalities in entropy definitions.

    So a commonality is that they are mappings from some space to the real line. But what matters - what determines the meaning of the entropy is both what the inputs to the entropy function are and how they are combined to produce a number. To speak of entropy in general is to let the what and the how vary with the implicit context of the conversation; it destroys the meaning of individual entropies by attempting to unify them, the unification has poor construct validity precisely because it doesn't allow the what and the how of the mapping to influence the meaning.

    So when you say things like:

    So the normal reductionist metaphysical position is that degrees of freedom are just brute atomistic facts of some kind. But I seek to explain their existence. They are the definite possibilities for "actions in directions" that are left after constraints have had their effect. So degrees of freedom are local elements shaped by some global context, some backdrop history of a system's development.

    In a trivial sense, degrees of freedom are local elements shaped by some global context. You index to the history of a system as if that gives 'its' entropy a unique expression. You can see that this just isn't the case by comparing the behaviour of Shannon Entropy and ascendency - they have different behaviours, they mean different things, they quantify different types of disorder of an ecosystem. And after this empty unification of the concept of entropy, you give yourself license to say things like this:

    So evolution drives an ecology to produce the most entropy possible. A senescent ecology is the fittest as it has built up so much internal complexity. It is a story of fleas, upon fleas, upon fleas. There are hosts of specialists so that entropification is complete. Every crumb falling off the table is feeding someone. As an ecology, it is an intricate hierarchy of accumulated habit, interlocking negentropic structure. And then in being so wedded to its life, it becomes brittle. It loses the capacity to respond to the unpredictable - like those either very fine-grain progressive parameter changes or the out of the blue epic events

    'Evolution drives an ecology to produce the most entropy possible' - could be viewed in terms of Shannon Entropy, Exergy, Functional Biodiversity.

    'A senescent ecology is the fittest as it has built up so much internal complexity' - could be viewed in terms of Shannon Entropy, Exergy, Functional biodiversity.

    'It is a story of fleas, upon fleas, upon fleas' - is now apparently solely a network based concept, so it's a functional biodiversity.

    'There are hosts of specialists so that entropification is complete' - this makes sense in terms of numerical biodiversity - relative abundances.

    'Every crumb falling off the table is feeding someone.' - this makes sense in terms of functional diversity, like ascendency.

    'As an ecology, it is an intricate hierarchy of accumulated habit, interlocking negentropic structure'

    And when you say negentropy, you mean configurational entropy, except that means it's nothing about ascendency any more.

    'And then in being so wedded to its life, it becomes brittle. It loses the capacity to respond to the unpredictable - like those either very fine-grain progressive parameter changes or the out of the blue epic events'

    I mean, your 'it' and 'unpredictable' are ranging over all available entropy concepts and all possible perturbations to them. You can look at the example of applying ascendency and exergy along a eutrophication gradient to see that such breadth generates inconsistencies.

    Then switching to that informational or negentropic side of the deal - the tale of life's dissipative structure - the degrees of freedom become the energy available to divert into orderly growth. It is the work that can be done to make adaptive changes if circumstances change.

    Now the degrees of freedom are solely a concept of exergy and available energy? Jesus man. The problem here isn't just that you're equivocating on a massive scale, it's that changes in different entropy measures mean different things for the dynamics of a system.

    Life is spending nature's degrees of freedom in entropifying ambient energy gradients. And it spends its own degrees of freedom in terms of the work it can extract from that entropification - the growth choices that it can make in its ongoing efforts to optimise this entropy flow.

    I could find two references for 'entropification' - and neither of them is in an ecological context; both describe a process for estimating the orderliness of errors in statistical models. One of them is an applied example, one of them is looking at it in terms of stochastic geometry. I mean, there's no clear sense of entropification to have. It could refer to any of them, but you probably want it to resemble exergy the most here. And through some analogy to thermodynamics, you'll think this has an accessible meaning. How does entropification work?

    You earlier say:

    So the degrees of freedom are the system's entropy. It is the through-put spinning the wheels.

    This relies upon the exponentiation of a particular entropy measure. As you saw, this idea isn't a unified one - and unification produces terrible construct validity. The degrees of freedom are something of the what of entropy, not the how. You can use the how to look back at the what, but not without context.

    Every process described in your post is a placeholder. It reminds me of my favourite sentence I've ever read in academic literature. It is a howler:

    During the search phase, subtask relevant teabag features become attentionally prioritised within the attentional template during a fixation.

    This is supposed to serve as an example of how different features of an object become relevant and become looked at for a while through the progression of a task. What they actually did was take a description of the process in general:

    During the search phase, subtask relevant features become attentionally prioritised within the attentional template during a fixation.

    And then thought 'this would be much clearer if we substituted in teabag':

    During the search phase, subtask relevant teabag features become attentionally prioritised within the attentional template during a fixation.

    How loosely you treat entropy makes almost everything you say a subtask relevant teabag feature. It is an example from a promised theory which has not been developed.

    Edit: the authors of subtask relevant teabag features actually did develop a theory, though.
  • fdrake
    6.5k


    If my responses meet your standard of 'intelligent discussion', feel free to respond at this point.
  • Deleteduserrc
    2.8k
    That blog post was fascinating! I keep wandering back to the psychoanalytic point where the cheating husband's relationship with his lover only 'works' insofar as he is married: were he to leave his wife for the sake of his lover, the lover would no longer be desirable... Of course the psychoanalytic lesson is that our very 'subjective POV' is itself written into the 'objective structure' of things: it's not just window dressing, and if you attempt to discard it, you change the nature of the thing itself.

    And I think this slipperiness is what makes it so hard to fix the status of a 'parameter': if you want to make a parameter 'work' (i.e. if you intervene in a system on that basis), you will cause changes - but that doesn't mean the system is 'in-itself' sensitive to such parameters: only that, through your intervention you've made it so.
    StreetlightX

    Glad you liked it. It's part of larger 'series' ( called 'uruk machines', organized in the archive section) that tries, ambitiously, to synthesize 4 thinkers in order to create a Big Metanarrative ala 'how the west got where it is now.' It's pretty fascinating, whatever you think of his conclusions. The author admits, in a footnote or comment somewhere, that he's trying to trojan-horse continental insights using a rational idiom - and I think he largely succeeds. Definitely worth a read.

    This thread has long since reached the escape velocity necessary to go irretrievably over my head, but that's ok. Even if I'm left dick-in-my-hands fumbling with basic Hegelian concepts, it's comforting to know that fdrake is still killing it with the hard applied mathematics and that apokrisis is still 1/x crisping every villain who crosses his path like someone who tapes down the 'x' button to grind out xp while he sleeps. It means everything is still progressing according to some familiar order.

    "Perhaps there remains/
    some tree on a slope, that we can see/
    again each day: there remains to us yesterday’s street/
    and the thinned-out loyalty of a habit/
    that liked us, and so stayed, and never departed."


    So all that being said, acknowledging I can't keep up with the math, I'm still confident enough to engage the OP on its own terms which are, I believe, metaphorical. Which isn't to say I think you think that the self isn't literally an ecosystem - I believe you do, and I probably agree - but that I think the significance of this way of looking at the self ultimately relies on - and is motivated by - what can be drawn from it conceptually. It's about drawing on empirically-sourced models to the extent that they facilitate conceptual considerations. It's metaphorical in the literal sense that we're transporting some way of thinking from one level to another.

    And what we have conceptually is something like: the self is a hierarchically organized collection of processes that can either be too open to the outside at the risk of being overwhelmed or too entrenched against the outside at the risk of brittle collapse. Basically chaos vs order.

    As apo said, this essentially cashes out in goldilocks terms. If this isn't about the nuts and bolts of any actual ecosystem, this is really just a metaphor for: not too open, not too closed cf ecosystems.

    So why now? why here? What's being said, really?

    To get political: isn't not too closed, not too open, self-regulating while allowing lines of flight - i mean isn't that, in a perfect nutshell, neoliberalism (multiculturalism, whatever)?

    I want you to be a bloodless academic punching bag, conceptually defending the current order by means of weak intellectual metaphors that conceal your own place in the system. That would satisfy me to no end. It would mean it's ok I dropped out.

    You're not doing the real-ecosystem math thing fdrake is doing, even if you're drawing from his insights when it helps your case. So what are you doing? Prove me wrong! Is there any sense in which your metaphors don't serve the default academic order? Zizek and Deleuze and whoever else reduced to serving a niche in the web of citation bread-crumbs etc etc. (get attention by drawing on an unexpected source, make your mark by bringing him ultimately back into the fold.)
  • TimeLine
    2.7k
    We're basically a series of loops, some only residing 'inside' us, some extending far beyond our skin.StreetlightX

    I didn't get a chance to read everything, but in the case of thermodynamic systems, the evolution of any given system is determined toward a state of equilibrium, and ergodicity attempts to ascertain the averages of behaviour within a system (transformations, arbitrary convergence, irreducibility etc) and political systems are an attempt to order the nature of Hobbesian chaos. I really like this:

    A baby girl is mysteriously dropped off at an orphanage in Cleveland in 1945. “Jane” grows up lonely and dejected, not knowing who her parents are, until one day in 1963 she is strangely attracted to a drifter. She falls in love with him. But just when things are finally looking up for Jane, a series of disasters strike. First, she becomes pregnant by the drifter, who then disappears. Second, during the complicated delivery, doctors find that Jane has both sets of sex organs, and to save her life, they are forced to surgically convert “her” to a “him.” Finally, a mysterious stranger kidnaps her baby from the delivery room.

    Reeling from these disasters, rejected by society, scorned by fate, “he” becomes a drunkard and drifter. Not only has Jane lost her parents and her lover, but he has lost his only child as well. Years later, in 1970, he stumbles into a lonely bar, called Pop’s Place, and spills out his pathetic story to an elderly bartender. The sympathetic bartender offers the drifter the chance to avenge the stranger who left her pregnant and abandoned, on the condition that he join the “time travelers corps.” Both of them enter a time machine, and the bartender drops off the drifter in 1963. The drifter is strangely attracted to a young orphan woman, who subsequently becomes pregnant.

    The bartender then goes forward 9 months, kidnaps the baby girl from the hospital, and drops off the baby in an orphanage back in 1945. Then the bartender drops off the thoroughly confused drifter in 1985, to enlist in the time travelers corps. The drifter eventually gets his life together, becomes a respected and elderly member of the time travelers corps, and then disguises himself as a bartender and has his most difficult mission: a date with destiny, meeting a certain drifter at Pop’s Place in 1970.

    The question is: Who is Jane’s mother, father, grandfather, grandmother, son, daughter, granddaughter, and grandson? The girl, the drifter, and the bartender, of course, are all the same person. These paradoxes can make your head spin, especially if you try to untangle Jane’s twisted parentage. If we draw Jane’s family tree, we find that all the branches are curled inward back on themselves, as in a circle. We come to the astonishing conclusion that she is her own mother and father! She is an entire family tree unto herself.

    If the universe is infinite, so are the possibilities, and thus if we were to arrange - again in a statistically thermodynamic manner - the atoms and neurons that make you (your brain), we could easily replicate 'you', as in the very you and not just merely the body (memories, feelings), which is why open systems are intriguing to me. I guess we need to draw the line somewhere, as is the case with the Boltzmann Brain.
  • apokrisis
    7.3k
    What does 'dichotomous to constraints' mean?

    There are lots of different manifestations of the degrees of freedom concept. I generally think of it as the dimension of a vector space - maybe calling a vector space an 'array of states' is enough to suggest the right meaning. If you take all the vectors in the plane, you have a 2 dimensional vector space. If you constrain the vectors to be such that their sum is specified, you lose a degree of freedom, and you have a 1 dimensional vector space. This also applies without much modification to random variables and random vectors, only the vector spaces are defined in terms of random variables instead of numbers.
    fdrake

    In mechanics, degrees of freedom are a count of the number of independent parameters needed to define the configuration of a system. So your understanding is correct.

    And they are dichotomous to constraints as they are what are left over as a result of a configuration being thus limited. Constraint suppresses freedoms. What constraint doesn't suppress then remains to become some countable degree of freedom for that system.

    Then from an infodynamic or pansemiotic point of view, constraints become the informational part of the equation, degrees of freedom are the dynamics. In the real material world, the configuration can be treated as the knowledge, the structure, that the organismic system seeks to impose on its world. The constraints are imposed by a mind with a purpose and a design. The degrees of freedom are then the entropy, the dynamics, that flow through the organism.

    So a structure has to be imposed on the flow to in fact create a flow composed of some set of degrees of freedom. A bath of hot water will simply cool by using its surrounds as a sink. An organism wants to build a machinery that stands in between such a source and sink so as to extract work along the way.

    That is why I suggest ATP as a good way to count degrees of freedom in biology. It is the cell's meaningful unit of currency. It places a standard cost on every kind of work. It defines the dynamical actions of which a cell is composed in a way that connects the informational to the entropic aspects of life. An ATP molecule could be spent for any purpose. So that is a real non-physical freedom the cell has built for itself.

    An ATP molecule can be used to make a kinesin "walker" transport molecule take another step, or spin the spindle on ATP-ase. But then the spending of that ATP has an actual entropic cost as well. It does get used up and turned into waste heat (after the work is done).

    So degrees of freedom are what constraints produce. And in living organisms, they are about the actions that produce units of work. The constraints are then the informational structure that regulates the flow of material entropy, channelling some source to a sink in a way that it spins the wheels of a cellular economy along the way.

    A cooling bath of hot water lacks any interesting informational structure apart from perhaps some self-organised convection currents. Like a Benard cell, it might have its thermodynamic flow improved by emergent constraints producing the organised currents that are now some countable set of degrees of freedom. A more chaotic path from hot to cold has had its own vaguer collection of degrees of freedom suppressed so the flow is optimised by a global structure.

    But life has genes, membranes, pores, switches, and a host of molecular machinery that can represent the remembered habits of life - some negentropic or informational content - that produces a quite intentional structure of constraints, a deliberately organised set of degrees of freedom, designed to extract self-sustaining work from any available entropy gradient.

    So I suppose I should talk about configurational entropy.fdrake

    Yep. But note that biosemiosis is about how life has the memory to be in control of its physical configuration. It uses a potential gradient to do the work of constructing itself.

    So that brings in the informational aspect of the deal - Pattee's epistemic cut. The organism first insulates itself from dynamics/entropy by creating its own informational degrees of freedom. It does this by using a genetic code. But also, it does it foundationally down at the level of the dynamics itself in having "a unit of work" in an ATP molecule that can be used "to do anything a cell might want".

    What gets configured is not just some spatial geometry or thermal landscape. The material world is actually being inscribed by an organism's desires. The dynamics is not merely self-organising. It is being organised by a proper self.

    It is this situation which your notions of degrees of freedom don't really cover. You are not accounting for the epistemic cut which is the added semiotic feature of this material world now. If you are going to demand quantitative measures, the measures have to span the epistemic cut in some fashion. You are trying to make measurements that only deal with one of the sides of the equation.
  • fdrake
    6.5k


    Will you be giving a series of replies? Should I wait?
  • apokrisis
    7.3k
    Up to you. It's a nice day outside. I doubt I will tick off every point you raised.
  • fdrake
    6.5k


    Will wait a bit to see what you do, and to digest the post.
  • apokrisis
    7.3k
    The Shannon Entropy is related to the Boltzmann entropy in thermodynamics in a few ways I don't understand very well.fdrake

    What's wrong with a reciprocal relation? If Shannon entropy is the degree of surprise to be found in some system, then the Boltzmann entropy is the degree to which that system is in its least surprising state.

    So if a system is constrained and is thus composed of some set of independent elements, states, or events, its arrangement can be described somewhere on a spectrum between maximally surprising and minimally surprising. An unsurprising arrangement requires the least amount of information to specify it. And it thus represents the most entropic arrangement.
  • fdrake
    6.5k


    Shannon's strictly broader than Boltzmann since it allows for non-equidistribution. Gibbs and Shannon are almost equivalent, or rather it can be said that Gibbs entropy is Shannon entropy applied to the distribution of microstates in a macrostate which do not necessarily have equal probability.
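
    A one-line way to see the relationship: the Gibbs/Shannon form over W microstates with probabilities p_i collapses to the Boltzmann form only when the microstates are equiprobable,

    S = -k_B \sum_{i=1}^{W} p_i \ln p_i \;\;\xrightarrow{\;p_i = 1/W\;}\;\; k_B \ln W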

    I said I didn't understand it very well because I don't know what conclusions people are going to draw by blurring the boundaries between them.
  • apokrisis
    7.3k
    The theoretical links between Shannon's original entropy, thermodynamical entropy, and representational complexity can promote a vast deluge of 'I can see through time'-like moments when you discover or grok things about their relation. BUT, and this is the major point of my post:

    Playing fast and loose with what goes into each of the entropies and their context makes you lose a lot. They only mean the same things when they're indexed to the same context. The same applies for degrees of freedom.

    I think this is why most of the discussions I've read including you as a major contributor are attempts to square things with your metaphysical system, but described in abstract rather than instantiated terms.
    fdrake

    I'm baffled that you say Shannon entropy came before Boltzmann's entropy.

    But anyway, again my interest is to generalise across the different contextual instantiations of the measurement habits which science might employ. I am indeed interested in what they could have in common. So I don't need to defend that as if it were some problem.

    And as I have pointed out, when it comes to semiosis and its application to the world, we can see that there is a whole level of irreducible complexity that the standard reductionist approach to constructing indices of information/entropy/degrees of freedom just misses out.

    It is fine that science does create simpler indexes. I've no problem with that as a natural pragmatic strategy. But also, with Shannon and Boltzmann, it became clear that informational uncertainty (or configurational degrees of freedom) and entropic material degrees of freedom (or countable microstates) are two sides of the same coin. The mathematics does unite them in a general way at a more abstract level.

    And then when it comes to biosemiosis, information and entropy become two sides of a mechanically engineered epistemic cut. We are talking about something at a level above the brute physical realm imagined by the physical discourse that gives us Shannon uncertainty and Boltzmann entropy. It thus needs its own suitable system of measurement.

    That is the work in progress I see in literature. That is the particular story I am tracking here.

    You can keep re-stating that a proper scientist would use the proper tools. You can reel off the many kinds of metrics that reflect the simpler ontology of the reductionist. You can continue to imply that I am somehow being unscholarly in seeking to consider the whole issue at a more holistic level - one that can encompass physicalist phenomena like life and mind. And indeed, even culture, politics, economics, morality and aesthetics.

    But I know what I'm about so I'm only going to respond to your critique to the degree it throws light on the connecting commonality, the linkages to that more holistic worldview.
  • apokrisis
    7.3k
    Shannon's strictly broader than Boltzmann since it allows for non-equidistribution.fdrake

    Does that remain the case now that information theory has been tied to the actual world via holographic theory?

    Boltzmann's k turned out to be physically derived from the dimensionless constants of the Planck scale. And Shannon likewise now represents a fundamental Planckian limit. The two are united via the basic physical limits that encode the Cosmos.

    The volume of a spacetime defines some entropic content. The surface area of that volume represents that content as information. And there is a duality or reciprocality in the relation. There can't be more entropy inside than there are questions or uncertainties that can be defined on a 4 to 1 surface area measure.

    It is about the biggest result of the last 30 years in fundamental physics.
  • fdrake
    6.5k


    I'm baffled that you say Shannon entropy came before Boltzmann's entropy.

    It didn't. Shannon's entropy came after. By throwing in 'original' there I meant Shannon's particular application of entropy to signals and strings.

    It is fine that science does create simpler indexes. I've no problem with that as a natural pragmatic strategy. But also, with Shannon and Boltzmann, it became clear that informational uncertainty (or configurational degrees of freedom) and entropic material degrees of freedom (or countable microstates) are two sides of the same coin. The mathematics does unite them in a general way at a more abstract level.

    It's a stretch between Shannon Biodiversity and Gibbs entropy - there's no equivalent notion of macrostate other than the vector of relative abundances within an ecosystem.

    You can keep re-stating that a proper scientist would use the proper tools. You can reel off the many kinds of metrics that reflect the simpler ontology of the reductionist. You can continue to imply that I am somehow being unscholarly in seeking to consider the whole issue at a more holistic level - one that can encompass physicalist phenomena like life and mind. And indeed, even culture, politics, economics, morality and aesthetics.

    I'm not saying you're being unscholarly because you're looking for commonalities in entropy. I'm saying that your application of entropic concepts has a deaf ear for context. Just like sliding straight from Shannon Biodiversity to Boltzmann and back, the 'macrostate' of an ecosystem parametrised solely in terms of its relative abundances isn't anything like a macrostate in a thermodynamic system.

    You've shown that you can keep it quite well contextualised. Your post on ATP and work extraction to my reckoning has a single working concept of entropy in it - that of its duality to exergy, work extraction. Then you slide into a completely different operationalisation of entropy:

    Then from an infodynamic or pansemiotic point of view, constraints become the informational part of the equation, degrees of freedom are the dynamics. In the real material world, the configuration can be treated as the knowledge, the structure, that the organismic system seeks to impose on its world. The constraints are imposed by a mind with a purpose and a design. The degrees of freedom are then the entropy, the dynamics, that flow through the organism.

    Going from ATP being used to fuel an organism straight to a 'global' sense of infodynamics and signals/signs in pansemiosis. It works only when you wave your hands and don't focus on the specifics. When what before was concrete becomes metaphorical, then what was metaphorical becomes concrete. Just like how you slingshot about with what the degrees of freedom are.

    I want you to make substantive posts in terms of a unified sense of entropy. I don't want you to achieve that through handwaving and equivocation.

    But I know what I'm about so I'm only going to respond to your critique to the degree it throws light on the connecting commonality, the linkages to that more holistic worldview.

    This is why every time you talk about anything tangentially related to entropy, systems or complexity you make all issues an addendum to your worldview. This is why what we're talking about has almost no relation to the OP.

    Does that remain the case now that information theory has been tied to the actual world via holographic theory?

    I'm not going to pretend to know enough about cosmology to comment.
  • apokrisis
    7.3k
    Going from ATP being used to fuel an organism straight to a 'global' sense of infodynamics and signals/signs in pansemiosis. It works only when you wave your hands and don't focus on the specifics. When what before was concrete becomes metaphorical, then what was metaphorical becomes concrete.fdrake

    It's not metaphorical if infodynamics/semiosis is generalisable to the material dissipative structures in general.

    Again, you might have to actually read the literature - Salthe for instance. But the metaphysical ambition is clear enough.

    The information is separate from the dynamics via the epistemic cut in biosemiotic systems - organisms that are living and mindful. An organisation mediated by signs is perfectly concrete.

    What still counts as speculative is then generalising that concrete description of life/mind so that it is seen to be a concrete theory of the Cosmos. The Universe would be understood as a dissipative system organised by a sign relation. The biological understanding of the duality of information and entropy would prove to apply to the whole of existence as its scientific theory.

    So it is not me waving my hands. It is you demonstrating a deaf ear to context. I am careful to distinguish between the part of what I say which is "normal science" and the part that is "speculative metaphysics". And the speculative part is not merely metaphor because the project would be to cash it out as concrete theory, capable of prediction and measurement.

    I agree that may also be a tall order. But still, it is the metaphysical project that interests me. The fact that you repeatedly make these ad hom criticisms shows that you simply wish not to be moved out of your own particular comfort zone. You don't want to be forced to actually have to think.
  • apokrisis
    7.3k
    This is why what we're talking about has almost no relation to the OP.fdrake

    But the OP was about extracting a political analogy from a lifecycle understanding of ecosystems. I simply responded by saying Salthe's infodynamic perspective gives you a self-explanatory three stage take on that.

    You then got shirty about my use of infodynamic or dissipative theory jargon. I'm happy to explain my use of any terminology. And I'm happy to defend the fact that I do indeed have an over-arching worldview. That is way more than nearly anyone else does around these here parts.
  • apokrisis
    7.3k
    What does this measure? The diversity of flows within a network. How? It looks at the proportion of each flow in the total, then computes a quantification of how that particular flow incorporates information from other flows - then scales back to the total flow in the system. It means that the diversity is influenced not just by the number of flows, but their relative strength. For example, having a network that consisted of 1 huge flow and the rest are negligible would give an ascendency much closer to a single flow network than another measure - incorporating an idea of functional diversity as well as numerical biodiversity. Having 1 incredibly dominating flow means 0 functional diversity.fdrake

    You seem terribly concerned by things that don't seem a big issue from the thermodynamic point of view.

    A lot of your focus seems to be on how we specify richness or complexity or healthy biodiversity. And you instinctively think in terms of counting niches or something else "structurally concrete".

    But the flow view of an ecosystem would see a hierarchy of flow just happening naturally. You would get complexity arising in a way that is essentially "meaningless".

    So think of a scalefree network or other fractal growth model. A powerlaw hierarchy of connectivity will just arise "randomly". It doesn't need evolutionary semiosis or natural selection to create it. The complexity of the flow is not something that needs a designing hand to happen. It is the natural structure of the flow.

    Check out Adrian Bejan's constructal law. He is pretty strong on this issue.

    So a reductionist would think of a richly organised hierarchy of relations as a surprising and delicate state of affairs. But the switch is now to see this as the inevitable and robust equilibrium state of a freely growing dissipative structure, like an ecosystem. Scalefree order comes for free.

    So something like the reason for the occurrence of niches over all scales is not something in need of some "contextualised" metric to index it. We don't have to find external information - some historic accident - which specifies the fact of a hierarchical order. That kind of order is already a natural pattern or attractor. It would take external accidents of history to push it away from this natural balance.

    Where every p_i is the proportion of the i-th species of the total. This is a numerical comparison of the relative abundance of each species present in the ecosystem. This obtains a maximum value when each species has equal relative abundance, and its exponential is then equal to the number of species in the ecosystem. Look at the case with 2 species each having 2 animals. p_i is constant along i, being 0.5, then the Shannon Biodiversity is -2*0.5*log(0.5) = log2, so its exponential is 2.fdrake

    So here, aren't you assuming that we can just count species and have no need to consider the scale of action they might represent? One might be a bacterium, the other an elephant. Both might be matched in overall trophic throughput. We would expect their relative abundance to directly reflect that fact, rather than a species count having much that is useful to say about an ecosystem's healthy biodiversity or state of entropic balance.

    Of course, if you've read this far, you will say 'the middle state is the one furthest from order so of course it has the highest degrees of freedom', which suggests the opposite intuition from removal of dominant energy flows 'raining degrees of freedom' down onto the system. This just supports the idea that your notion of entropy has poor construct validity.fdrake

    Exergy? I mean you seemed to agree that it is about quality of the entropy. So a big tree dropping leaf litter is a rather different story to a forest clearing being blasted by direct sunlight again.
  • fdrake
    6.5k


    It's not metaphorical if infodynamics/semiosis is generalisable to the material dissipative structures in general.

    The bridges between your contextualised uses of entropy are metaphorical. I'll give an apokrisis-like description of a process so you can see what I mean. It will be done through a series of thermodynamically inspired haikus, just because.

    Organisms lay
    Jumbled up yet striving for
    Their own mouths to feed

    Each cell binds a gap
    Consuming a gradient
    Where to find new food?

    Digging in the weeds
    An old domain of feasting
    For new specialists

    Symmetry broken
    Entropy from exergy
    Degrees of freedom

    So it is not me waving my hands. It is you demonstrating a deaf ear to context. I am careful to distinguish between the part of what I say which is "normal science" and the part that is "speculative metaphysics". And the speculative part is not merely metaphor because the project would be to cash it out as concrete theory, capable of prediction and measurement.

    I don't actually care about this part. Whether what you're doing is science or not doesn't interest me. Nor do I think it's relevant to our disagreement - from my perspective anyway.

    I agree that may also be a tall order. But still, it is the metaphysical project that interests me. The fact that you repeatedly make these ad hom criticisms shows that you simply wish not to be moved out of your own particular comfort zone. You don't want to be forced to actually have to think.

    I thought you'd give me more charity than this. I read quite a few papers on ascendency so I could understand it. If researching something you've referenced to assess its merits is being afraid of thinking, I'll remain a coward. Ascendency is a cool concept, btw.

    A lot of your focus seems to be on how we specify richness or complexity or healthy biodiversity. And you instinctively think in terms of counting niches or something else "structurally concrete".

    I don't. I think that there's no current comprehensive measure of biodiversity or ecosystem complexity that I'm aware of. Quantifying biodiversity and ecosystem complexity involves so many things which can be relevant.

    A lot of your focus seems to be on how we specify richness or complexity or healthy biodiversity. And you instinctively think in terms of counting niches or something else "structurally concrete".

    I took two examples to show what happens when you let one slide into another. They were both measures of an aspect of ecosystem complexity. They were designed to measure different things. A unified sense of entropy must contain both, and sliding over their differences isn't sewing them together.

    I'm not suggesting you 'look at the formulas and find a master one', the thing I cared about was that the measures of entropy in terms of ascendency and relative abundance meant different things - they summarise different aspects of the behaviour of the ecosystem.

    So here, aren't you assuming that we can just count species and have no need to consider the scale of action they might represent? One might be a bacterium, the other an elephant. Both might be matched in overall trophic throughput. We would expect their relative abundance to directly reflect that fact rather than a species count having much useful to say about an ecosystem's healthy biodiversity or state of entropic balance.

    It would be interesting if Shannon biodiversity was related to ascendency. There's a trivial link in terms of node counting - and the number of nodes (species) has an influence on ascendency (though not necessarily on its numerical value). Regardless, high entropy states of abundance don't have to reflect high entropy states of flows. I imagine flow disorder is likely to be less than relative abundance disorder if they were scaled to the same unit (not sure how to translate the units though), since there's probably going to be a primary producer om-nom-noming the sun.
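
    A minimal sketch of that point with toy numbers (Python, natural logarithms; the two-node flow network and its values are assumed purely for illustration): two species with equal abundance maximise the exponentiated Shannon Biodiversity at 2, while a flow network in which one link dominates keeps the AMI near 0, so its exponential stays near 1.

        import math

        # Equal relative abundances: exponentiated Shannon Biodiversity gives 2 effective species
        p = [0.5, 0.5]
        h_abundance = -sum(pi * math.log(pi) for pi in p)

        # Hypothetical flows where one link dominates: AMI stays near zero
        flows = {("Sun", "A"): 1.0, ("A", "B"): 0.01}
        tst = sum(flows.values())

        def out_total(node):
            return sum(t for (i, j), t in flows.items() if i == node)

        def in_total(node):
            return sum(t for (i, j), t in flows.items() if j == node)

        ami = sum((t / tst) * math.log((t * tst) / (out_total(i) * in_total(j)))
                  for (i, j), t in flows.items())

        print(math.exp(h_abundance), math.exp(ami))  # roughly 2.0 versus roughly 1.06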

    I read a bit of Salthe today. As I've said previously, I don't mind the project of trying to find a unification of entropy. Salthe has a nice article talking about four different notions of entropy; so far as I can tell, he contributes a way of speaking about entropy using similar words in all the contexts. He also doesn't seem to like 'pragmatism', or what I think's better called 'instrumentalism', in the sciences. A focus on measurement and mathematisation. What I've got from reading his writing was the occasional 'I can see through time' moment regarding entropy.

    I understand that you need to be able to use the following terms to talk about anything in this vista of generalised entropy: 'degrees of freedom', 'energy', 'entropy', 'information', 'order', 'disorder', 'dynamics', 'entropic gradient', 'exergy', 'boundary condition'. Salthe seems to use the words much as you do.

    The way he uses 'boundary condition' as a metaphor is probably the strongest concept I've seen from him so far, at least insofar as a generalised parametric approach to entropy goes. It gave me a picture like this:

    Dynamical system 1 outputs a set,
    Dynamical system 2 takes that set as literal boundary conditions.

    I never said I couldn't see the meaning behind the handwaving. This is roughly consistent with the poetic notions of degree of freedom you and Salthe espouse - the boundary condition allows for the evaluation of specific trajectories, and so is a 'constraint' and thus a loss of 'freedom'.

    Maybe I'd be more comfortable with what you're saying if you used scarequotes like I do.
  • Streetlight
    9.1k
    Organisms lay
    Jumbled up yet striving for
    Their own mouths to feed

    Each cell binds a gap
    Consuming a gradient
    Where to find new food?

    Digging in the weeds
    An old domain of feasting
    For new specialists

    Symmetry broken
    Entropy from exergy
    Degrees of freedom
    fdrake

    Oh an Apo post!

    I love this so much though.
  • apokrisis
    7.3k
    I'm not suggesting you 'look at the formulas and find a master one', the thing I cared about was that the measures of entropy in terms of ascendency and relative abundance meant different things - they summarise different aspects of the behaviour of the ecosystem.fdrake

    What else do you expect if you take the attitude that we are free to construct metrics which are valid in terms of our own particular interests?

    I agree we can do just that. We can describe the world in terms that pick up on some characteristic of interest. I just say that is not a deep approach. What we really want is to discover the patterns by which nature organises itself. And to do that, we need some notion about what nature actually desires.

    This is where thermodynamics comes in. This is what is unifying science at a foundational level now. Both biology and physics are reflecting that emergent metaphysical project. And thermodynamics itself is becoming semiotic in recognising the duality of entropy and information.

    So you are down among the weeds. I'm talking about the big picture. Again, it is fine if your own interests are narrow. But I've made different choices.
  • apokrisis
    7.3k
    Maybe I'd be more comfortable with what you're saying if you used scarequotes like I do.fdrake

    Your comfort is definitely my number one priority. I mean "number one priority".
  • apokrisis
    7.3k
    So a commonality is that they are mappings from some space to the real line. But what matters - what determines the meaning of the entropy is both what the inputs to the entropy function are and how they are combined to produce a number. To speak of entropy in general is to let the what and the how vary with the implicit context of the conversation; it destroys the meaning of individual entropies by attempting to unify them, the unification has poor construct validity precisely because it doesn't allow the what and the how of the mapping to influence the meaning.fdrake

    And yet something still ties all this variety back to some general intuition. The usual response is "disorder".

    As I have said, at the metaphysical level, we can only approach proper definitions by way of a dialectical or dichotomistic argument. We have to identify the two complementary extremes that are mutually exclusive and jointly exhaustive. So a metaphysical-strength discussion of entropy has to follow that form. It has to be entropy as "opposed to what?". Your concern is that there seem multiple ways to quantify "entropy". My response has been that a metaphysical-strength definition would be qualitative in this precise fashion. It would lead us to a suitable dichotomy.

    Hence why I keep trying to return the conversation to entropy~information as the candidate dichotomy that has emerged in recent times. It seems a stronger statement than entropy~negentropy, or disorder~order, as mere negation is a weak kind of dichotomy, not a strong one.

    Likewise constraints~degrees of freedom slice across the debate from another direction. How the two dichotomies of entropy~information and constraints~degrees of freedom might relate is a further important question.

    So a qualitative approach here isn't just a hand-waving, anything goes, exercise in speculation. Metaphysics does have a method for clarifying its ideas about reality. And that approach involves discovering a reciprocal relation that connects two opposed limits on Being.

    As I said, information and entropy capture a duality in terms of relative surprisingness. An entropy-maximising configuration is reciprocally the least-informational. No need to count microstates even if every one is different. You only need to measure a macrostate to completely characterise the aspect of the system that is of interest.

    By contrast, a surprising state of the world is the most informational. It is high on negentropy. And now just one "microstate" is the feature that appears to characterise the whole situation - to the degree that is the aspect of interest.

    So "disorder" is just an application of the principle of indifference. A messy system is one in which the details don't matter. Gone to equilibrium, the system will be generic or typical.

    However no system can in fact maximise its entropy, reach equilibrium, except that it has stable boundary conditions. So - dichotomously - there is negentropy or order in the fact of boundary stability, in the fact of being closed for causality.

    Thus it is common to sum up entropy as about a statistical propensity to be disordered - to be in a system's most typical state. But that "first law of thermodynamics" state of closure (well, it includes the necessity of the third law to put a lower bound on things to match the first's upper bound) is only half the story. Somewhere along the line, the system had to be globally parameterised. The general holonomic constraints or boundary conditions had to form somehow.

    So here we see a good example of why dichotomies or symmetry-breakings are basic to metaphysical-strength analysis. They alert us to the other half of the holistic story that reductionists are wont to overlook. A full story of entropy can't just presume stable boundary conditions. Those too must be part of what develops (along with the degrees of freedom that global constraints produce, or parameterise).

    To the degree your discussions here are not dialectically explicit, they simply fail the test of being adequately metaphysical. Your obsession about quantitative methods is blinding you to the qualitative discussion where the hunt is on for the right way to frame information~entropy, or degrees of freedom~constraints, as the fundamental dichotomy of a developmental Cosmos.
  • apokrisis
    7.3k
    I love this so much though.StreetlightX

    I see you jiggling with joy on the sidelines. SX and his man-crushes.
  • Streetlight
    9.1k
    I see you jiggling with joy on the sidelines. SX and his man-crushesapokrisis

    Of course! Fdrake is among the best posters on the forum, and he most definitely has a cheerleader in me, especially in his apt characterisation of your posts - a loosely bound collection of regurgitated, empty buzzwords.
  • apokrisis
    7.3k
    It would be interesting if Shannon biodiversity was related to ascendencyfdrake

    This is the bit that puzzles me. It seems that all your arguments want to circle back to this species diversity index. But that is an utter triviality. It says nothing about the flow of energy through a biological system, nothing about the negentropic response of a system being pushed away from its equilibrium state, nothing about anything which has to do with the global dynamics or the fluxes that are what create ecologies in the first place.

    It is pretty clear that I’m talking about dissipative structure. And so is ascendency. So are Salthe, Kay and Schneider, Bejan and the many other cites I offered. But your critique amounts to me failing to relate dissipative structure theory to some mundane measure of species diversity.