• Mersi
    22
    I thought the aim was to put together a meaningful definition of the term "information" (if that's possible?)

    I assume that the aim should be to define the term, as far as possible, in such a way that it covers the vague meaning that the word "information" has in everyday language. We think we know what is meant by statements like: I need more information.
    Regardless of whether there is a monism of energy, matter, and information or not, it makes no sense to call everything we may be able to find out about a given object information. We may call this complexity, or perhaps potential information.
    In the same way, it makes no sense to refer to everything we already know as information. There are already terms for this: let's call it memory or knowledge.
    The term "information" has two aspects that have not been considered enough so far.
    1) The aspect of novelty: A proposition only contains information for us if we draw new conclusions from it. This separates the term "information" from the term "knowledge". But at the same time it does not prevent us from calling "information" what arises in the moment we become aware of a new idea.
    2) The aspect of comprehensibility: A proposition contains information for us only if we are able to understand it. To do so, its semantic elements (whatever these are) must match a part of what we know about the world (or, let's say, its hypothetical linguistic expression).

    I think there is no sense in trying to determine the amount of potential information contained in a given object in advance, because the amount of "information" we may draw from dealing with an object (or a proposition) depends on the way we look at it. Let's remember the saying: for the one whose only tool is a hammer, everything looks like a nail.

    PS: During the interwar period there was an attempt by members of the "Wiener Kreis" (the Vienna Circle) to quantify the semantic content of a message by means of the complexity of its syntactical structure. This attempt failed, as did their attempt to find even one of Wittgenstein's elementary propositions.
  • Gobuddygo
    28
    It's that easy. We have the physics interpretation: information = entropy = k ln N (a number).

    Then we have the (non-quantifiable) information contained in forms. There is a huge variety of them. They are contained in the physical world and in the world of the brain, interacting via our body, and they are interdependent. They depend on the initial state of the universe.
  • Mark Nyquist
    774

    I was looking for math models of information and found this push me pull you game on Wikipedia.

    Push Me Pull You game

    If you play the videos it might give you some insights into how math models could be used to simulate brain function. I'm not saying this example is how brain function works, just that this type of simulation could be useful in studying how neuron groups control information.
  • Pop
    1.5k
    It's that easy. We have the physics interpretation: information = entropy = k ln N (a number).Gobuddygo

    :grimace: Can you please explain the logic that underlies this expression?

    Then we have the (non-quantifiable) information contained in forms. There is a huge variety of them. They are contained in the physical world and in the world of the brain, interacting via our body, and they are interdependent. They depend on the initial state of the universe.Gobuddygo

    Yes, here we are talking about nonequilibrium, irreversible, dissipative systems, which virtually all natural systems are. So whilst entropy plays a role in their creation of ordered form, it is nothing like the above mathematical expression suggests.

    The interaction of systems can be reduced to the interaction of one part with another. This coincides with the reduction of logic to the relation of one part to another, which is the basis of our relational understanding. This interaction is information, and nothing exists outside of this interaction: "this interaction is everything". Everything is probabilistic outside of this interaction, both forward and backward in time. At the point of interaction, the probability is collapsed to a moment of consciousness, and "nothing" exists outside these moments.

    The two parts interacting are a brain state of integrated information, interacting with, and trying to integrate, a brain state representing an externality. Their interaction is largely deterministic, but there is a slight element of randomness, swaying the determinism so as to allow for emergence.

    Some of this randomness would be entropy playing its part. Cellular motors are about 66% efficient: they can convert 66% of energy to ordered form, so 34% is lost as heat and entropy. That seems like a lot of entropy in the mix of what creates order, but it is an open environment, so much of this entropy would be dissipated. A small amount would remain in the mix, perhaps 1% or less, causing a randomness of consciousness, such as to allow for novel form?
    What do you think?
  • DanLager
    25
    Can you please explain the logic that underlies this expression?Pop

    The k is Boltzmann's constant. The N is the number of microscopic states that leave the macroscopic state of the system unchanged. The ln is the natural logarithm.

    As you can read in my second part, this information = entropy = a number (k ln N). This is not the information contained in the forms, which is not quantifiable.
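    For readers who want to see the number itself, here is a minimal sketch of the expression being discussed, S = k ln N. The value of N below is an invented toy value purely for illustration.

```python
# A minimal sketch of the Boltzmann entropy S = k ln N discussed above.
# The microstate count N used in the example is a made-up illustrative value.
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact, 2019 SI definition)

def boltzmann_entropy(n_microstates: int) -> float:
    """Entropy S = k ln N for a system with N equally likely microstates."""
    return k_B * math.log(n_microstates)

# Toy example: a system of 10 independent two-state parts has N = 2**10 microstates.
S = boltzmann_entropy(2**10)
print(S)  # equals k_B * 10 * ln 2, on the order of 1e-22 J/K
```

    As DanLager says, the result is just a number with units of J/K; nothing about the content of any particular state appears in it.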
  • DanLager
    25
    This interaction is information, and nothing exists outside of this interactionPop

    Information is not interaction (the non-quantifiable info of the forms, that is).
  • Pop
    1.5k
    it makes no sense to call everything informationMersi

    The obvious answer is: show me something that is not information.

    It took me several months, perhaps six, to warm to the idea that "everything" is information, so I wouldn't expect you to be able to understand it straight away. I have amassed quite a lot of evidence for it, from diverse and respected opinion in science, physics, biology, philosophy, anthropology, etc., and I have weighed it against the evidence that not everything is information, which is non-existent: it is an assumption without proof, a belief! So I have weighed up the evidence and decided to follow the logic.

    The resultant definition seems to fit: information is the interaction of form. But probably I have not explained myself well enough. If everything is information, then there is only one thing information can be: the interaction of systems. Let me prove it to you by explaining how it works in the situations that you have posted. If you don't mind, I will correct them in terms of my model. My model is not quite complete, and it would take too long to explain in detail, so I will plough straight into prediction, and hopefully you will understand.

    The aspect of novelty: A proposition only contains information for us if we draw new conclusions from it. This separates the term "information" from the term "knowledge". But at the same time it does not prevent us from calling "information" what arises in the moment we become aware of a new idea.Mersi

    According to my model, information only exists at the point of interaction of two systems. Mine is a panpsychist definition, applicable in all situations. But for humanity this interaction is a neural process of distinguishing an externality against the integrated information already established (knowledge), as per constructivism.

    The aspect of comprehensibility: A proposition contains information for us only if we are able to understand it. To do so, its semantic elements (whatever these are) must match a part of what we know about the world (or, let's say, its hypothetical linguistic expression).Mersi

    Yes, there needs to exist an established body of information (knowledge) onto which new information must fit in order to understand it... Can you see how knowledge is a body of information, accumulating this way through moments of consciousness? Read my reply to Gobuddygo above, and DanLager below, for more detail.

    I think there is no sense in trying to determine the amount of potential information contained in a given object in advance, because the amount of "information" we may draw from dealing with an object (or a proposition) depends on the way we look at it.Mersi

    Agreed; the information outside of a moment of interaction, which creates consciousness, is probabilistic. The Schrödinger's cat situation illustrates this.

    During the interwar period there was an attempt by members of the "Wiener Kreis" (the Vienna Circle) to quantify the semantic content of a message by means of the complexity of its syntactical structure. This attempt failed, as did their attempt to find even one of Wittgenstein's elementary propositions.Mersi

    In my model, a body of integrated information arises from the personal experiences of an individual over a lifetime, so it is idiosyncratic. Understanding is idiosyncratic to a particular consciousness, so in order to influence any particular consciousness, one needs to tailor the information specifically to that consciousness.

    Wikipedia - "Today in the United States we have somewhere close to four or five thousand data points on every individual ... So we model the personality of every adult across the United States, some 230 million people".

    — Alexander Nix, chief executive of Cambridge Analytica, October 2016
  • DanLager
    25
    The obvious answer is: show me something that is not information.Pop

    All things in formation constitute information. Things not in formation don't.

    Interaction is not information either. It merely takes care of information.
  • Pop
    1.5k
    As you can read in my second part, this information = entropy = a number (k ln N). This is not the information contained in the forms, which is not quantifiable.DanLager

    Yes, I understand. But I find such equations frustrating, as no information can be retrieved from entropy, which is chaotic. The information retrieved from the Boltzmann situation is purely theoretical and has nothing to do with the observation of any particular state. It has to do with applying a logic to a theoretical state in thermal equilibrium.
  • Pop
    1.5k
    All things in formation constitute information. Things not in formation don't.DanLager

    That information = entropy is very misleading, imo. But I understand what you are saying, thanks.
  • DanLager
    25
    Yes, I understand. But I find such equations frustrating, as no information can be retrieved from entropy, which is chaotic.Pop

    It's just a number. True info lies in the forms. Two books can contain the same entropy but very different info, if one is a love story and the other a particle physics novel.
  • Pop
    1.5k
    What do you think of my description aside from the equation? The comment below the question. I'm not too sure about this, but something of the sort would need to occur?
  • DanLager
    25
    What do you think of my description aside from the equation? The comment below the question. I'm not too sure about this, but something of the sort would need to occur?Pop

    I think it's very abstract, but I can relate to it. Can you give a practical example? I don't see interaction as information; rather, as an intermediary between different forms (literally forms, like a circle or a square). The neural forms and the forms in the physical world interact. My body lies between these worlds.
  • DanLager
    25


    You should write an essay on this topic! I see it interests you!
  • DanLager
    25
    The proposed definition of "Information" is: the evolutionary interaction of formPop



    Not the evolution of forms?
  • Pop
    1.5k
    My body lies between these worlds.DanLager

    I take the enactivist approach, which suggests a mind (as we know it) evolved after the symbology of externalities was resolved. So we interact with the symbolized world presented to us by our senses and a primitive mind. This primitive mind resolves the world neural-network-like, so no reasoning is involved. So information is interaction at all levels, but for us it is the interaction of a sophisticated mind with the primitive mind, which presents us with a picture of the world. In this scenario the subject-object distinction occurs at the neural level.

    Yes, I should write something in more detail. I have still not quite put it together, and there are situations I am not certain about, yet! :smile:
  • Pop
    1.5k
    Not the evolution of forms?DanLager

    Forms might work better.
  • DanLager
    25
    Yes, I should write something in more detail. I have still not quite put it together, and there are situations I am not certain about, yet!Pop

    How do you envision the interaction?
  • DanLager
    25


    Do you mean with "more sophisticated" us people?
  • Pop
    1.5k
    Do you mean with "more sophisticated" us people?DanLager

    Yeah, the cerebellum would represent us, whilst a more primitive mind would have resolved the external world into a coloured-in and symbolized world, during a time when the cerebellum was less developed.

    How do you envision the interaction?DanLager

    I use a systems logic. I find that logic is equal to the informational structure found in the external world. I assume these two are equal. In any pocket of the universe that is ordered, the underlying self-organization is causing this, so logic and this order cannot be different. This gives me confidence in logic and mathematics as reliable descriptions of externalities.
  • DanLager
    25
    Yeah, the cerebellum would represent us, whilst a more primitive mind would have resolved the external world into a coloured-in and symbolized world.Pop

    If it represents us, then who are "us" (we)? Personally I think we are just our body (without the brain).

    I'm not on your side concerning logic as interaction. Logic connects forms but in a restricted way.

    My wife again: "Get your ass from behind that phone!" Sigh... women. But it's half past one already... ☺
  • Pop
    1.5k
    If it represents us, then who are "us" (we)? Personally I think we are just our body (without the brain).DanLager

    :smile: Ha, we are an evolving process of self-organization, which to my mind is equal to consciousness. It is not "I think, therefore I am", but "I am consciousness", in an ongoing and evolving process.

    I'm not on your side concerning logic as interaction. Logic connects forms but in a restricted way.DanLager

    If order is informational structure, then it would have only one way to present itself. The order within the biosphere is interrelated, interacting all the time and creating a whole; it has only one logic, imo.
  • Pop
    1.5k
    All that exists for us is moments of consciousness. This is the interaction I'm trying to get at. Information is the change experienced in these moments of consciousness. Life and knowledge are a progressive accumulation of these moments. And there is nothing outside these moments; everything is probabilistic outside of these moments, both forwards and backwards (memory) in time, in the absolute sense.

    To some extent, we collapse probability to conception in a moment of consciousness, due to the interaction of externalities, real or imagined. This change-causing process is information.
  • Pop
    1.5k
    If you play the videos it might give you some insights into how math models could be used to simulate brain function. I'm not saying this example is how brain function works, just that this type of simulation could be useful in studying how neuron groups control information.Mark Nyquist

    It looks very similar. :smile:



  • Mark Nyquist
    774

    Actually, what I linked to was a ridiculous example, and it wasn't a simulation but recorded game play. Still, something about the Push Me Pull You game reminded me of brain activity, so maybe you saw it too.
    There are better brain-simulation videos on the web, but they are really boring. They will put you to sleep.
    Your examples are more fact-based and certainly relevant to what information is.
    I was also thinking how our brains handle tens of thousands of items of information per day, but everything seems to happen on a single stage, almost one at a time. Like there is a central core to how we handle information, with a lot of peripherals filling in the details.
  • Pop
    1.5k
    I was also thinking how our brains handle tens of thousands of items of information per day, but everything seems to happen on a single stage, almost one at a time. Like there is a central core to how we handle information, with a lot of peripherals filling in the details.Mark Nyquist

    Yes, there is only ever a moment of consciousness: the stage. Life can be broken down into progressive moments of consciousness lasting 1-400 ms, as we have discussed before. Life is a progression and accumulation of these moments. Whether these moments are like the frames of a movie reel, or like something inching forward hand over hand, so to speak, like push me pull you: who knows?

    It amazes me how simple a conception it all can be reduced to.

    I have found a definition of information very similar to my own, but put to different purposes.
  • jorndoe
    3.6k
    Don't know much about it, but apparently Landauer's principle puts forth a relation between information and thermodynamics, i.e. a quantification.
    Not sure this covers the different uses of the term "information", though, or how solid the relation is.
  • Wayfarer
    22.5k
    I started a thread on that some time back. Landauer is in the ‘information science’ business, he was a senior scientist at IBM. So he could tell you in very precise terms how many bits of data the Complete Works of Plato would require, and the energy requirements of storing it or erasing it. But he wouldn’t necessarily have anything to say about its content.
  • Pop
    1.5k
    Thanks for that, I hear it mentioned a lot, but I don't understand the math. Perhaps somebody can explain?

    Landauer's principle
    It holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment". - Wiki.

    This is something that may be relevant to consciousness. In the process of information, as an interaction of one part with another, there is an element of entropy which changes the deterministic nature of the relation, such that a tiny degree of randomness arises. Perhaps this is what causes emergence?
    Could Landauer's principle explain it?

    Perhaps this is free will? :smile:
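    For anyone puzzled by the math behind the Wiki quote above: Landauer's principle puts a floor of E = k T ln 2 on the heat dissipated when one bit is erased. A minimal sketch of that arithmetic, with an illustrative room-temperature value:

```python
# A minimal sketch of Landauer's limit: erasing one bit at temperature T
# dissipates at least E = k T ln 2 of energy. The 300 K figure is an
# illustrative choice of room temperature, not from the thread.
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def landauer_limit(temperature_kelvin: float) -> float:
    """Minimum energy (joules) dissipated by erasing one bit at temperature T."""
    return k_B * temperature_kelvin * math.log(2)

print(landauer_limit(300.0))  # roughly 2.87e-21 J per erased bit
```

    The limit says nothing about the meaning of the erased bit, which is Wayfarer's point above: it prices the bookkeeping, not the content.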
  • TenderBar
    18
    I started a thread on that some time back. Landauer is in the ‘information science’ business, he was a senior scientist at IBM. So he could tell you in very precise terms how many bits of data the Complete Works of Plato would require, and the energy requirements of storing it or erasing it. But he wouldn’t necessarily have anything to say about its content.Wayfarer

    The information contained in a book about a love adventure is about the same as that in a book on quantum fields in curved spacetime, if both books contain about the same number of letters. This equality can be expressed as an equality of numbers: the entropy of book one is the same as the entropy of book two, if we look at the letters only (the entropy of the physical books not included). Both books are entirely different, though. An alien wouldn't (yet) be able to tell the books apart.
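    TenderBar's point can be made concrete with Shannon's letter-frequency entropy: two texts with similar letter statistics score nearly the same, however different their meanings. The two sample sentences below are invented for illustration.

```python
# A rough sketch of the "two books" point: similar letter statistics give
# nearly equal Shannon entropy per character, regardless of meaning.
# Both sample strings are made up for this illustration.
from collections import Counter
import math

def shannon_entropy_per_char(text: str) -> float:
    """Shannon entropy in bits per character, from single-character frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

love_story = "she held his hand and the rain kept falling on the old pier"
physics_text = "the field operator acts on the vacuum state in curved spacetime"

print(shannon_entropy_per_char(love_story))
print(shannon_entropy_per_char(physics_text))
# The two values come out close to each other, though the contents differ.
```

    Of course this only measures the statistics of the symbols, not the semantic content, which is exactly the gap the thread is circling around.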