What are your thoughts, queries, arguments, definitions, and insights? It would be great to have a general understanding of information on this forum. — Pop
That would be how information integrates in mind. What makes information integrate, generally, is the big question. — Pop
Molecular biology is based on two great discoveries: the first is that genes carry hereditary information in the form of linear sequences of nucleotides; the second is that in protein synthesis a sequence of nucleotides is translated into a sequence of amino acids, a process that amounts to a transfer of information from genes to proteins. These discoveries have shown that the information of genes and proteins is the specific linear order of their sequences. This is a clear definition of information and there is no doubt that it reflects an experimental reality.
Otherwise, saying 'everything is information' means very little in my opinion. Something which explains everything, explains nothing, because it's too general to be meaningful. And 'everything is information' fails for that reason. — Wayfarer
At least matter or matter-energy can be defined within a range by physics. The term 'information' is polysemic, meaning it has many different definitions in different contexts. So saying that 'everything is information' is not a meaningful statement, in my view. — Wayfarer
Consciousness has an ability to grey out certain information, to focus on specific information — Pop
But the question arises, what is the faculty that is performing that? There seems to be an ordering principle at work. And I don't know if that faculty can be understood in terms of 'information' or whether it exists on another level altogether. — Wayfarer
But it's still a leap from there to the claim that 'everything is information' in a metaphysical sense, as if in itself this idea comprises a grand philosophical synthesis. It's part of the picture, but not the whole picture. — Wayfarer
Could you simplify that some? — frank
Our physical Senses are able to detect Information (meaning) only in its "entangled" or embodied physical form. But human Reason is able to detect Information in metaphysical (disembodied) form (ideas; meanings). Like Energy, Information is always on the move, transforming from one form to another. Likewise, Energy is only detectable by our senses when it is in the form of Matter. For example, Light (photons; EM field) is invisible until it is transformed into some physical substance, such as the visual purple in the eye.
that the information "always" exists entangled in a substance, and so this leads to a monistic understanding. — Pop
Do you mean processes? Or combines/mixes? Some dynamic process?
Consciousness integrates information. — Pop
This didn't help. Do you mean information can be focused?
integrated to a point — Pop
Yeah, everything is a self organizing system. And the thing being organized is information. — Pop
So Shannon's big step to a foundational model of information was to make the basic dichotomous distinction between signal and noise. Given some collection of material events - like crackles on a telephone line - what would we characterise as noise just because it was perfectly random as a pattern, and what would we characterise as signal because it was so clearly deliberate and intentional in its structure? — apokrisis
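The randomness-versus-structure distinction described above can be illustrated numerically with Shannon's entropy measure. A minimal sketch (the function name and example strings are my own, chosen purely for illustration): a highly patterned "signal" yields low per-symbol entropy, while a stream in which every symbol is equally likely approaches the maximum for its alphabet, as pure noise would.

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Per-symbol Shannon entropy in bits: H = -sum(p * log2(p))."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A strictly repeating pattern over two symbols: 1 bit per symbol.
print(shannon_entropy("ababababababab"))  # 1.0
# A uniform spread over an 8-symbol alphabet: log2(8) = 3 bits per symbol.
print(shannon_entropy("abcdefgh"))        # 3.0
```

Note that entropy alone cannot tell deliberate structure from noise; Shannon's measure quantifies statistical unpredictability, and the interpretation as "signal" or "noise" is supplied by the observer.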
So the natural sciences might seem to be confused about "information" as a physicalist notion — apokrisis
Information is merely relations between physical entities viewed from our modeling perspective, a distinctly human formal causality. — Pop
"Information" is a reifying of all the observed causal interactions between a given set of existents, and lacks independence from matter — Pop
The whole of physics could be rebuilt on a metaphysics of marks - the smallest scale of definite events or countable degrees of freedom. It was a way to both recognise the underlying quantum nature of existence, and yet also apply the constraints of an emergent classical picture where reality is formed by a material capacity to ask the critical question of a spatiotemporal location - "are you a 1 or a 0?" — apokrisis
"It from bit" symbolises the idea that every item of the physical world has at bottom — at a very deep bottom, in most instances — an immaterial source and explanation; that what we call reality arises in the last analysis from the posing of yes-no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and this is a participatory universe.
Anton Zeilinger, Director of the Institute for Quantum Optics and Quantum Information, explains: "My interpretation [of "it from bit"] is that in order to define reality, one has to take into account the role of information: mainly the fact that whatever we do in science is based on information which we receive by whatever means."
But can we go one step further? Can we say reality is information, that they are one and the same? Zeilinger thinks not: "No, we [need] both concepts. But the distinction between the two is very difficult on a rigorous basis, and maybe that tells us something." Instead, we need to think of reality and information together, with one influencing the other, both remaining consistent with each other.
Perhaps the answer will follow Einstein's great insight from one century ago when he showed that you can't make a distinction between space and time, instead they are instances of a broader concept: spacetime. In a similar way, perhaps we need a new concept that encompasses both reality and information, rather than focusing on distinguishing between them.
Enter Shannon. — frank
Claude Shannon introduced the very general concept of information entropy, used in information theory, in 1948. Initially it seems that Shannon was not particularly aware of the close similarity between his new quantity and the earlier work in thermodynamics, but the mathematician John von Neumann certainly was. "You should call it entropy, for two reasons," von Neumann told him. "In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage."
And why does every 'thing' need to irreducibly contain information? — Mark Nyquist
Right. And Claude Shannon was an electronics engineer, dealing with a specific form of information, namely, information being transmitted across media. I don't know if he was at all concerned with 'information' as an abstract concept, was he? — Wayfarer