The total count of books has become relevant, their content less so. — Isaac
Now, imagine that the book is the Bible. Think about how much conflicting information has been interpreted from the Bible. The amount of (potential) information is exacerbated by our ignorance as to the origin of the Bible, and what many of the passages mean, or what the authors intended - the information they intended to convey (the actual information) vs. our interpretation of what they intended to convey (potential information).

When I were a lad, there was none of this internet thing; we used to get all our information from books. The little local library would have a few hundred books, big university libraries thousands. Thing was, the books were nearly all different. There might be more than one copy of the most used books, but the amount of information available was counted by the number of different books. A library with a thousand copies of only one book would not be a repository of much information; in fact, it would only have one book's worth. Is this controversial? — unenlightened
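Incidentally, the point that a thousand identical copies carry only one book's worth of information is exactly what lossless compression measures. A minimal sketch, using a hypothetical repetitive string as the "book" (the text and copy count are illustrative, not from the thread):

```python
import zlib

# A stand-in "book": a short repetitive text, roughly 6 KB.
book = ("In the beginning was the word. " * 200).encode()

# A "library" holding 1000 identical copies of that one book.
library = book * 1000

one = len(zlib.compress(book, 9))
thousand = len(zlib.compress(library, 9))

# The duplicates compress away almost entirely: the library's
# compressed size grows far slower than its raw size, because
# the extra copies add essentially no new information.
print(f"raw: {len(library)} bytes, compressed: {thousand} bytes")
```

The compressed size of the full library is orders of magnitude smaller than its raw size, which is the compression-theoretic version of unenlightened's claim.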
I was talking about the written information, not the logistics of the library. — unenlightened
Quantity of information depends on the question — Isaac
Bizarre! I cannot continue this discussion, because I have no idea what you are referring to. Where is the uncertainty in the content of a text? What is the question and how does it determine the content of a text? Alas, I do not want answers to these questions, I merely lay them before you as tokens of bafflement. I do not think you can reduce my uncertainty. — unenlightened
I'd call them undeciphered texts — Wayfarer
I'm genuinely puzzled, and couldn't find the source, so, grateful for more if you have it. — bongo fury
All signs, symbols, and codes, all languages including formal mathematics are embodied as material physical structures and therefore must obey all the inexorable laws of physics. At the same time, the symbol vehicles like the bases in DNA, voltages representing bits in a computer, the text on this page, and the neuron firings in the brain do not appear to be limited by, or clearly related to, the very laws they must obey. Even the mathematical symbols that express these inexorable physical laws seem to be entirely free of these same laws. — Howard Pattee
Cipher: a secret or disguised way of writing; a code.
"he wrote cryptic notes in a cipher"
The 'book of nature' is Galileo's metaphorical description of the manner in which mathematics can be used to interpret natural regularities. — Wayfarer
The fact that they exist on a different plane to physical laws and relationships can be demonstrated by the fact that the same information can be represented in numerous media and languages. — Wayfarer
(This is why his first book, Consciousness Explained, was almost immediately dubbed Consciousness Ignored.) — Wayfarer
I have been so impressed with the notion that Information is the "fundamental constituent" of the world that I created a website to present my emerging worldview as a thesis. I called it Enformationism to distinguish it from the obsolete worldviews of spooky Spiritualism and mundane Materialism. In the light of 21st century science, those contradictory views are obsolete. Instead, the world seems to be, philosophically, a bit of both : Spiritualism (Meta-physics, Mind, Ideas) and Materialism (Physics, Matter, Atoms). I support my compatibilist view by noting that Information has been found in two real-world forms : malleable tangible Matter & creative intangible Energy, as expressed in the equation, E=MC^2, and two Meanings (polysemic) : Shannon's meaningless syntax, and Bergson's meaningful semantic “difference”. In my thesis, Energy is EnFormAction.

But what bothers me a bit, is the introduction of 'information' as a metaphysical simple - as a fundamental constituent, in the sense that atoms were once thought to be. . . . So - I'm totally open to the notion that 'information is fundamental', but it seems to me to leave an awful lot of very large, open questions, about what 'information' is or means or where it originates. — Wayfarer
It's not incoherent, but it's a metaphor, and in the context of this discussion it obfuscates the subject. — Wayfarer
The reason why the scientific culture ignores metaphysical questions is because there is no way to falsify the answers to the questions, which is a fundamental part of the scientific method.

And the way that our mainly scientific culture deals with metaphysical questions is to ignore them, bracket them out, or pretend they don't exist. Which is basically what Dennett does, and does so well that he is able to ignore the fact that he's ignoring it. (This is why his first book, Consciousness Explained, was almost immediately dubbed Consciousness Ignored.) — Wayfarer
Thanks. I agree with this phrase : "Causation can be understood as the transfer of information". That is what I call EnFormAction in my thesis.

I would like to add John D. Collier's "Information, Causation, and Computation" and "Causation is the Transfer of Information". — Harry Hindu
Explain how you think it "obfuscates the subject". — Janus
The reason why the scientific culture ignores metaphysical questions is because there is no way to falsify the answers to the questions, which is a fundamental part of the scientific method. — Harry Hindu
If you say, well, everything is information - the space between every atomic particle, the composition of every object - then you're saying nothing meaningful. Someone already said that I'm sticking to a strict definition of 'information' - this is true. To define something is to say what it isn't - de-fine, delimit, mark out. So if you simply say 'well everything is information', it doesn't say anything, because it makes the term so broad as to be meaningless. — Wayfarer
Where is the uncertainty in the content of a text? — unenlightened
Entropy is a measure of the unpredictability of the state, or equivalently, of its average information content. To get an intuitive understanding of these terms, consider the example of a political poll. Usually, such polls happen because the outcome of the poll is not already known. In other words, the outcome of the poll is relatively unpredictable, and actually performing the poll and learning the results gives some new information; these are just different ways of saying that the a priori entropy of the poll results is large. Now, consider the case that the same poll is performed a second time shortly after the first poll. Since the result of the first poll is already known, the outcome of the second poll can be predicted well and the results should not contain much new information; in this case the a priori entropy of the second poll result is small relative to that of the first. — Wikipedia article
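The poll example can be made concrete with Shannon's formula, H = −Σ p·log₂ p. A small sketch (the probability values are illustrative, not from the article):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# First poll: outcome genuinely uncertain (two candidates, 50/50).
# This is maximal unpredictability for two outcomes: 1 bit.
print(entropy([0.5, 0.5]))    # 1.0

# Second poll, run shortly after the first: the result is almost
# certainly the same, so the a priori entropy is small and
# learning the outcome conveys little new information.
print(entropy([0.99, 0.01]))  # ~0.08 bits
```

A fully certain outcome (probability 1) has entropy 0: learning it tells you nothing you did not already know, which is the sense in which the second poll "should not contain much new information".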
the context of a thread on this subject on this or some other forum. — Wayfarer
In some species of deer, if you cut a notch in the antlers, next year’s regrown antlers come complete with an ectopic branch (tine) at that same location. Where, one wonders, is the ‘notch information’ stored in the deer? Obviously not in the antlers, which drop off. In the head? How does a deer’s head know its antler has a notch half a metre away from it, and how do cells at the scalp store a map of the branching structure so as to note exactly where the notch was? Weird!
The engineering of information transfer is irrelevant to the question of the place that ‘information’ now occupies in speculative philosophy and biology. — Wayfarer