• hypericin
    1.6k
    "What are numbers?"
    "What is information?"

    While I cannot answer these perennial philosophical questions, I had the idea that the answers to these questions are the same thing.

    Information, at least the information we might be familiar with in computers, certainly seems number-like. Claude Shannon famously defined the fundamental unit of information to be the bit, which is the answer to a "yes" or "no" question. A bit is the simplest possible number, one that can only take the values 0 or 1. "Bit" is a contraction of "binary digit", and when concatenated together, bits can represent larger numbers. 4-digit numbers can range from 0 to 9999 (0 to 10^4 - 1), and 4-digit binary numbers can range from 0 to 15 (0 to 2^4 - 1). Because an ordinary (base-10) digit carries more information than a bit, 4 ordinary digits can represent more numbers than 4 bits.

    You might be aware that everything stored on computers is a series of bits. Every data file and every program is ultimately a long series of bits. If a bit is a binary digit, then a series of bits is a binary number, a number with potentially millions, billions or more binary digits. But there is nothing special about binary digits. Every binary number has one and only one corresponding base-10 number, just as every base-10 number has its unique corresponding binary number. In other words, they are two representations of the same thing: numbers.
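
    For instance, a minimal Python sketch (my own illustration; Python just happens to have arbitrary-precision integers, so the same idea works for numbers of any size):

    import math

    n = 0b1101              # the binary number 1101
    print(n)                # 13 -- its one and only base-10 counterpart
    print(format(n, "b"))   # "1101" -- converting back recovers the binary digits
    print(int("1101", 2))   # 13 -- parsing the binary string gives the same number

    # Each base-10 digit is worth about 3.32 bits, which is why 4 digits
    # (0 to 9999) cover more numbers than 4 bits (0 to 15).
    print(math.log2(10))    # roughly 3.32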

    Every text document, every email, every mp3 and mp3 player, every web page and web browser, every video game and every operating system is ultimately an (enormous) number. For instance, the text of this post, before this bar |, is represented by a base-10 number with approximately 3850 digits. An mp3 file would have around 8 million base-10 digits. These huge numbers are sometimes thought to be the province of hobby mathematics, with no expression in the actual universe. In fact, we interact with them every day.
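
    If you want to see this for yourself, here is a minimal Python sketch (the text and the "song.mp3" filename are placeholders of mine, and the digit counts above were rough estimates):

    # Any sequence of bytes *is* one huge number.
    text = "Every text document, every email, every web page..."  # stand-in for the post text
    data = text.encode("utf-8")             # the text as raw bytes
    number = int.from_bytes(data, "big")    # those bytes read as a single integer
    print(number)                           # an enormous base-10 number
    print(len(str(number)))                 # how many base-10 digits it has

    # The same works for an mp3 or any other file:
    # number = int.from_bytes(open("song.mp3", "rb").read(), "big")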

    Computers can be thought of as physical instantiations of a math function, one that ceaselessly takes the computer's current state (a gargantuan number) and transforms it into the next state/number.

    state(n+1) = Computer(state(n))

    In principle, nothing stops you from implementing a computer as an enormous look-up table, so that the internal logic would be something like

    if (state == 1234567890...)
        state = 1423456789...
    else if (state == 1234567891...)
        state = 1324567891...
    ...

    and so on. In practice, of course, only the simplest computers could ever be implemented this way.
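
    As a toy illustration, here is a minimal Python sketch with a made-up four-state machine (nothing like a real computer's state space, just the bare idea):

    # A toy "computer as lookup table": each whole-machine state (a number)
    # maps to the next whole-machine state. A real machine computes the next
    # state with logic gates rather than storing the table explicitly.
    transition = {0: 1, 1: 3, 3: 2, 2: 0}

    state = 0
    for _ in range(6):
        print(state)
        state = transition[state]   # state(n+1) = Computer(state(n))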

    I think ultimately the difference between information and numbers is only pragmatic. If the number is large and information-dense enough that it is impractical to deal with it without breaking it up, we call it information. If it is small and manageable, it is a number. If information and numbers are the same, this might explain why math keeps popping up in nature. If the universe is both physical and informational, then it is both physical and numerical, and it is therefore to be expected that mathematical patterns show up in the universe.

    There is more to say, and I think there are problems with this idea, but what do you guys think?
  • Philosophim
    2.6k
    Sounds pretty good. I've always taken it that numbers are the logical expression of a living being's ability to discretely experience. Experience is the totality of your senses, thoughts, feelings, etc. A discrete experience is your ability to focus on an aspect of it. A sense. A thought. A feeling.

    So you can say, that is a (one) blade of grass. Two blades are the concept of 1 and 1 together. And of course you can say, "That is a field of grass." "That is one piece of grass". And so on.

    Information is the compact storage of a discrete experience that may be a combination of many aspects, properties, feelings etc. "My dog" evokes a lot of combined information into a neat and reasonable package to think and communicate about. So yes, I like your thinking on this!
  • apokrisis
    7.3k
    I think ultimately the difference between information and numbers is only pragmatic.hypericin

    Both piggyback on the logic of counterfactuality. And that in turn leads to the deep difficulties folk have in believing in a continuum that can also be broken into discrete parts. Logic seems to fail right at the point where you seek its own origin.

    So a string of bits or the numberline exist in the happy world where we can just take this paradoxical division between the continuous and the discrete for granted. Continuums are constructible. Don't ask further questions. Get on with counting your numbers and bits.

    So yes, a 1D line can be understood as an infinite collection of 0D points. And then you can start mechanically imposing any concept of an ordering of the points on the line that takes your fancy. The syntax can become as complex and hierarchical as you like. An analog reality can be encoded in a string of digital data points. A Turing machine with an infinite paper tape can in principle represent any more complicated state. Counterfactuality is the true atom of Being.

    Yet this happy conception may work pragmatically; however, its deeper foundations remain suspect. So there is that to consider.
  • Wayfarer
    22.8k
    Both piggyback on the logic of counterfactuality.apokrisis

    Can you unpack that a bit? The meaning doesn't spring from the page, so to speak.

    There is more to say, and I think there are problems with this idea, but what do you guys think?hypericin

    A question that has long interested me, and one of the motivators for joining forums. I once had an epiphany along the lines that while phenomena are (1) composed of parts and (2) begin and end in time, that these attributes don't obtain to numbers, which are neither composed of parts nor begin or end in time (although I later realised that (1) only properly applies to prime numbers, but the point remains.) At the time I had this epiphany, the insight arose, 'so this is why ancient philosophy held arithmetic in high esteem. It was certain, immutable and apodictic.' These are attributes of a higher cognitive functionality, namely rational insight. Of course, I was to discover that this is Platonism 101, and I'm still drawn to the Platonist view of the matter. The philosophical point about it is that through rational thought we have insight into a kind of transcendental realm. As an SEP article puts it:

    Mathematical platonism has considerable philosophical significance. If the view is true, it will put great pressure on the physicalist idea that reality is exhausted by the physical. For platonism entails that reality extends far beyond the physical world and includes objects that aren’t part of the causal and spatiotemporal order studied by the physical sciences. Mathematical platonism, if true, will also put great pressure on many naturalistic theories of knowledge. For there is little doubt that we possess mathematical knowledge. The truth of mathematical platonism would therefore establish that we have knowledge of abstract (and thus causally inefficacious) objects. This would be an important discovery, which many naturalistic theories of knowledge would struggle to accommodate.SEP

    See also What is Math? Smithsonian Institute.

    As for Shannon's information theory, I think it tends to be somewhat over-interpreted. Shannon was an electronic engineer trying to solve a particular problem of reliable transmission of information. Of course it was one of the fundamental discoveries of cybernetics, and we all rely on Shannon's work for data compression and transmission every time we use these devices. But there's a lot of hype around information as a kind of fundamental ontological ground, kind of like the digital geist of the computer age.
  • apokrisis
    7.3k
    Can you unpack that a bit?Wayfarer

    It is easy to assume things are just what they are. But that depends on them being in fact not what they are not. That should be familiar to you from Deacon's notion of absentials if not from the three laws of thought.

    Counterfactuality secures the identity of things by being able to state what they are not. And that is what a bit represents. A, and not not-A. A switch that could be off and thus can in fact be on.

    Numbers are places marked on a line. The assumption is that their value is ranked from small to large. So already the symmetry is geometrically broken in a particular fashion. But that asymmetry is secured algebraically by identity operations. Adding and multiplying. Adding zero or multiplying by 1 leaves a number unchanged. Adding or multiplying by any other value then does change things.

    So the counterfactuality is a little more obscure in the case of the numberline. The question is whether a value was transformed in a way that either did break its symmetry or didn't break its symmetry. The finger pointing at a position on the line either hopped somewhere else or remained in the same place.

    Information and numbers then have to move closer to each other as we seek to employ atomistic bits – little on/off switches – as representations of algebraic structures. To run arithmetic on a machine, the optimal way is to break it all down into a binary information structure that can sit on a mechanical switching structure. Just plug into a socket and watch it run, all its little logic gates clacking away.
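
    For instance – a minimal Python sketch, my own toy example, of a half adder, the smallest piece of arithmetic built from nothing but on/off switches:

    # A half adder: the sum bit is XOR, the carry bit is AND.
    # Chain enough of these little switches together and you have arithmetic.
    def half_adder(a, b):
        return a ^ b, a & b   # (sum, carry)

    for a in (0, 1):
        for b in (0, 1):
            print(a, "+", b, "->", half_adder(a, b))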

    So counterfactuality is the way we like to think as it extremitises things to the point of a digital/mechanical clarity. The simplicity of yes or no. All shades of grey excluded. Although you can then go back over the world and pick out as many shades of grey as you like in terms of specific mixtures of black and white. You can recover the greyness of any particular shade of grey to as many decimal places as you like. Or at least to whatever seems acceptable in terms of your computational architecture. The gradual shift from 8-bit precision to 64-bit was pricey.
  • apokrisis
    7.3k
    But there's a lot of hype around information as a kind of fundamental ontological ground, kind of like the digital geist of the computer age.Wayfarer

    So part of what I was pointing out is how information theory cashes out the counterfactuality of logic in actual logic gates and thus in terms of a pragmatic entropic payback.

    Numbers seem to live far away in the abstract realm of Platonia. Shannon information was how they could be brought back down to live among us on Earth.

    Algebra with real costs and thus the possibility of real profits.
  • SophistiCat
    2.2k
    "Information" is a vexed term, as it is used differently (and often vaguely) in different contexts. A crucial thing about Shannon's theory in particular, which is often lost when it is casually mentioned, as you do here, is that it is a theory of communication, in which bits are only one part of a system that also includes, at a minimum, the encoder, the channel and the decoder. Taken in isolation, numbers or bits cannot be identified with information in any meaningful way.
  • hypericin
    1.6k
    As for Shannon's information theory, I think it tends to be somewhat over-interpreted. Shannon was an electronic engineer trying to solve a particular problem of reliable transmission of information. Of course it was one of the fundamental discoveries of cybernetics, and we all rely on Shannon's work for data compression and transmission every time we use these devices. But there's a lot of hype around information as a kind of fundamental ontological ground, kind of like the digital geist of the computer age.Wayfarer

    I guess I've bought into the hype. For me, thinking about a piece of information, say a snippet of a song, passing somehow unchanged through multiple wildly different physical media, such as sound waves, tape, CD, mp3, cable internet, wireless internet, streaming buffer, then back to sound waves as you finally hear it, led me to start conceiving of information and matter as being independent, and both as fundamental elements of the universe (maybe not unlike Aristotle's hylomorphism).

    "Information" is a vexed term, as it is used differently (and often vaguely) in different contexts. A crucial thing about Shannon's theory in particular, which is often lost when it is casually mentioned, as you do here, is that it is a theory of communication, in which bits are only one part of a system that also includes, at a minimum, the encoder, the channel and the decoder. Taken in isolation, numbers or bits cannot be identified with information in any meaningful way.SophistiCat

    I'm not sure. Suppose an archaeologist uncovers tablets on which is inscribed a lost language. What did the archaeologist discover? Seemingly, information that can no longer be decoded. Years later, the language was translated. Did the information spring into being? Or was it always there?
  • Wayfarer
    22.8k
    then back to sound waves as you finally hear it, led me to start conceiving of information and matter as being independent, and both as fundamental elements of the universe (maybe not unlike Aristotle's hylomorphism).hypericin

    That, I agree with :100: and have often argued along these lines (see this thread).
  • 180 Proof
    15.4k
    I think ultimately the difference between information and numbers is only pragmatic.hypericin
    Afaik, it's "the difference" between pattern-strings and mathematical structures, respectively, such that the latter is an instance of the former. They are formal abstractions which are physically possible to instantiate by degrees – within tractable limits – in physical things / facts and usually according to various, specified ("pragmatic") uses. I think 'Platonizing' information and/or numbers (as 'concept realists', 'hylomorphists', and 'logical idealists' do) is, at best, fallaciously reifying.
  • Tarskian
    658
    There is more to say, and I think there are problems with this idea, but what do you guys think?hypericin

    No problem for visual and auditory information. We can include written language as a specialized subset of auditory information. Practical problems abound, however, with information related to smell, touch, and taste.

    Digital scent is considered experimental and quite impractical. There may actually not even be that much demand for it outside specialized application niches:

    https://en.wikipedia.org/wiki/Digital_scent_technology

    Digital scent technology (or olfactory technology) is the engineering discipline dealing with olfactory representation. It is a technology to sense, transmit and receive scent-enabled digital media (such as motion pictures, video games, virtual reality, extended reality, web pages, and music). The sensing part of this technology works by using olfactometers and electronic noses.

    Current challenges. Current obstacles of mainstream adoption include the timing and distribution of scents, a fundamental understanding of human olfactory perception, the health dangers of synthetic scents, and other hurdles.

    Digitizing taste is experimental only:

    https://en.wikipedia.org/wiki/Gustatory_technology

    Virtual taste refers to a taste experience generated by a digital taste simulator. Electrodes are used to simulate the taste and feel of real food in the mouth. In 2012, Dr. Nimesha Ranasinghe and a team of researchers at the National University of Singapore developed the digital lollipop, an electronic device capable of transmitting four major taste sensations (salty, sour, sweet and bitter) to the tongue.

    Digitizing touch is also highly experimental research, with currently no practical applications to speak of:

    https://contextualrobotics.ucsd.edu/seminars/digitizing-touch-sense-unveiling-perceptual-essence-tactile-textures

    Imagine you could feel your pet's fur on a Zoom call, the fabric of the clothes you are considering purchasing online, or tissues in medical images. We are all familiar with the impact of digitization of audio and visual information in our daily lives - every time we take videos or pictures on our phones. Yet, there is no such equivalent for our sense of touch. This talk will encompass my scientific efforts in digitizing naturalistic tactile information for the last decade. I will explain the methodologies and interfaces we have been developing with my team and collaborators for capturing, encoding, and recreating the perceptually salient features of tactile textures for active bare-finger interactions. I will also discuss current challenges, future research paths, and potential applications in tactile digitization.
  • apokrisis
    7.3k
    'Platonizing' information and/or numbers (as 'concept realists', 'hylomorphists', and 'logical idealists' do) is, at best, fallaciously reifying.180 Proof

    Yet the holographic principle in fundamental physics says it means something that the same formalism works for information and entropy. At the Planck scale, the physical distinction between the discrete and the continuous dissolves into its own identity operation.

    There is something deeper as is now being explored. Reality is bound by finitude. Which would be a big change in thinking.
  • Wayfarer
    22.8k
    I've tried to explain recently why I think it's fallacious to say that Platonism in mathematics is a reification, meaning literally 'making into a thing'. Numbers and 'forms' in the Platonist sense can be thought of as being real insofar as they can be grasped by reason, but not because they exist in the way that objects exist.

    Forms are ideas, not in the sense of concepts or abstractions, but in that they are realities apprehended by thought rather than by sense. They are thus ‘separate’ in that they are not additional members of the world of sensible things, but are known by a different mode of awareness. — Perl, Thinking Being

    that different mode being rational insight rather than sensory perception.

    Yet the holographic principle in fundamental physics says it means something that the same formalism works for information and entropyapokrisis

    Wasn't that because Von Neumann, who was an associate of Claude Shannon, suggested to him that he adopt the term 'entropy', noticing that it was isomorphic with Boltzmann's statistical mechanics interpretation of entropy? He also said that 'as nobody really knows what it means, you will always have an advantage in debates'. (Ain't that the truth ;-) )
  • 180 Proof
    15.4k
    Reality is bound by finitude.apokrisis
    I don't grok this. :chin:

    real insofar as they can be grasped by reasonWayfarer
    Semantic quibble: ideal, not "real".
  • apokrisis
    7.3k
    Wasn't that because Von Neumann, who was an associate of Claude Shannon, suggested to him that he adopt the term 'entropy', noticing that it was isomorphic with Boltzmann's statistical mechanics interpretation of entropy?Wayfarer

    Yep. That’s where it started. With a conceptual similarity. But then an actual physical connection got made. Folk like Szilárd, Brillouin, Landauer and Bekenstein showed that the Boltzmann constant k that sets the fundamental scale for entropy production also sets a fundamental scale for information processing.

    Computing creates heat. And so there is an irreducible limit to how much information a volume of space can contain without melting it. Or in fact gravitationally curling it up into a black hole.

    In a number of such ways, information and entropy have become two faces of the same coin connected by k, which in turn reduces to c, G and h as the fundamental constants of nature. Reality has a finite grain of resolution. And information and entropy become two ways of talking about the same fact.
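
    To put rough formulas to that – this is just my shorthand for the standard Landauer and Bekenstein results mentioned above:

    E_{\text{erase one bit}} \ge k_B T \ln 2 \quad \text{(Landauer limit: minimum heat cost of erasing a bit at temperature } T\text{)}

    S \le \frac{2 \pi k_B R E}{\hbar c} \quad \text{(Bekenstein bound: maximum entropy of a region of radius } R \text{ containing energy } E\text{)}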

    I don't grok this.180 Proof

    I’m talking about holography and horizons. Volumes of spacetime can only contain finite amounts of information because information has a finite grain. Or in entropy terms, a finite number of degrees of freedom.

    So at the Heat Death, it all just stops. The information represented by the energy density of the Big Bang has reached its eternalised de Sitter state of being cooled and spread out as far as it could ever go. The Universe is a bath of blackbody radiation. But with a temperature in quantum touching distance of absolute zero. Photons with a wavelength the size of the visible universe.

    (There are a few issues of course. Like how to account for the dark energy that ensures this de Sitter state where the cosmic event horizon does freeze over at this finite maximum extent. So it is a sketch of the work in progress. Lineweaver is a good source.)
  • SophistiCat
    2.2k
    I'm not sure. Suppose an archaeologist uncovers tablets on which is inscribed a lost language. What did the archaeologist discover? Seemingly, information that can no longer be decoded. Years later, the language was translated. Did the information spring into being? Or was it always there?hypericin

    Exactly, how is it that the same marks on dry clay can carry more or less information in different contexts? And note that it's not just any marks that transmit information. Some random indentations and scratches on the same tablet would not do. How could that be if marks themselves were information?

    Also, note that in your example you used clay tablets, not numbers (and in your OP you went back and forth between numbers and computers, which, of course, are not the same thing). This shows that there isn't a necessary connection between information and numbers. Numbers or bits can serve as an abstract representation of an encoded message.
  • hypericin
    1.6k
    Exactly, how is it that the same marks on dry clay can carry more or less information in different contexts?SophistiCat

    I think they can't. They carry the same information, whether or not it happens to be decodable at the time.

    And note that it's not just any marks that transmit information. Some random indentations and scratches on the same tablet would not do. How could that be if marks themselves were information?SophistiCat

    Why does that preclude the marks themselves being information? Marks, arranged in certain ways, are information. Arranged randomly, they are not.

    and in your OP you went back and forth between numbers and computers, which, of course, are not the same thingSophistiCat

    The point was that computers, thought of as information processing devices, are just as much number processing devices.

    Numbers or bits can serve as an abstract representation of an encoded message.SophistiCat

    Why not
    Numbers or bits can serve as an abstract representation of an encoded message
  • Wayfarer
    22.8k
    information and entropy become two ways of talking about the same fact.apokrisis

    I see the connection you're drawing between entropy and information at the physical level, where both are linked by thermodynamic principles through the Boltzmann constant (after a bit of reading!) However, I wonder if equating the two risks losing sight of the fact that information derives its significance from its order. For instance, a random string of letters might technically have entropy, but it lacks the kind of structured information we get from an ordered string of words. Even so, you could transmit random strings of characters and measure the degree of variance (or entropy) at the receiving end, but regardless no information would have been either transmitted or lost. It's this order that makes information meaningful and, importantly, it’s the order that gets degraded in transmission. So I question that equivalence - might it not be a fallacy of equivocation?
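
    (To make that concrete, here is a minimal Python sketch of the per-character Shannon measure – my own toy illustration – which notably assigns the same value to an English sentence and to the same letters scrambled:)

    import math
    from collections import Counter

    # Per-character Shannon entropy: -sum of p(c) * log2(p(c)).
    # It measures the statistical spread of symbols, not their meaning.
    def entropy(s):
        counts = Counter(s)
        total = len(s)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    s = "the cat is on the mat"
    print(entropy(s))         # ordered English
    print(entropy(s[::-1]))   # the same characters reversed -- identical value, no meaning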
  • apokrisis
    7.3k
    However, I wonder if equating the two risks losing sight of the fact that information derives its significance from its order.Wayfarer

    Well if you are interested in processing signals, that requires you to have a model of noise. This was Shannon's actual job, given that he worked for a phone company with crackling long-distance wires. You need to know how often to repeat yourself when it is costing you x dollars a minute. Maybe send a telegram instead.

    If you know the worst-case scenario – no information getting through to the other end, just random crackle – then you can begin to measure the opposite of that. How much redundancy you can squeeze out of a message and yet rely on it to arrive with all its meaning intact.

    So sure. We can load up our bit strings with our precious cargo of "differences that make a difference". We can send them out across the stormy seas of noisy indifference. But if we know that some of the crockery is going to be inevitably broken or nicked, we might choose to pack a few extra cups and plates to be certain.

    Information theory just starts with a theory of noise or disorder. Then you can start designing your meaning transmission networks accordingly.

    So I question that equivalence - might it not be a fallacy of equivocation?Wayfarer

    It is like when folk talk about big bangs or dark matter. People latch onto complex scientific ideas in simplistic fashion.

    To actually have a flow of meaning – which is what you are meaning by "information" – it has to be a flow of differences that make a difference. However that in turn requires a physical channel that transmits difference in the first place. Like a vocal tract that can make noises. Or a stylus that can scratch marks on wax.

    So you start with the mechanics for making a noisy and meaningless racket. Then you can start to add the constraints that suppress the noise and thus enhance the signal. You add the structure that is the grammar or syntax that protects your precious cargo of meaning.

    A number line has its direction. You are either counting up or counting down. A bit string has some standard word size that fits the bit architecture of the hardware. As further levels of syntax get added, the computer can figure out whether you are talking about an operation or its data.

    So no equivocation. Information theory is the arrival of mechanical precision. Mathematical strength action.

    Entropy likewise brings mathematical precision to our woolly everyday notions about randomness or chaos. It gives a physical ground to statistical mechanics. Boltzmann's formula speaks to the idea of noise in terms of joules per kelvin.
  • Wayfarer
    22.8k
    I don't think you've seen the point of the objection. The word 'information' is often used in this context, but 'order' and 'meaning' are both much broader in meaning than 'information'. The reason that measures of entropification can be applied to information is just because information represents a form of order, and entropy naturally applies to order. No wonder that Von Neumann spotted an isomorphism between Shannon's methods and entropy.

    Besides, I have a suspicion that the designation of 'information' as being foundational to existence, goes back to Norbert Wiener saying 'Information is information, not matter or energy. No materialism which does not admit this can survive at the present day.' I'm sure this is what leads to the prevalence of information-as-foundation in contemporary discourse.
  • SophistiCat
    2.2k
    Information crucially depends on the sender and the receiver (and noise, if any) - this is what is being neglected here. Divining from patterns of tea leaves or decoding random marks on clay gives you no information, because no information was sent in the first place, despite there being a message. Similarly, numbers in themselves are not information, because they do not encode any message - they are just there.

    The message "The cat is on the mat. The cat is on the mat." gives you no more information than the message "The cat is on the mat." even though the former contains more bits than the latter (I am discounting noise for simplicity). The message "Your name is X" gives you no information if your name really is X and you are not suffering from amnesia. So, information depends on the receiver as well.

    Numbers can be used in mathematical modeling of communication, but numbers in themselves are no more information than they are novels or bridges or population clusters.
  • apokrisis
    7.3k
    The reason that measures of entropification can be applied to information is just because information represents a form of order, and entropy naturally applies to order.Wayfarer

    How does entropy apply naturally to order except as its negation or inverse? Just as is the relation between signal and noise in the information perspective.

    I'm sure this is what leads to the prevalence of information-as-foundation in contemporary discourse.Wayfarer

    Did you read the full passage? Do you really now want to commit to the thesis that brains are just mechanical computers? Racks of logic switches? Thought reduces to symbol processing?
  • hypericin
    1.6k
    Information crucially depends on the sender and the receiver (and noise, if any) - this is what is being neglected here. Divining from patterns of tea leaves or decoding random marks on clay gives you no information, because no information was sent in the first place, despite there being a message. Similarly, numbers in themselves are not information, because they do not encode any message - they are just there.SophistiCat

    This seems like a conflation of information and communication, where communication is the transmission of information from A to B. Tea leaves and random marks are rich with information; it would take a lot of work to accurately represent their states. But they are not communications, which are intentional acts.

    The message "The cat is on the mat. The cat is on the mat." gives you no more information than the message "The cat is on the mat." even though the former contains more bits than the latter (I am discounting noise for simplicity). The message "Your name is X" gives you no information if your name really is X and you are not suffering from amnesia. So, information depends on the receiver as well.SophistiCat

    Information is not the same as how informative the information might or might not be to a receiver. "The cat is on the mat. The cat is on the mat." may be no more informative than "The cat is on the mat", but the former still carries more information (it requires more bits to represent, but less than "The cat is on the mat. The dog is on the log.") "Your name is X" may not be informative, but it is still information.
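
    (A rough way to see that, as a minimal Python sketch of my own; off-the-shelf zlib compression stands in for "bits genuinely required", so only the ordering of the byte counts matters, not the exact numbers:)

    import zlib

    # Compressed size as a rough proxy for the bits a message really requires.
    messages = [
        b"The cat is on the mat.",
        b"The cat is on the mat. The cat is on the mat.",
        b"The cat is on the mat. The dog is on the log.",
    ]
    for msg in messages:
        print(len(zlib.compress(msg)), msg)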

    Similarly, numbers in themselves are not information, because they do not encode any message - they are just there.SophistiCat

    Just where exactly? If information is distinguished from communicative acts, and from being informative, then numbers are pure information: they are answers to a series of yes and no questions, seen clearly in their binary representations.
  • SophistiCat
    2.2k
    See, this is what I was talking about: a lot of confusion is created when the term "information" is thrown around with little care given to its meaning. In your OP you were specifically referring to Shannon's theory, and Shannon's theory is all about communication. Shannon did not set out to answer the question, "what is information?" He was interested in the problem of reliable and efficient communication, and his concept of information makes sense in that context. Other concepts of information have been put to different uses. Yours, on the other hand, seems to be a solution in search of a problem.

    If you start with the question, "what is information?" the way to go is to survey existing uses of the word. Another approach would be to do what Shannon and other researchers did, which is to start with a specific problem, something that matters, and then see whether a concept with a family resemblance to "information" fits. But starting with the answer, before you even understand the question, is backwards.
  • Athena
    3.2k
    If you start with the question, "what is information?" the way to go is to survey existing uses of the word. Another approach would be to do what Shannon and other researchers did, which is to start with a specific problem, something that matters, and then see whether a concept with a family resemblance to "information" fits. But starting with the answer, before you even understand the question, is backwards.SophistiCat

    I am reading about this matter of what information is. I am reading Rudy Rucker's book "Mind Tools".
    He writes: "Anyone at the receiving end of a communication is involved with 'information processing.' I may communicate with you by writing this book, but you need to organize the book's information in terms of mental categories that are significant for you. You must process the information.

    "Spoken and written communication are, if you stop to think about it, fully as remarkable as telepathy would be. How is it that you can know my thoughts at all, or I yours? You have a thought, you make some marks on a piece of paper, you mail the paper to me, I look at it, and by some mysterious communication algorithm, I construct in my own brain a pattern that has the same feel as your original thought. Information!"

    He goes on to explain numbers, space (and patterns that create forms), logic, and finally infinity and information. The problem is that I know so little about math that this book might as well be written in Greek. I am unprepared to understand what I am reading. There is so much I need to know before I can understand.

    What if we did not use words, but communicated with math? I know mathematicians can do that, but what if from the beginning we all did? I am sure my IQ would be much higher if I could do that. And I wonder how thinking in mathematical terms might change our emotional experience of life.
  • jgill
    3.9k
    What if we did not use words, but communicated with math? I know mathematicians can do that, but what if from the beginning we all did? I am sure my IQ would be much higher if I could do that. And I wonder how thinking in mathematical terms might change our emotional experience of life.Athena

    Interesting idea. Logicians might be able to do this, but math people use words and symbols. I have never heard of a math research paper written in math symbols only. Thinking in mathematical terms is common amongst my colleagues, but even there one talks to oneself with words.
  • hypericin
    1.6k
    I've been thinking more about this. At first I thought I was just mistaken in my OP. The set of all possible arrangements of bits is countable, so it is no wonder that we can uniquely assign a whole number to every arrangement. Just because bits are countable doesn't establish some kind of identity between bits and numbers.

    But then, the set of all quantities is countable, as is the set of points on a number line. Are these two any more inherently number than bits? Quantities have arithmetic operations defined for them that don't make sense for bits, and bits have binary operations that don't make sense for quantities. Is one set of operations more "number" than the other? Or are bit arrangements, quantities, number lines, all just countable things, and so all equally numeric?
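
    (For what it's worth, here is a minimal Python sketch – my own illustration – of one such assignment, pairing every finite bit arrangement with a unique whole number via the usual "shorter strings first" ordering:)

    # Rank every finite bit string: shorter strings first, then in binary order.
    # Prepending a "1" and subtracting 1 makes the pairing one-to-one.
    def rank(bits):
        return int("1" + bits, 2) - 1

    for s in ["", "0", "1", "00", "01", "10", "11", "000"]:
        print(repr(s), "->", rank(s))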
  • jgill
    3.9k
    the set of all quantities is countable, as is the set of points on a number line.hypericin

    Not so, my friend, if we speak of the real number line. This has been chewed on on this forum until there is little left to be said.
  • hypericin
    1.6k
    Not so, my friend, if we speak of the real number line.jgill

    True enough. Instead of points I should have said integer marks.
  • Lionino
    2.7k
    You have a thought, you make some marks on a piece of paper, you mail the paper to me, I look at it, and by some mysterious communication algorithm, I construct in my own brain a pattern that has the same feel as your original thought. Information!Athena

    Sounds like Early Wittgenstein's picture theory of language.

    What if we did not use words, but communicated with math?Athena

    How would that work, basically?