• Banno
    24.8k
    *Ever had a health problem and went to see a doctor? You list your symptoms, but the doctor doesn't believe you. He believes only those that he can see or otherwise assess by himself, with whatever resources he has available.baker

    So he should believe the symptoms you make up.

    Fine.
  • baker
    5.6k
    So he should believe the symptoms you make up.Banno
    They don't.

    When you are eventually rushed into the ER with meningoencephalitis, with fever and vomiting and so on, they ask you, "Why did you wait for so long? Why didn't you come in earlier?!"


    The point being that in the real world the subjective and the invented are often equated. We cannot just dismiss this, thinking that a fancy philosophical explanation will save our day.
  • Cuthbert
    1.1k
    There are areas of knowledge (e.g. arithmetic) where we seem to make stuff up (square roots of negative numbers) but we seem unable to make stuff up just any old how ('there are no irrationals'). The propositions are not logically necessary: they can be denied without self-contradiction. But we can't make sense of much else when we deny them. We can't prove or disprove them by looking for facts. They are somehow independent of any particular experiences. But they are not just playing with arbitrary definitions. Do we need to talk about Kant?
  • Olivier5
    6.2k
    I don't know enough about this stuff to point to examples, but Turing's general point that allowing contradictions can be dangerous is almost certainly correct, precisely because of the emergence of computers.Srap Tasmaner

    Yes. Hard to understand why anyone would dispute Turing's point. If one allows contradictions within mathematics, they will spread everywhere, in all mathematics. The idea that engineering calculations would somehow remain unaffected is like saying: the logical foundations of mathematics are purely decorative, pure aesthetics, they do not actually matter at all when doing actual mathematics. They can be self-contradictory all you like, just like a poem can.

    As usual, Witty was only trying to sound witty, and as usual he tried a bit too hard.
  • frank
    15.7k
    We don't alter basic logic to fit data. Quantum theory isn't doing that.
  • Srap Tasmaner
    4.9k
    The idea that engineering calculations would somehow remain unaffected is like saying: the logical foundations of mathematics are purely decorative, pure aesthetics, they do not actually matter at all when doing actual mathematics. They can be self-contradictory all you like, just like a poem can.Olivier5

    There is middle ground here though. Foundations of mathematics is nearly a separate field of study, and unnecessary for the doing of mathematics. You can teach high school kids (and engineers) calculus without teaching them about Dedekind cuts and making a deep dive into the nature of continuity. (And it works the other way too: you might be thoroughly conversant with the independence proofs but pretty bad at solving systems of linear equations.) As @unenlightened says, theory follows practice, and in some ways this is true of mathematics as well. It's a bit confusing here because mathematics is also a theoretical subject, and when theorizing the practice already in place, mathematicians inevitably see opportunities to fiddle with things: if we need these nine axioms to ground what we've been doing, what happens if we drop number 3 and number 8? that sort of thing. The result of that sort of thing doesn't touch existing practice but does generate new, additional mathematics.

    But it's worth remembering that engineers are not waiting to find out if the continuum hypothesis is true, and the vast majority of mathematicians aren't either. A lot of the basics of probability are well understood centuries before we get Kolmogorov's axioms.
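
    For reference, Kolmogorov's axioms amount to just three conditions on a probability measure P over a sample space Ω (a standard textbook statement, added here for context, not part of the original post):

        P(A) \ge 0 \text{ for every event } A, \qquad P(\Omega) = 1, \qquad P\Big(\bigcup_i A_i\Big) = \sum_i P(A_i) \text{ for pairwise disjoint } A_1, A_2, \ldots

    The practice these codify long predates the codification, which is the point of the remark above.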
  • Olivier5
    6.2k
    Foundations of mathematics is nearly a separate field of study, and unnecessary for the doing of mathematics.Srap Tasmaner

    Good foundations are not absolutely necessary to build a good house, but they make it easier.

    When I was in first grade, my teacher was an old woman who all her life had taught arithmetic the traditional way, i.e. training kids to learn by heart and mechanically apply certain procedures to numbers, called addition, multiplication, etc. And then two years before retirement there was a pedagogic reform, and she was asked to teach "modern mathematics". What was meant by that was mathematics based on clear axioms, derived from set theory. We kids were supposed to learn the foundations of math (developed during the 20th century) first, in first and second grade, and only then derive applications such as addition or multiplication. This would give us a stronger background and better mathematical abilities. And it worked, at least for me. I learnt the conceptual basis for set theory and numeration, i.e. how to count in base 10 but also in base 2 or in any other base, when I was 7 years old. And I remained a math prodigy for all my school years.
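
    To make the "count in any base" bit concrete, here is a minimal Python sketch of positional numeration in an arbitrary base (my own illustration, not anything from that curriculum; the helper name is mine):

        def to_base(n, base):
            """Digits of a non-negative integer n in the given base,
            most significant digit first (e.g. 13 in base 2 -> [1, 1, 0, 1])."""
            if n == 0:
                return [0]
            digits = []
            while n > 0:
                digits.append(n % base)   # remainder = next least significant digit
                n //= base
            return digits[::-1]

        print(to_base(13, 2))   # [1, 1, 0, 1] -- thirteen written in base 2
        print(to_base(13, 10))  # [1, 3]       -- the familiar base-10 reading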

    Unlike Wittgenstein, Turing knew what he was talking about. He knew that founding math on sound, non-contradictory axiomatics had been the mathematical project of the century, a project on which thousands of mathematicians worldwide had worked very, very hard. It was done out of a belief that such foundations were useful and important if mathematics were to be more than just a bag of tricks.
  • Srap Tasmaner
    4.9k


    It's certainly common these days to treat set theory as fundamental, and for kids to learn naïve set theory, and I agree that's useful. But you didn't learn ZFC in elementary school and weren't taught anything about alternative axiomatizations or independence.

    What were you taught then? A lot of the mathematics people learn in classrooms is definitions and techniques. This is what we mean when we say ..., this is how it works, this is what you do. Questions about whether those definitions, which support those techniques, are "good" just don't arise. And that continues to be true for much of mathematics.

    It's one of the curiosities of set theory that now and then people do worry about whether the axioms are "good", not just in having the usual mathematical virtues of being powerful enough to do the job but not more powerful than needed, but in the sense of "natural". The axioms are supposed to be like Euclid's old axioms, just spelling out our intuitions clearly. Of course there was a massive failure there relatively early on with the axiom of comprehension and Russell's paradox. We teach kids you can make a subset of "all the blue ones", but we don't tell them there are rough waters ahead if they think they can always do that sort of thing.
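
    For concreteness, the "rough waters" are the standard one-line argument (a textbook sketch, not part of the original post): unrestricted comprehension lets us form

        R = \{\, x : x \notin x \,\}

    and asking whether R is a member of itself gives R \in R \iff R \notin R, a contradiction either way.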

    Maybe this is what I'm trying to say: children are not actually being taught foundations, and not even really being taught set theory as they are taught other mathematics --- definitions and techniques. What they're being taught is an application of something they already know, that things can be grouped together, and they can be grouped together according to rules. In order to apply this basic intuition, it gets tidied up and even formalized a bit (though not much at this stage). But the idea is that sets are not introduced the way, say, tangents are later: here's the definition, it's just a thing, and we promise it'll turn out to be interesting. They're expected to nearly understand sets already, but not to realize just how much they can do with them.

    That last part -- what you can do with sets -- might turn out to be all of mathematics, but not in practice, not by a long shot. No one proves theorems starting from ZFC, and certainly no one does calculations that way. There's a sense in which the difference between calculus before the development of set theory and after is just a change in notation.

    I guess the question that's left is something like this: does our ability to express all of mathematics in the notation of set theory mean that set theory is the foundation of mathematics? Both answers to that are tempting, but perhaps that's because it's a bad question. There is no single thing that is set theory, in that sense; there are various competing ways of axiomatizing our intuitions, all of which are adequate to doing mathematics (and you actually need less than ZFC I believe to do most math).

    One last example: having later learned about cartesian coordinates, you can readily think of a line as a set of points defined by a linear equation, an infinite set. But that's not how you were taught what a line is; you were taught that it's "straight". When you learn that y = mx + b produces a line, that feels like a result, not a definition, because you already know what a line is, just as you already knew what sets are.
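
    As a small aside, here is a minimal Python sketch (my own illustration; the helper name is mine) of the "line as an infinite set of points" idea; since the set is infinite, code can only test membership against the defining equation:

        def on_line(m, b, point, tol=1e-9):
            """Membership test for the (infinite) set {(x, y) : y = m*x + b}."""
            x, y = point
            return abs(y - (m * x + b)) < tol

        # The set {(x, y) : y = 2x + 1} contains (0, 1) and (3, 7) but not (1, 5).
        print(on_line(2, 1, (0, 1)))  # True
        print(on_line(2, 1, (3, 7)))  # True
        print(on_line(2, 1, (1, 5)))  # False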

    On the one hand, I think I agree with Turing about contradictions mattering, but on the other hand it does seem clear to me that practice and intuition are the foundation of theory, not the other way around, and you don't really need the theory in order to practice, even when it comes to mathematics, insofar as foundations counts as the theory. Which is not to say that it can't be helpful. Maybe it's just that mathematics makes it clear there are at least two approaches to theorizing: one to justify what you're already doing, and one that is expected to feed back into practice. A whole lot of mathematicians do the latter without ever bothering about the former, starting from when, as tots, they learn about sets for the latter reason much more than the former.
  • unenlightened
    9.2k
    This seemed like a decent comment on the article.

    I think the liar paradox meets mathematics at division by zero. If "I am lying" creates a paradox, then what about (a^2-b^2)/(a-b)=a+b? That "sentence" is true except when a=b, in which case we are purporting to divide by zero, which we cannot do, because no such operation is defined in mathematics. Thus, where a=b, the purported division does not "fail" or "create a paradox." It is gibberish. — Remarkl

    "Do not attempt what cannot be done" is the civil engineer's mantra. Bridges that relied on division by zero might well fail.

    Pythagoras (or Euclid?) 'made up' an explanation for why the Egyptians used a 3,4,5 triangle as a set square. But if geometry (the clue's in the name) hadn't already been part of creation, it wouldn't have worked and he wouldn't have had anything to invent.

    I would suggest that mathematics is the study of possible worlds, and paradox is the study of impossible worlds such as those depicted by Escher. Beautifully precise drawings, and fascinating, but an engineer's joke. In this respect I am with W.; there is little danger of an engineer trying to build an Escher building, or dividing by zero.

    {Possible worlds are possible structures, arrangements, orderings and disorderings, processes, etc. Thus mathematics abstracts the structure from the substance of the world. Invented in the sense that there cannot be a structure of nothing; real in the sense that substance always has a structure.}
  • Olivier5
    6.2k
    On the one hand, I think I agree with Turing about contradictions mattering, but on the other hand it does seem clear to me that practice and intuition are the foundation of theory, not the other way around, and you don't really need the theory in order to practice, even when it comes to mathematics, insofar as foundations counts as the theory.Srap Tasmaner

    Nevertheless, allowing contradictions at the level of foundations would result in contradictions permeating the whole body of mathematics, and in the end, some calculation about some bridge may very well prove self-contradictory. So Turing was correct.
  • Srap Tasmaner
    4.9k
    Allowing contradictions in how you do calculus would cause all modern bridges to fall down. Does that matter? Is it different from the point about foundations?

    Here's another way of looking at it: we have always intended to do mathematics consistently, since long before the modern study of foundations. That's our practice. A theory of that practice is not supposed to disturb it by introducing inconsistency. But it does happen -- and mathematics is a prime example, but I think also music -- that the theory you come up with is somewhat more powerful than you need, so it supports some existing practices but also others. Now suppose some of the others it supports are not consistent with existing practices. (Not everyone wants to hear 12-tone compositions.) Are you forced to engage in these new practices because the theory authorizes them, or do you carry on doing what you were doing?
  • sime
    1.1k
    Apples and Oranges.

    Both Turing and Wittgenstein understood that sufficiently complex formal systems (e.g. Peano arithmetic) cannot be known to be consistent a priori due to the halting problem, and that inconsistency in practice must be patched as and when problems arise in application, similar to legal precedent or a sport.

    Wittgenstein asks, using the example of 20th-century applications of logic and mathematics: if logical paradoxes and the incompleteness results of higher-order logic have no practical implications, why should philosophers worry?

    Turing's point should be understood in relation to artificial intelligence and automation; whilst it is true that logical paradoxes and inconsistencies aren't relevant to manual applications of mathematical modelling in bridge design, they are potentially relevant with regards to the automation of bridge design in which artificial intelligence reasons in higher-order mathematics.

    So they should be regarded as being on the same side, considering that both had no time for Platonic superstition and that they were making different points.
  • Richard B
    438
    I thought this quote from Daniel Dennett might be useful and somewhat amusing for the current discussion at hand:

    “Happily, in those days before tape recorders, some of Wittgenstein's disciples took verbatim notes, so we can catch a rare glimpse of two great minds addressing a central problem from opposite points of view: the problem of contradiction in a formal system. For Turing, the problem is a practical one: if you design a bridge using a system that contains a contradiction, "the bridge may fall down." For Wittgenstein, the problem was about the social context in which human beings can be said to "follow the rules" of a mathematical system. What Turing saw, and Wittgenstein did not, was the importance of the fact that a computer doesn't need to understand rules to follow them. Who "won"? Turing comes off as somewhat flatfooted and naive, but he left us the computer, while Wittgenstein left us...Wittgenstein.” From 1999 Time Magazine
  • Ennui Elucidator
    494
    ↪Olivier5
    Allowing contradictions in how you do calculus would cause all modern bridges to fall down. Does that matter? Is it different from the point about foundations?
    Srap Tasmaner

    Instantaneous velocity means what, precisely?
  • Srap Tasmaner
    4.9k
    Instantaneous velocity means what, precisely?Ennui Elucidator

    Do I have to be able to answer that question to build bridges?
  • Ennui Elucidator
    494
    Do I have to be able to answer that question to build bridges?Srap Tasmaner

    Well, it is just amusing you picked calculus as the place for no contradictions. Perhaps I misread you. I was just teasingly (and ignorantly) pointing out why calculus was such a big sea change over what came prior and how refusing to allow "contradictions" probably delayed its arrival by a loooong time.


    Random article, because why not?

    Vickers, Peter (2007) Was the Early Calculus an Inconsistent Theory? [Preprint]

    As Berkeley puts it (making adjustments for the given example):

    Hitherto I have supposed that [t] flows, that [t] hath a real increment, that o is something. And I have proceeded all along on that supposition, without which I should not have been able to have made so much as one single step. From that supposition it is that I get at the increment of [5t2], that I am able to compare it with the increment of [t], and that I find the proportion between the two increments. I now beg leave to make a new supposition contrary to the first, i.e. I will suppose that there is no increment of [t], or that o is nothing; which second supposition destroys my first, and is inconsistent with it, and therefore with every thing that supposeth it. I do nevertheless beg leave to retain [10t], which is an expression obtained in virtue of my first supposition, which necessarily presupposeth such supposition, and which could not be obtained without it: All which seems a most inconsistent way of arguing... (The Analyst, §XIV)
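
    Spelled out for the example in the quote (a standard reconstruction, not Vickers' or Berkeley's own text): with y = 5t^2 and o a non-zero increment of t,

        \frac{5(t+o)^2 - 5t^2}{o} = \frac{10to + 5o^2}{o} = 10t + 5o,

    and the fluxion 10t is then obtained by setting o = 0, after having divided by o on the supposition that o \neq 0; those are the two incompatible suppositions Berkeley is complaining about.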
  • Srap Tasmaner
    4.9k
    This is a curious thing, because LW approaches philosophical problems in a way that suggests practicality -- think of the opening lines of the Blue Book, say. And people often take him to be advancing a theory that emphasizes "the practical", as I've vaguely done here, talking about practices as the ground of this and that.

    But then Dennett sees Turing as the practical one here. (And I think that's right. It reminds me of Anscombe's thing about Wittgenstein being a philosopher's philosopher, not an ordinary man's philosopher, or however she put it.)

    Does any of this really address @Banno's bumper sticker claim that "maths is made up"?

    Well, it is just amusing you picked calculus as the place for no contradictions. Perhaps I misread you.Ennui Elucidator

    A little? Maybe? Most people who use calculus every day couldn't prove the mean value theorem from scratch. (I could have done it several decades ago on demand, but no longer.) You just don't need to understand the theoretical foundations of calculus to use it consistently. I expect we all agree on that. If you want to claim that calculus has no foundation, that it is contradictory, help yourself. I'm not (any longer) competent to rebut you, but I don't recall any of my professors saying, "By the way, this doesn't make any sense."
  • Olivier5
    6.2k
    Allowing contradictions in how you do calculus would cause all modern bridges to fall down. Does that matter? Is it different from the point about foundations?Srap Tasmaner

    If you want to allow contradictions in mathematics, you need to include in your axiomatics the possibility that two mathematical statements that contradict one another can both be true. This means, inter alia, that you cannot use reductio ad absurdum proofs anymore. Nor can you limit this to certain parts of math and not others. Every theorem would be both true and false. It would be the end of math.
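
    The mechanism behind "every theorem would be both true and false" is the principle of explosion; the standard textbook derivation (added for context, not part of the original post) runs:

        1. P \land \lnot P      (assumed contradiction)
        2. P                    (from 1)
        3. P \lor Q             (from 2, disjunction introduction; Q arbitrary)
        4. \lnot P              (from 1)
        5. Q                    (from 3 and 4, disjunctive syllogism)

    So once a single contradiction is provable, every statement Q is provable too, unless the underlying logic is changed to block one of these steps.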
  • T Clark
    13.7k
    So, could the liar paradox cause a bridge to collapse?
    — Banno

    On balance, I think the answer might be yes.

    The real harm will not come in unless there is an application, in which a bridge may fall down or something of that sort [...] You cannot be confident about applying your calculus until you know that there are no hidden contradictions in it.
    — Turing

    And it's yes in part because of Turing. Nowadays engineers will to some degree rely on software to design bridges. It is a fact that software complexity has created enormous challenges, and that it is not nearly so simple to verify correctness as one might wish. (In some fields like aircraft design there are strict, explicit standards for the provable correctness of programs, and still ... 737.)
    Srap Tasmaner

    I read your post yesterday and have been thinking about it since then. I am far from an expert on computer programming or mathematics, but it seems to me the kinds of contradictions described by Russell, Gödel, and Turing don't have anything to do with the real world, computer generated or not. I guess that's what Wittgenstein was saying.

    In my school days, I took a couple of courses in computer programming and did a little programming for an engineering project. That was in Fortran, which I guess tells you how long ago it was. Even with the simple programs I worked on, it was difficult keeping track of references and connections within and between algorithms. I find it hard to imagine how they do it with the incredibly complex programs that run the world now. There is so much complexity I find it hard to believe that a little meaningless self-reference of the kinds we are talking about will gum up the cogs in the machinery.

    The more I think about it, the more I believe that the kinds of paradoxes we're talking about have no connection to anything outside our minds. It's another example of people mistaking words for reality, the map for the territory.
  • Olivier5
    6.2k
    The more I think about it, the more I believe that the kinds of paradoxes we're talking about have no connection to anything outside our minds.T Clark

    Mathematics is in our mind, and so are science and technology.
  • T Clark
    13.7k
    Mathematics is in our mind, and so are science and technology.Olivier5

    I'm not sure what you're saying in relation to my post. Are you disagreeing?
  • sime
    1.1k
    There is so much complexity I find it hard to believe that a little meaningless self-reference of the kinds we are talking about will gum up the cogs in the machinery.T Clark

    Recall that Peano arithmetic might turn out to be inconsistent, in which case an application of a deductive system based on such arithmetic might result in physically untrue predictions via explosion.

    Wittgenstein made the point in Philosophical Remarks (IIRC) that whilst such inconsistencies would lead to physically untrue predictions if applied blindly, there is no reason why the occurrence of such events would discredit uses of the system for which inconsistency plays no role. And since it is impossible to predict the existence of mathematical inconsistency before it arises (due to the second incompleteness theorem), there is no reason to fret about the possibility a priori. We only need to patch our systems as we go.

    Wittgenstein's remarks weren't targeted towards scientists or engineers, but towards philosophers who sought to establish epistemological foundations of mathematics.

    Turing on the other hand was worried, due to his interests in artificial intelligence, where such systems might be applied blindly.
  • Olivier5
    6.2k
    Why yes. What's in our mind may at some point translate into real material structures like bridges, which are designed by someone using mathematics. If you allow contradictions to spread unchecked in engineers' minds and in their math, you may well end up with poorly conceived bridges.
  • T Clark
    13.7k
    Wittgenstein made the point in Philosophical Remarks (IIRC) that whilst such inconsistencies would lead to physically untrue predictions if applied blindly, there is no reason why the occurrence of such events would discredit uses of the system for which inconsistency plays no role. And since it is impossible to predict the existence of mathematical inconsistency before it arises (due to the second incompleteness theorem), there is no reason to fret about the possibility a priori. We only need to patch our systems as we go.sime

    From my limited perspective, it seems like the kinds of inconsistencies we are talking about are trivial and, really, meaningless. As I noted previously, when I read the proof of Gödel's theorem, I couldn't understand why mathematicians and logicians thought it was important. As far as I can tell, it may tell us something about the foundation of mathematics defined in a very rigid way, but it says nothing about anything that might apply in the world.

    Please, convince me I'm wrong. I find it hard to believe that my thoughts would overturn the concerns of the greatest philosophers and mathematicians.
  • T Clark
    13.7k
    Why yes. What's in our mind may at some point translate into real material structures like bridges, which are designed by someone using mathematics. If you allow contradictions to spread unchecked in engineers' minds and in their math, you may well end up with poorly conceived bridges.Olivier5

    To me, that's like saying the sentence "This sentence is not true" may slip its leash, escape, and undermine the usefulness of the English language.
  • Olivier5
    6.2k
    That's because you take the whole question of 'can the liar's paradox break bridges' a bit too literally. The real question hidden behind this tag line is: should math allow contradictions? I.e. should we get rid of the law of non-contradiction in math, or would that lead to poorly designed bridges?
  • Amalac
    489
    Maths is made up.Banno

    I know this isn't the main point of Turing and Wittgenstein's dialogue, but:

    It seems strange to say that we made up numbers like e or π. We don't know what the 10000000000000 trillionth digit of e is, yet if we invented e, shouldn't we know that?

    How could we not know something about that which we made up? If I made up the tenets of a religion, I should know everything about those tenets I made up, right?
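
    Any particular digit can of course be computed on demand; here is a minimal Python sketch (my own illustration; the helper name is mine) that approximates e from the series \sum_k 1/k!, the point being that the digits are there to be found whether or not anyone has looked:

        from decimal import Decimal, getcontext

        def digits_of_e(places):
            """Approximate e via the series sum of 1/k!, keeping roughly `places` digits."""
            getcontext().prec = places + 10          # working precision, with guard digits
            e = Decimal(0)
            term = Decimal(1)                        # 1/0! = 1
            k = 0
            while term > Decimal(10) ** -(places + 5):
                e += term
                k += 1
                term /= k                            # next term is 1/k!
            return +e                                # unary plus rounds to working precision

        print(digits_of_e(30))   # 2.71828182845904523536028747135...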
  • Shawn
    13.2k
    In practice, things in engineering are usually overdetermined, or positively redundant.

    Epistemic closure of mathematics, or its inability to be used in practice, doesn't prohibit a computer from modelling a bridge, nor does it prohibit an engineer from taking redundancy measures to keep a boat afloat after hitting an iceberg, by compartmentalizing the ship.
  • Joshs
    5.6k
    What Turing saw, and Wittgenstein did not, was the importance of the fact that a computer doesn't need to understand rules to follow themRichard B

    And what Wittgenstein saw, and Turing and Dennett did not, was that the computer’s actions mean nothing without an interpreter.