• fishfry
    3.4k
    I'm not sure why NOR is important to you
    — fishfry

    Because it's a sole sufficient operator.
    Pfhorrest

    I remember my logic professor telling us about the Sheffer stroke, which can be used to define all the other logical operations. It's equivalent to NAND. So NAND is as good as NOR for your purposes. But isn't it mostly a curiosity? We can define all the other logical relations in terms of a single one. Ok. But why make it a focus of interest?
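    (For illustration, a quick Python sketch of the "sole sufficient operator" claim, assuming nothing beyond the definitions above: NOT, OR and AND are each rebuilt from NOR alone, then checked by brute force against Python's own boolean operators.)

        def nor(a, b):
            return not (a or b)

        def not_(a):        # NOT a  ==  a NOR a
            return nor(a, a)

        def or_(a, b):      # a OR b  ==  NOT (a NOR b)
            return nor(nor(a, b), nor(a, b))

        def and_(a, b):     # a AND b  ==  (NOT a) NOR (NOT b)
            return nor(nor(a, a), nor(b, b))

        for a in (False, True):
            for b in (False, True):
                assert not_(a) == (not a)
                assert or_(a, b) == (a or b)
                assert and_(a, b) == (a and b)
        print("NOT, OR and AND recovered from NOR alone")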
  • fdrake
    6.6k
    But why make it a focus of interest? — fishfry

    I think it's a fetish for parsimony. I vaguely recall that having a single logical connective simplifies proofs and the well formed formulae definitions (you just nand or nor stuff). But I think that this is more relevant if you're ball deep proving things about formal logics than providing an overview of them.

    Would appreciate it, if you've read what I've written, to point out bullshit where I've said it.
  • Pfhorrest
    4.6k
    Yeah basically the idea of this game is to start with as little as possible and then build everything up from there. You don't have to limit yourself to using just NOR after you've built all the other connectives; it's just part of the game to show that you can start with just NOR and don't need to start with anything else as given.
  • fdrake
    6.6k


    Feel free to intervene in the bits I've written to add further detail if you like.

    with as little as possible and then build everything up from there — Pfhorrest

    Define the hand wave operator H as a function on the cartesian product of the axiom set of a formal system with its theory... The language with the hand wave operator is equivalent to the language without it when and only when, for all axioms A and all theorems B, H(A,B) is true when and only when the system has A implies B as a tautology...
  • Pfhorrest
    4.6k
    Feel free to intervene in the bits I've written to add further detail if you like. — fdrake

    Will do once I have time to read them. When I'm away from home I only have brief moments like bathroom breaks to read and write short things like this.
  • Pfhorrest
    4.6k
    I finally snuck in enough time to read your excellent posts, though I’m still on a phone so my reply will have to be short. While your posts were a fascinating read that refreshed a lot that I’d half forgotten and maybe even taught me some things I never learned properly, I’m worried that the reason you initially found this project so intimidating is because you’re putting so much focus on proof, and expecting that we’re going to rigorously prove every step of the process. All I really intended us to do was to dramatically state the conclusions of each step. In my OP, for example, I did not even try to prove that the empty set and its successors behave isomorphically to the natural numbers and so can be regarded as equivalent to them: I just stated how to generate that set of sets of sets with the successor function (and how to build that function), and then asserted that that is the set of natural numbers, with a dramatic flourish.
  • fdrake
    6.6k
    All I really intended us to do was to dramatically state the conclusions of each step — Pfhorrest

    Fair. Would you be interested in doing the more detailed version I had in mind?

    rigorously prove every step of the process — Pfhorrest

    It is likely that we imagine different things by rigorous proof. To me, what I've written is still highly informal and suggestive.
  • Pfhorrest
    4.6k
    Fair. Would you be interested in doing the more detailed version I had in mind? — fdrake

    I would be very interested in reading it, but I doubt I have the competence to contribute to it.
  • Gnomon
    3.8k
    To begin with, there is the empty set, and the empty set is without contents and void. — Pfhorrest
    I'm afraid I won't be of much help in developing the detailed aspects of your game. Russell and Whitehead began their Universe of Discourse with Set Theory as the foundation of Logic, but ended up running into the impassable boundary (incompleteness theorem) of space-time limitations.

    So, in the interest of surpassing that inherent constraint on human reasoning, I'd recommend starting with something that transcends the actual Universe, such as the Greek notion of LOGOS. Of course, our knowledge of Logos is limited to imaginary abstractions such as the concepts of Eternity and Infinity. But, at least, the game will acknowledge that the game-board has its limitations. So that all moves will be ≤ ∞. :nerd:
  • Pfhorrest
    4.6k
    Russell and Whitehead began their Universe of Discourse with Set Theory as the foundation of Logic, but ended up running into the impassable boundary (incompleteness theorem) of space-time limitations. — Gnomon

    Logicism failed, but set theory is nevertheless the foundation of contemporary mathematics. That's not identical with logicism.
  • Gnomon
    3.8k
    Logicism failed, but set theory is nevertheless the foundation of contemporary mathematics. That's not identical with logicism. — Pfhorrest
    I wasn't referring to the Logicism of Analytic philosophy, but to the Greek notion that there is some absolute Truth, equivalent to Ideal Proportions, which can be expressed mathematically in ratios. I don't know how that might apply to your forum game, except to serve as an ideal goal, approachable but never attainable (i.e. asymptotic to infinity). It's a barrier, but it also leaves a lot of room for experimentation.

    PS___Sorry to butt in with irrelevant comments. But I had been thinking about the relationship of Logic to Math. My personal pragmatic definition of philosophical Logic has been "mathematics with words". Which means that it is less pure & abstract & absolute. So, maybe the Greek Logos was actually referring to the mathematics of Proportion (e.g. positive vs negative; good vs evil), and its association with human language is secondary. I'll leave you alone now. :yikes:
  • fdrake
    6.6k
    So far we built a system of deduction - the rules of inference of propositional logic - on top of a way of writing down grammatical statements within that logic - the collection of well formed formulae of propositional logic. This was a discussion of the syntax of propositional logic. We then added a formal semantics to propositional logic; a formal semantics is a way of taking well formed formulae and assigning them a truth value. This was achieved truth functionally: a well formed formula evaluated to true (or false) depending solely on the constituent propositional symbols' truth values and how they fed into the logical connectives. Since the syntax and semantics are formally distinct, this highlights the possibility of a gap between the syntax and semantics of a formal language; and this gap is a site of interesting questions.

    To discuss these questions in a bit more detail, we need the notion of a theory. A theory is just the collection of derivable consequences from a stipulated set of rules; the collection of all theorems of the language. Notably, a theory is closed under the application of the inference rules to its constituents - this means that you can't get extra consequences by applying the inference rules to something which we call a theory.

    Accompanying the notion of a theory is the notion of a model; a model of a theory (or of a collection of stipulated well formed formulae and rules) is a collection of objects in which every element of the theory interprets as true.

    EG if we have the rule "every element of a set has to have a strictly bigger element", any finite set violates the rule and so is not a model. If we have the rule p → q, a model of that formula has p false and q false, p false and q true, or p true and q true. Models typically are not unique.

    In propositional logic, a model of a formula is just a row of a truth table under which that formula evaluates as true; assignments of truth/falsity to the proposition which make it true. A model of a collection of formulae is an assignment of truth/falsity to propositions which make them all true.

    This lets us restate some things: eg ⊨ A occurs when every model of propositional logic has A evaluate as true; that is, ⊨ A denotes a semantic tautology - something which is necessarily true.


    (Soundness) If a well formed formula can be produced through syntactic operations without stipulating any assumptions, is it the case that the semantics will always evaluate such a statement as true? Stated more briefly, is every theorem (syntax) of the system a tautology (semantics) of the system? If something is provable, is it therefore true? When this property holds of the logic, it is called sound.

    This can be written: ⊢ A implies ⊨ A.

    Propositional logic is sound.

    (Completeness) If a well formed formula always evaluates as true under every possible assignment of truth values (semantics) to its propositions, is it the case that the system can derive the formula using its rules of inference (syntax)? If something is true, is it therefore provable? When this property holds of the logic, it is called complete.

    This can be written: ⊨ A implies ⊢ A.

    Propositional logic is complete.

    (Consistency) If a collection of well formed formulae can all be true at the same time, then that collection is called consistent. If the theory of a system of inference is consistent, then that system is called consistent.

    This can be written: not (T ⊢ ⊥), where T is a theory - T does not derive a contradiction. Conversely, inconsistency says "no model exists for this theory", and "its elements can't be jointly true together/satisfied".

    Propositional logic is consistent.

    (Decidability) Does an algorithm exist that lets you check whether an arbitrary well formed formula of propositional logic belongs to the theory of propositional logic?

    Propositional logic is decidable.
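    (For illustration, a brute-force Python sketch of the truth-table method that underwrites decidability here; `classify` and its inputs are invented names, and this is the naive procedure, not an efficient one.)

        from itertools import product

        def classify(formula, symbols):
            """Evaluate `formula` under every assignment of True/False
            to `symbols` and report its semantic status."""
            results = [formula(dict(zip(symbols, values)))
                       for values in product([False, True], repeat=len(symbols))]
            if all(results):
                return "tautology"
            return "satisfiable" if any(results) else "unsatisfiable"

        implies = lambda a, b: (not a) or b

        print(classify(lambda v: implies(v["P"], v["Q"]), ["P", "Q"]))  # satisfiable
        print(classify(lambda v: v["P"] or not v["P"], ["P"]))          # tautology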

    These are all nice properties to have, but we still couldn't write an argument like:

    "Every number has a square, therefore two has a square".

    We're missing quantifiers, and just by introducing them we gain a lot of expressive power but lose decidability in almost every context of interest for mathematical reasoning. We also complicate the idea of a model; "every natural number has a square" must range over infinite collections - the truth table method of checking whether something is a tautology just won't work any more.
  • fdrake
    6.6k
    One of the major reasons we need to add quantifiers to a formal basis of mathematics is that we need to be able to deal with infinite domains of objects, and to stipulate properties of those objects. "Every natural number greater than 1 has a prime factor" could only be formulated within the language if we were able to talk about "every number" at once. If we had a system like the one for propositional logic, "Every natural number greater than 1 has a prime factor" might be rendered like:

    2 has a prime factor.
    3 has a prime factor.
    ...

    We'd need to do that infinitely many times. This would mean we would need to write down infinitely many statements to formalise extremely elementary properties of numbers. Also, within the statement "has a prime factor" there is another subtlety: a prime factor of a number is a number that is prime that divides the number. IE, "has a prime factor" means "there is some number which is prime that divides the number".

    The device which implements the "every number" part of "every number greater than 1 has a prime factor" is called the universal quantifier, written ∀ and read "for all". The device which implements the "there is some number" part of the statement is called the existential quantifier, written ∃ and read "for some" or "there exists".

    There's another element of that statement which we could not render without additional vocabulary; a number being prime, or having a prime factor, is a property of some number. Moreover, say, 12 having 3 as a prime factor is a relation between 12 and 3. Propositional calculus alone does not have the resources to talk about properties or relations, so we need to augment the language with resources for that. We'd like to be able to have a condensed notation that reads "12 has 3 as a prime factor" and "3 is prime" and so on. The latter might be written as P(3), the former might be written as F(12, 3). Generally, these are called predicates; properties, relations, functions and so on.

    This means the extra principles that need to be added to propositional logic to be able to implement these kinds of statements are:

    (1) Quantifiers,
    (2) Predicates: properties, relations, functions etc.

    In addition we would like to be able to have all of propositional calculus as valid formulae of whatever language we set up. So this gives us:

    Predicate logic = Propositional logic + quantifiers + predicates.

    These are required changes in the syntax, and they will induce changes in the semantics appropriate for them. EG: we can't just evaluate a truth table to see if "Every number has a prime factor" is true.
  • fdrake
    6.6k
    The rules of the formal language of first order logic tell us how we can build up well formed formulae out of symbols, and what symbols there are.

    So we need to specify what the building blocks (called terms) of predicate logic are:
    (T1) An alphabet of symbols; here called variables. Every variable is a term. EG, x,y
    (T2) Functions on those symbols: eg, if x and y are variables, then f(x,y) is a term. This holds for every function independent of the number of things that go into it, eg g(x,y,z)... All of these function expressions are terms. Functions without any arguments are called constants, and these are also terms.

    Reveal
    It helps to throw functions in among the terms so that function evaluations can have predicates applied to them without applying predicates to predicates. Helps in the sense of keeping the theory first order.


    Functions take terms and map them to other terms. They are like the replacement rules in TOY in the first post. They take an input and produce an output. EG the function defined by f(x) = x + 1 takes any input x and produces the term x + 1.

    These define the objects of the theory of predicate calculus; the alphabet which its well formed formulae will be written with.

    On top of these term notions we need well formed formulae notions; ways of producing well formed formulae given other well formed formulae, like with propositional logic and TOY in the first post.

    (W1) If P is a predicate of n arguments and t1, ..., tn are terms, then P(t1, ..., tn) is a well formed formula. EG if P is "is prime" and x is "2" then P(x) is a well formed formula. EG if D is "divides" and x is 2 and y is 4, D(x,y) is "2 divides 4". As before, D(y,x) is also a well formed formula, it just happens to be false.

    (W2) If s and t are terms, then s = t is a well formed formula.

    (W3) If A is a well formed formula, so is ¬A.

    (W4) If ∘ is a binary connective and A and B are well formed formulae, then A ∘ B is a well formed formula.

    (W5) If A is a well formed formula and x is a variable in it, then ∀x A and ∃x A are well formed formulae.

    These set out the basic grammar; the rules of well formed formula production in predicate logic. An example sequence of produced well formed formulae is:

    x = y
    ∃y (x = y)
    ∀x ∃y (x = y)
    (∀x ∃y (x = y)) ∨ (x = y)

    where x and y are terms. This uses W2, then W5, then W5, then W4 with ∨ for logical disjunction.

    Function terms let us write things like:

    ∀x L(x, f(x))

    for things like "every number x is less than f(x) = x + 1", with L(x, f(x)) being "x is less than f(x)".

    Note: the only things which can be quantified over to produce well formed formulae are variables in formulae, not predicates or function symbols. This is what makes the predicate logic first order; first order means quantification only over terms (term symbols), not over predicates applied to terms (predicate symbols).

    EG: "All colours are sensory properties of objects" is not a first order statement, since colours are properties. Whereas "all natural numbers have at least one number which is larger than them" is a first order statement, as the "all" works on natural numbers and the "at least" also works on natural numbers.

    It remains, again, to build a system of inference on top of these grammatical rules of the formal language.
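    (For illustration, one way among many of mirroring this grammar as data, sketched in Python; the class names are invented, but each corresponds to one of the production rules above.)

        from dataclasses import dataclass

        @dataclass
        class Var:            # (T1) a variable is a term
            name: str

        @dataclass
        class Func:           # (T2) a function applied to terms is a term
            name: str
            args: tuple       # an empty tuple makes it a constant

        @dataclass
        class Pred:           # (W1) a predicate applied to terms
            name: str
            args: tuple

        @dataclass
        class Eq:             # (W2) equality between two terms
            left: object
            right: object

        @dataclass
        class Not:            # (W3) negation of a formula
            body: object

        @dataclass
        class Binary:         # (W4) a binary connective between formulae
            op: str           # "and", "or", "implies"
            left: object
            right: object

        @dataclass
        class Quant:          # (W5) a quantifier binding a variable
            kind: str         # "forall" or "exists"
            var: str
            body: object

        # Part of the example sequence above, built bottom-up:
        wff = Quant("forall", "x", Quant("exists", "y", Eq(Var("x"), Var("y"))))
        print(wff)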
  • DonEddy
    1
    How do all of you know all this? I just happened to stumble across this.. signed up just to ask this question. Where are these words coming from when you're explaining "; the rules of well formed formula production in predicate logic. An example sequence of produced well formed formulae"?
    What kind of math is this and who did you learn it from?

    Asking for a friend lol
  • Pfhorrest
    4.6k
    I picked up what little of this I knew from extracurricular reading in introductory places like Wikipedia after getting interested in the foundations of mathematics (which is the name of this broad part of mathematics) by way of philosophy and logic (after previously having given up on formal mathematical education when I dropped out of calculus).

    Thanks for the continuing posts! Haven't found time to read them yet but I hope to soon.
  • fdrake
    6.6k


    I'm a mathematician who studied a decent amount of logic. The material I've presented here is recapitulating stuff from 2 undergraduate courses in logic. I've checked Stanford Encyclopedia of Philosophy, the Internet Encyclopedia of Philosophy and Wikipedia on what I've written to make sure what I've written isn't completely insane.

    Your friend might want to study "natural deduction", "first order logic" and "formal languages".
  • fdrake
    6.6k
    Let's stop to consider the richness of the language of first order logic, the kind of things that we can express with it.

    (1)"Every number has a prime factor"

    where | is "divides" and P is "is prime" and & is conjunction.
    (2)"There is no largest prime"

    where p and q are prime numbers and is greater than.
    (3)"Every fraction can be multiplied by some other fraction to obtain the number 1 and that fraction is unique"

    where x is a fraction, y is some other fraction, is a function term which corresponds to multiplication.
    (4) "If a is on the same landmass as b, and b is on the same landmass as c, then a is on the same landmass as c"

    (5)""
    (the distributive law of multiplication and addition).

    Reveal
    Equality, =, is assumed to just be part of the logic; an equation holds when the term on the left and the term on the right are different names for the same term, or the same term itself, like in 1 + 1 = 2 or x = x.


    One trick here is that things like + above are functions; f(x,y) = x + y: if you input x and input y, you get x + y out, so f(1,2) = 1+2 = 3. Subtraction and multiplication work similarly. All the elementary operations in mathematics can be inputted into a first order language as function symbols of their argument terms, and their output interpreted as another term!

    Within the above statements, there is an implicitly assumed domain in the background and an implicitly assumed interpretation of the function and predicate symbols.

    Regarding the domains: in the first statement, the domain is the numbers 1,2,3,4... . In the second statement, the domain is the primes 2,3,5,7,... . In the third statement, the domain is the fractions. In the fourth statement, the domain is the landmasses (of Earth). In the fifth statement, the domain is something like the naturals or fractions or reals where multiplication and addition interact.

    Regarding the function and predicate symbols: in each of these cases, the formal language requires extra symbols to augment the domain with to make sense of the statement. In the first, we add a binary predicate (relation) symbol | for "divides" and a unary predicate (property) symbol P for "is prime". In the second we add a relation symbol > for "greater than". In the third we add the multiplication symbol ·. In the fourth we add the relation symbol L for "is on the same landmass as". In the fifth we add the multiplication symbol · and the addition symbol +.

    All of these contexts are very different, but first order logic lets you express things about all of them. In that regard, we need to pay attention to precisely what we're talking about; what domain of discourse and predicates and functions make sense on it, and how those predicates and functions interact. And in terms of formalisation, set out exactly what we're talking about to begin with; have an interpretation in mind, an intended semantics.

    Reveal
    "intended semantics" is not a standard term, though by it I mean intended interpretation. As we saw with the rows of truth tables, it is quite rare that models that satisfy our stipulations are unique, and the same holds for first order theories.


    Due to the expressive power of the predicate symbols and function symbols, we can posit very different structures depending on what predicate symbols and function symbols we augment the domain with. For an example, consider the natural numbers equipped with addition; the only statements which make sense in this context are natural numbers added together and equations between sums of natural numbers. Like 5+2=3+4=7, or "if 5+2 = 3+4 and 3+4 = 7, then 5+2=7", but no statements including multiplication.

    If we then add multiplication to this "natural numbers with addition" structure, we can start talking about equations involving addition and multiplication.

    This is to say: while in propositional logic the system of inference (the theories of propositional logic) depended entirely on the logical structure of argument (the valid ways of using logical connectives between elements of the domain), the theories of first order logic depend on stipulated rules of functions and predicates as well as the rules of propositional logic plus quantifier rules.
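    (To make the role of the domain concrete, a Python sketch with invented names, checking statements (1) and (2) over a finite slice of the numbers. Over a finite domain the quantifiers reduce to loops; it is exactly this reduction that fails once the domain is infinite.)

        def is_prime(n):
            return n > 1 and all(n % d for d in range(2, n))

        domain = range(2, 51)   # a finite stand-in for "the numbers"

        # (1) Every number (here: 2..50) has a prime factor.
        print(all(any(is_prime(p) and x % p == 0 for p in domain)
                  for x in domain))                             # True

        # (2) "There is no largest prime" FAILS on the truncated domain:
        primes = [p for p in domain if is_prime(p)]
        print(all(any(q > p for q in primes) for p in primes))  # False: 47 is largest here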
  • fdrake
    6.6k
    The bare bones rules of propositional logic and first order predicate logic are quite similar; first order predicate logic just expands on or rewrites the rules of propositional logic.

    A propositional logic argument, modus ponens, detailed here goes like:

    "If P then Q, P, therefore Q"

    Much the same thing in predicate logic is:

    P(x) → Q(x), P(x), therefore Q(x)

    "If P holds of x this implies Q holds of x; P holds of x; therefore Q holds of x."

    Similar structures apply for conjunction, disjunction etc:

    EG:

    (Conjunction Elimination) P(x) ∧ Q(x) implies P(x), and P(x) ∧ Q(x) implies Q(x)
    (Disjunction) P(x) implies P(x) ∨ Q(x)

    Reveal
    The logical connectives range over well formed formulae as arguments, rather than just predicates applied to x, but the same behaviour applies - the logical connectives work for arbitrarily complicated formulae, eg:
    (∀x P(x)) ∧ (∃y Q(y)), ¬(P(x) ∨ Q(x,y)), and so on


    where ∧ and ∨ work as before (but now with the well formed formulae of first order predicate logic).

    There are also rules for quantifiers: universal instantiation and universal introduction - which define how ∀ (for all) works - and existential instantiation and existential introduction - which define how ∃ (there exists) works. These read:


    (Universal instantiation) For every well formed formula A: if ∀x A(x), then A(c) for every c in the domain. This says "if a statement holds of everything, it holds of each particular thing".
    (Universal introduction) If the well formed formula A(x) involving an arbitrary variable x holds/is stipulated, then ∀x A(x).
    This says "if a statement holds of each particular thing/an arbitrary thing, then it holds for everything".


    (Existential instantiation) If ∃x A(x), then in particular A(c) for some c in the domain. This reads "if a thing holds of something in the domain, it (at least) holds of a particular we can label as c".
    (Existential introduction) If A(c) holds/is stipulated for some constant element c of the domain, then ∃x A(x) holds, where x is a variable. This says "if something holds of a specified particular thing, it holds of something in general".

    The rules for quantifiers also include two descriptive notions which are important: free and bound variables. A variable occurrence is bound in a well formed formula if some quantifier acts on it in that well formed formula, and free otherwise. EG

    P(x) has x free.
    (∀x P(x)) ∨ Q(x) has x bound in the first part and free in the second.
    ∀x P(x) has x bound alone.

    Only well formed formulas where every variable is bound can evaluate unambiguously to true or false; otherwise they depend on the variables instantiating the claim. Well formed formulae solely involving constants need not be bound in this manner to be true or false. EG:

    Reveal
    Well, constants can't be quantified over, formally speaking anyway, only variables can be.


    "Socrates is a man" is true when speaking of philosophers since Socrates is a fixed term of the domain of philosophers and is a man.
    "x is a man" might be true or false when speaking of philosophers, because x might be Elisabeth Anscombe who is a fixed term of the domain of philosophers and is a woman. In this case x is free.
    "There exists an x such that x is a man" is true when speaking of philosophers because (at least) Socrates exists. In this case x is bound.
    "Every x is a man" is false when speaking of philosophers because (at least) Elizabeth Anscombe exists In this case x is bound.

    Expressions involving constants and variables at the same time are common, eg:

    "For all natural numbers x, if x is even and x is prime then x = 2"
    the constant 2 shows up, the variable x shows up, the predicate symbols "is even" and "is prime" show up, all variables are bound so this statement is either true or false, and it turns out to be true since 2 is the only even prime.
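    (The free/bound distinction is mechanical enough to compute. A standalone Python sketch, with formulae as nested tuples - an invented encoding - collecting the variables that occur free:)

        def free_vars(f):
            tag = f[0]
            if tag == "pred":                # ("pred", name, var, ...)
                return set(f[2:])
            if tag == "not":                 # ("not", body)
                return free_vars(f[1])
            if tag in ("and", "or", "implies"):
                return free_vars(f[1]) | free_vars(f[2])
            if tag in ("forall", "exists"):  # ("forall", var, body): var is bound
                return free_vars(f[2]) - {f[1]}
            raise ValueError(tag)

        # The three EG formulae above:
        print(free_vars(("pred", "P", "x")))                     # {'x'}: x free
        print(free_vars(("or", ("forall", "x", ("pred", "P", "x")),
                               ("pred", "Q", "x"))))             # {'x'}: free in 2nd part
        print(free_vars(("forall", "x", ("pred", "P", "x"))))    # set(): x bound alone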

    Given all this, we can weave the inference rules (this post) and the predicate/function behaviour together to start talking about formal semantics of first order logic; theories and models of stipulated function/predicate rules.

    This is, finally, when we get to start talking about axiomatic systems like Peano Arithmetic and ZF(C).
  • jgill
    3.9k
    I'm a mathematician who studied a decent amount of logic — fdrake

    I'm one who hasn't. Complex analysis here. And you? :cool:
  • fdrake
    6.6k
    And you? — jgill

    Statistics! This is all very pure for me.
  • Pfhorrest
    4.6k
    Equivalently, consistency says "no model exists for this theory", and "its elements can't be jointly true together/satisfied". — fdrake

    I think maybe you meant to write “inconsistency” there?
  • fdrake
    6.6k


    Fixed, thank you.
  • fdrake
    6.6k
    We're going to start talking about the semantics of first order predicate logic now, and relate it to the syntax. In propositional logic, the semantics was all about setting out how well formed formulae are true or false by providing an interpretation of all the logical connectives and constituent propositions; an assignment of true or false to each individual proposition which partook in the well formed formula.

    Last post we saw that the overall behaviour of a first order theory depends on the function symbols and predicate symbols and their stipulated relationships; eg how the natural numbers work with addition alone vs how the natural numbers work with multiplication alone; and in this regard, all the function symbols and predicate symbols need to have an interpretation.

    To set out the building blocks of a first order theory, a device called a signature is used. A signature has the following symbolic components:

    SYMBOLS (SIGNATURE, SYNTAX)
    (SIGNATURE 1) A collection of constant symbols.
    (SIGNATURE 2) A collection of function symbols
    (SIGNATURE 3) A collection of predicate symbols.

    Everything which goes into the syntax of a first order theory is a labelled element of a signature, except for the assumed logical connectives, quantifiers and variable symbols. We can also stipulate behaviour for the constants, functions and predicates.

    An example is:

    ({0, 1, 2, 3, ...}, +)

    which has the set of constant symbols {0, 1, 2, 3, ...} (which are intended to be natural numbers) and the function symbol + (which is intended to be addition). If we stipulate that the set is the natural numbers and + is addition - assigning the symbols their usual meaning - this becomes the usual natural numbers we're familiar with and the usual addition we're familiar with, and we can write things like 1+1=2 in the language. Such an assignment is how a first order formal language obtains a semantics!

    MEANING (ASSIGNMENT, SEMANTICS)
    (ASSIGNMENT 1) A collection of constants - called the domain.
    (ASSIGNMENT 2) A collection of functions - called functions on the domain.
    (ASSIGNMENT 3) A collection of predicates - called predicates on the domain.

    Variable symbols can also be included, but behave differently.

    An interpretation of a first order theory takes each element of the signature (SYMBOLS) and associates it uniquely (assigns it) with a part of the assignment. Constant symbols go to constants, variable symbols go to a range of constants, function symbols go to functions, predicate symbols go to predicates.

    All an interpretation of a first order theory does is associate all the parts of the signature list to a unique part of the assignment list. Note that there are variable symbols but no part of the signature has variables in it, so variables require a slightly different treatment.

    The distinguishing feature of variables is that they can be reassigned to other constants. This range of assignments is what makes them variable. Their labels don't matter very much. EG "For all natural x, x is either prime or composite" means just the same thing as "For all natural a, a is either prime or composite". For any formula, you can replace a variable label with another variable label without changing the meaning of the formula, so long as the new label ranges over exactly the same objects. EG, "for all a, a is either prime or composite" might be false if for some reason we've specified that the symbol a must be a fraction. The intuition this embodies relates to the basic behaviour of the two quantifiers:

    (1) For some x P(x) is true when and only when we can assign x to some value a and P(a) is true.
    (2) For all x P(x) is true when and only when all assignments of x have P(x) as true.

    The difference between variables and constants, then, is that variables are defined relative to their ability to be reassigned other elements of the domain and constants cannot be reassigned. The number 2 is the number 2 is the number 2, the number x is... well whatever x could be. The only important thing for using variables appropriately is to use them consistently once you've introduced them.

    Setting out how variables, constants, functions and predicate SYMBOLS associate with the ASSIGNMENT list sets out what variables, constants, functions and predicates mean in the language! And once we've set up what they all mean - once we've assigned all the (non-variable) symbols to a unique part of the assignment - we can begin looking at what is true and what is false in the language.
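    (A small Python sketch of an interpretation, with all names invented: the signature's symbols on the left, their assigned meanings on the right, the domain kept finite so everything is checkable.)

        domain = {0, 1, 2, 3, 4, 5}

        interpretation = {
            # constant symbols -> elements of the domain
            "zero": 0,
            # function symbols -> functions on the domain
            "succ": lambda x: min(x + 1, 5),   # truncated to stay inside the domain
            # predicate symbols -> predicates on the domain
            "Even": lambda x: x % 2 == 0,
        }

        # Under this assignment, "Even(succ(succ(zero)))" becomes checkable:
        value = interpretation["succ"](interpretation["succ"](interpretation["zero"]))
        print(interpretation["Even"](value))   # True: 2 is even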
  • fdrake
    6.6k
    Let's look at an example. This will show how we assign predicates meanings (what predicates formally mean) in a simple case, and how we specify a signature and prove theorems about it.

    Consider the domain:

    FOODWEB={grass, sheep, wolves}

    we are going to assign these words to their usual meanings; grass, sheep, wolves.

    And the relation "eats" as it applies to FOODWEB.

    sheep eat grass
    wolves eat sheep

    We can characterise the relation "eats" as the collection of ordered pairs:

    (sheep, grass), read "sheep eat grass"
    and
    (wolves, sheep), read "wolves eat sheep"

    (the order matters: the first element eats the second). We then collect all the pairs into a single object:

    { (sheep, grass), (wolves, sheep) }

    This is what we assign to "eats" to flesh out its meaning.

    So the assignment of "eats" is { (sheep, grass), (wolves, sheep) }.

    The signature is then {FOODWEB, eats}

    FOODWEB is assigned to {grass, sheep, wolves}
    eats is assigned to { (sheep, grass), (wolves, sheep) }

    We can prove theorems about FOODWEB now.

    Theorem: "nothing eats wolves"
    Proof: Let x be in FOODWEB, then x is grass or x is sheep or x is wolves (definition of FOODWEB)
    A (Disjunct 1) Assume x is grass, then x does not eat wolves (definition of eats)
    B (Disjunct 2) Assume x is sheep, then x does not eat wolves (definition of eats)
    C (Disjunct 3) Assume x is wolves, then x does not eat wolves (definition of eats)
    D (Disjunction Elimination) Then x does not eat wolves (disjunction elimination/proof by cases). (using lines A,B,C).
    E (UI) Since x was arbitrary, then for all x, x does not eat wolves (line D, universal introduction/universal generalisation, introduced in thread here).

    What this shows is that {FOODWEB, eats} ⊨ "nothing eats wolves".

    In this case, the entailment is semantic because it follows from the assignment (and the logic rules assumed in the background).

    We could turn the question around: "find a signature and an assignment in which nothing eats wolves is satisfied", and in that case we find that {FOODWEB, eats} with the above assignment works. This is how models work for first order logic - {FOODWEB, eats} models the claim "nothing eats wolves".

    There are other domains and assignments that satisfy this! {grass, wolves}, {sheep, wolves}, {vegetarians, sheep}... with the usual meaning of "eats". It is rare, if we start off with the theorems we'd like, that there is a unique model which satisfies them.
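    (Since the domain is finite, the whole example fits in a few lines of Python; "eats" is a set of ordered pairs, and the universally quantified theorem reduces to a loop. Illustrative only.)

        FOODWEB = {"grass", "sheep", "wolves"}
        eats = {("sheep", "grass"), ("wolves", "sheep")}

        # Theorem: for all x in FOODWEB, x does not eat wolves.
        print(all((x, "wolves") not in eats for x in FOODWEB))    # True

        # A second (hypothetical) domain and assignment modelling the same sentence:
        FOODWEB2 = {"sheep", "wolves"}
        eats2 = {("wolves", "sheep")}
        print(all((x, "wolves") not in eats2 for x in FOODWEB2))  # True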
  • alcontali
    1.3k
    Logicism failed, but set theory is nevertheless the foundation of contemporary mathematics. — Pfhorrest

    Logicism did not fail. It just hasn't achieved its goals.

    Logicism would have "failed" if someone had provided proof that the 10 axioms of ZFC set theory (or even the 9 axioms of PA number theory) cannot possibly be derived from the 14 axioms of propositional logic. I have never seen such proof.

    It would have "succeeded" if they had successfully been able to build set theory (and/or number theory) as a dependency inside logic theory. They haven't been able to do that either.

    Therefore, the status of logicism is not "failed" nor "succeeded", but "undecidable".
  • fdrake
    6.6k
    The example in the last post of "eats" is a blueprint of how functions and predicates are formally constructed and assigned on a domain.

    Properties are assigned to lists. EG: "even" = {0,2,4,6,8,...}
    Relations are assigned to lists of pairs: EG: "eats" from the last post.
    ...

    Restating something: properties and relations are predicates. Predicates are assigned a number called an arity, which tells you how many things go into them. Properties have arity 1, relations have arity 2. Predicate logic allows you to deal with predicates of arbitrary arity.

    Properties are assigned to lists of single objects (arity 1 predicate, list of single elements)
    Relations are assigned to lists of pairs (arity 2 predicate, list of paired elements)
    Predicates with arity three are assigned to lists of triples (arity 3 predicate, list of triples)
    Predicates with arity x are assigned to lists of x-tuples (arity x predicate, list of elements of length x)

    This pattern continues indefinitely.

    Functions are constructed similarly.

    The function f(x)=x+1 on the natural numbers is exhaustively determined by the list:

    1->2
    2->3
    3->4
    ...
    n->n+1
    ...

    This can be rewritten as the list of ordered pairs
    (1, 2)
    (2, 3)
    (3, 4)
    ...
    (n, n+1)
    ...

    And then aggregated into a single object as with predicates

    { (1, 2), (2, 3), (3, 4), ..., (n, n+1), ... }

    A function of two arguments: f(x,y) = x+y works like

    (1,2)->1+2->3
    (3,4)->3+4->7
    ...

    Equivalently
    (1, 2) -> 3
    (3, 4) -> 7
    ...

    Which is rewritten as:

    (1, 2, 3)
    (3, 4, 7)
    ...

    The number of arguments a function has is also called its arity. A function of arity n is associated with a list of elements of length n+1, representing the n inputs in order and then the output as the last element.

    It is convenient to have a structure which compiles all the object types: "lists of single elements", "lists of pairs", "lists of triples", ..., "lists of x-tuples" together. This is called a Cartesian product.
    Reveal
    (Most generally, a product object.)

    A list of all pairs of elements from a domain D is written as the Cartesian product D × D.

    A list of all triples of elements from a domain D is written as the Cartesian product D × D × D.

    And so on.

    Generally:

    A predicate of arity n on a domain D must be assigned to some subset (sub-list, some part) of the Cartesian product of D with itself n times.

    Functions are similar, but more constrained:

    A function of arity n on a domain D must be assigned to some subset (sub-list, some part) of the Cartesian product of D with itself n+1 times. Additionally this subset must satisfy the property that each element of the Cartesian product of D with itself n times occurs exactly once as an input (every input list must produce a unique output - a unique term of D).
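    (A Python sketch of this arity bookkeeping, with invented names: predicates and functions as finite sets of tuples, and the extra constraint on functions - exactly one output per input tuple - checked directly.)

        from itertools import product

        domain = {0, 1, 2}

        even = {(0,), (2,)}                  # arity-1 predicate "is even"
        succ = {(0, 1), (1, 2), (2, 2)}      # f(x) = x + 1, truncated at 2

        def is_function(graph, arity, domain):
            """Each arity-long input tuple must occur with exactly one output."""
            for inputs in product(domain, repeat=arity):
                outputs = [t[-1] for t in graph if t[:-1] == inputs]
                if len(outputs) != 1:
                    return False
            return True

        print(is_function(succ, 1, domain))              # True
        print(is_function({(0, 1), (0, 2)}, 1, domain))  # False: 0 has two outputs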
  • fdrake
    6.6k
    In case it is not obvious why functions should be defined as having the property "if two inputs to the function are the same, the output should be the same" and allowed as term expressions:

    (Technical reason) Without that restriction, a term could transform into a collection of terms, and a statement like ∀x (x < f(x)) would both require a lot more effort to make sense of (what would it mean for a number to be less than a collection of numbers, say?) and "for all x, x < (some set)(x)" is equivalent (I think) to quantifying over a predicate (a collection of domain items dependent on x) - it breaks the first order-ness of the theory.

    (Expressive reason) It means that you have a way of describing ways of transforming a term into another term. To reiterate: you can describe properties and relations of ways of transforming terms into other terms. These are incredibly important ideas, functions let you talk about change and transformation, in addition to relation and property. The concepts this allows the logic to express are much broader.

    Signatures also let us stipulate properties of (strictly: well formed formulae that can be true involving) their constituent elements (domain elements, functions, predicates). EG, if we have ({0, 1, 2, ...}, +, ·) (have in mind "natural numbers when you can add them and multiply them and nothing else"), we can stipulate that + and · obey:

    x · (y + z) = (x · y) + (x · z), which is the (left) distributive law.

    Stipulating the distributive law lets you derive this as a theorem:

    (a + b) · (c + d) = ((a + b) · c) + ((a + b) · d) = (a·c + b·c) + (a·d + b·d)

    The first and second equalities follow from the distributive law. This is, hopefully, the familiar FOIL (firsts, outsides, insides, lasts) or "smiley face" method of multiplying binomials together from high school algebra.

    This can be written formally as

    ({0, 1, 2, ...}, +, ·) ⊢ (a + b) · (c + d) = (a·c + b·c) + (a·d + b·d)

    when + and · are understood as obeying the distributive law.
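    (As a sanity check, illustrative only - the interesting point is that the theorem follows syntactically from the stipulated law - here is brute-force verification over a finite slice of the naturals in Python.)

        from itertools import product

        N = range(10)

        distributive = all(x * (y + z) == x * y + x * z
                           for x, y, z in product(N, repeat=3))
        foil = all((a + b) * (c + d) == a*c + b*c + a*d + b*d
                   for a, b, c, d in product(N, repeat=4))

        print(distributive, foil)   # True True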

    Usually, writing down the signature will be taken to mean the signature together with the axioms which define the properties and relations of its predicates and functions. This is a notational convenience; it does no harm. We could introduce an extra list of axioms for the operations and add them to the signature (analogous to building a system of inference over well formed formula production rules), but usually in other areas of mathematics this is omitted.

    When you leave this regime of mathematical logic it is extremely convenient to just assume that whenever we write down a signature, we implicitly give it its intended interpretation. This specifies the structure of interest for mathematical investigation/theorem proving in terms of what can be proved and what we hold to be true, but it does not block people from analysing other models - called non-standard models - of the same signature.

    Because of the waning relevance of the distinction between syntax and semantics for our purposes, I am not going to go into all the formal details of linking an arbitrary signature to an arbitrary interpretation; but I will state some general things about models to conclude, and end our discussion of them with a mathematical justification for why set theory is in some sense the "most fundamental" part of (lots of) mathematics.

    (1) Signatures do not necessarily, and usually do not, have unique models. More than one object can satisfy the stipulated properties of a signature. This applies as much for things like arithmetic as it does for my contrived examples.

    (2) Models can be different, but not be interestingly different. If we decided to write all the natural numbers upside down, we would obtain a different set that satisfies all the usual properties of the natural numbers; a formally distinct set anyway. If we started calling what we mean by "1" what we mean by "2" and vice versa, we would have a formally distinct model. But these models are entirely the same for our purposes. In this regard, models are equivalent when each truth of one obtains just when an equivalent corresponding truth obtains in the other. EG, if we relabelled the numbers 1, 2, 3 by a, b, c, we would have:

    a + b = c holds in our new labelling when and only when 1 + 2 = 3 holds in the usual labelling.

    When this property holds between models, they are called isomorphic.
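    (The relabelling point can be made concrete with a Python sketch, invented names throughout: push the structure through a bijection and the atomic truths carry over, which is all isomorphism asks for.)

        relabel = {1: "a", 2: "b", 3: "c"}

        plus = {(1, 2, 3)}   # "1 + 2 = 3" recorded as an arity-3 relation
        plus_relabelled = {tuple(relabel[n] for n in t) for t in plus}

        # a + b = c holds in the new labelling iff 1 + 2 = 3 holds in the old:
        print(("a", "b", "c") in plus_relabelled)   # True
        print((1, 2, 3) in plus)                    # True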

    (3) Models are extremely informative about the syntax of a formal system. For example, you can prove that the truth value of a statement is independent of a system of axioms (neither it nor its negation are derivable) by finding a model of that system where the statement is true and finding a model of that system where the statement is false. Alternatively, one can assume the statement along with the formal system under study and show that it entails no contradiction (IE that it is consistent), and then assume the negation of the statement along with the formal system under study and show that this too entails no contradiction. This proof method was what showed that the continuum hypothesis was independent of the axioms of set theory ZFC; a result partially proved by Gödel and then the rest by Cohen.

    Lastly, you have no doubt noticed (if you've been paying attention) that the discussion of first order logic required us to talk about collections of objects; eg for the domains, for the definition of functions and predicates, for how the quantifiers worked... They showed up pretty much everywhere. If it were somehow possible to make a bunch of axioms from a privileged few symbols and bind them together with a signature that captured how collections of objects behave, whatever modelled such a theory would be an extremely general object. It would contain things like natural numbers, groups in algebra, functions in analysis... It would contain pretty much everything... And in that regard, the theory that expressed such a structure would serve as a foundation for mathematics.

    This happened. It's called ZFC, for Zermelo Fraenkel set theory with the axiom of choice. So that's what we'll discuss (very briefly, considering the complexity of the topic and my more limited knowledge of it) next.
  • Pfhorrest
    4.6k
    Hurray, we're finally just about up to what I initially thought of as the beginning!

    Would it be fair to say that thus far, we have only discussed the language by which we would reason about any kinds of objects, but (other than for examples) we have not yet formally introduced any objects about which to reason? And that's what we're about to start doing with sets?