
How were formal systems and the notion of syntactic consequence (proof) developed?

Philosophy: Asked by csp2018 on December 26, 2021

I’ve looked at several resources to learn about logic and metalogic, and they first present syntactic consequence and semantic consequence as separate things and then try to show how each implies the other in a sound and complete system.

But how did humans develop formal systems and the notion of syntactic consequence in the first place? Wouldn’t they have had to develop such systems based on what semantic consequences they agreed with?

I’m confused by the process of just arbitrarily setting up a formal system and saying ‘if a formula follows the rules of inference then there is a proof’, and then trying to prove whether it agrees with some semantic model.

I’m sure there is a good reason, but I would like to get a good intuitive sense of why, and these kinds of “soft” issues are usually glossed over in learning materials about logic that I’ve come across.

2 Answers

Not really an answer, but an attempt at giving an idea of the syntactic approach.


  • Suppose you want to prove that if n = a+a then, logically, n = 2a.

  • If you want to prove the statement is true for a small domain, say 0, 1, 2, ..., 9, you may use a semantic method. That is, you consider all the possible interpretations of the sentence:

0 + 0 = 2·0

1 + 1 = 2·1

2 + 2 = 2·2

etc.

and once you have verified that the sentence is true in all possible interpretations, you can say that the sentence is valid, which means that from "n = a + a" one can validly infer "n = 2·a" (a small sketch of such a check is shown below).
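To make the finite-domain idea concrete, here is a minimal sketch of such an exhaustive check (hypothetical Python, not part of the original answer; the function name and domain are illustrative assumptions):

    # Semantic check over a small finite domain: verify that n = a + a
    # implies n = 2*a for every interpretation of a in 0..9.
    def semantic_check(domain=range(10)):
        for a in domain:
            n = a + a              # assume the antecedent: n = a + a
            if n != 2 * a:         # test the consequent: n = 2*a
                return False, a    # a counterexample falsifies the sentence
        return True, None          # true in every interpretation checked

    print(semantic_check())  # (True, None): the sentence holds on this finite domain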

  • But if you are working with an infinite domain of numbers, the semantic method is no longer available: you cannot check an infinite set of interpretations.

So, you will resort to a syntactic method. That is, you will try to derive the consequent from the antecedent of the conditional using only manipulation of symbols according to syntactic rules.

if n = a + a

then n = 1·a + 1·a = a·(1 + 1) = a·2 = 2·a.

(using: "1 is the identity element for multiplication", the distributive law, and the commutative law for multiplication).
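The same derivation can be delegated to a computer algebra system, which manipulates the symbols rather than testing values. A minimal sketch, assuming the SymPy library is available (my own example, not from the answer):

    # Syntactic verification: SymPy rewrites both sides to a normal form;
    # no numeric interpretation of 'a' is ever consulted.
    from sympy import symbols, simplify

    a = symbols('a')
    lhs = a + a          # the antecedent side: n = a + a
    rhs = 2 * a          # the consequent side: n = 2*a
    print(simplify(lhs - rhs) == 0)  # True: the two expressions coincide symbolically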

  • This shows how useful the syntactic approach (mechanical manipulation of symbols) is. But the question arises: is this syntactic method sound? What guarantees that "a + a = 2·a" is actually true in all possible interpretations (and there are infinitely many of them)? Also, are there formulas that are true in all interpretations even though we cannot prove them using syntactic methods?

  • In propositional logic, you can check the validity of a piece of reasoning using a semantic method (namely truth tables), but when the number of atomic sentences is greater than 3, you are happy to use a syntactic method instead (for example natural deduction); a truth-table check of this kind is sketched after this list.

  • So we need formal systems (but we also need proofs that they are sound and, hopefully, complete).
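As an illustration of the truth-table method mentioned above, here is a small sketch (hypothetical code, not from the answer) that checks the validity of hypothetical syllogism, "from P → Q and Q → R, infer P → R", by enumerating all 2³ interpretations:

    from itertools import product

    def implies(x, y):
        return (not x) or y

    def valid(premises, conclusion, n_atoms):
        # Valid iff every row that makes all premises true also makes the conclusion true.
        for row in product([True, False], repeat=n_atoms):
            if all(p(*row) for p in premises) and not conclusion(*row):
                return False               # counterexample row found
        return True

    premises = [lambda p, q, r: implies(p, q),
                lambda p, q, r: implies(q, r)]
    conclusion = lambda p, q, r: implies(p, r)
    print(valid(premises, conclusion, 3))  # True: holds in all 8 interpretations

With more atomic sentences the table grows as 2^n rows, which is exactly why a syntactic method such as natural deduction becomes the more attractive route.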

Answered by Floridus Floridi on December 26, 2021

You wrote...

But how did humans develop formal systems and the notion of syntactic consequence in the first place? Wouldn't they have had to develop such systems based on what semantic consequences they agreed with?

... and you are entirely correct: As has been pointed out in the comments, the study of semantic consequence led to the notion of syntactic consequence. The opening lines of George Boole's The Laws of Thought (the title is itself suggestive) are telling:

  1. The design of the following treatise is to investigate the fundamental laws of those operations of the mind by which reasoning is performed; to give expression to them in the symbolical language of a Calculus, and upon this foundation to establish the science of Logic and construct its method; to make that method itself the basis of a general method for the application of the mathematical doctrine of Probabilities; and, finally, to collect from the various elements of truth brought to view in the course of these inquiries some probable intimations concerning the nature and constitution of the human mind.

The introduction then carries on to briefly discuss the historical development of such an investigation, starting with Aristotle.

Once a system of symbolic logic based on semantic reasoning, i.e. truth-preserving argumentation, has been developed, then that system can be studied in isolation, thus beginning the study of syntactic consequence, where the rules of logical inference become purely mechanical. Put very simply – Leibniz, Babbage and Lovelace were ahead of their time, for instance – the progress was as follows:

  1. People naturally reason with each other.
  2. It is noticed that some arguments are valid while others are not.
  3. Various attempts to analyse point 2 are made.
  4. Symbolic logic based on semantic reasoning is developed.
  5. People discover that symbolic logic, aka syntactic reasoning, is interesting in its own right. There's a lot to this stage, but a key stepping stone on the way was the discovery of non-Euclidean geometries. Peano and Pieri were important figures in the early study of syntactic reasoning for its own sake.
  6. The study of symbolic reasoning leads to modern mathematical logic and also to computability theory (Turing machines and all that).

Symbolic logic is of course crucial to modern logic and set theory, but it is interesting to note that Zermelo came up with his eponymous axioms in 1908, a decade before first-order logic was brought into its current form by Hilbert and Bernays in 1917–1918.

A final note: The development of the study of logic and reasoning is quite similar to that of grammar, which isn't so surprising, considering the connections between the two. An oversimplified account:

  1. People naturally follow unwritten grammatical rules.*
  2. People start to analyse these rules, leading to grammar as a field of study.**
  3. People discover that grammar is interesting in the abstract, eventually leading to formal grammars.***
  4. People realise that formal grammars are rather useful in practical computing (a small sketch follows the notes below).

*Native speakers speak grammatically without having to study grammar.

**It seems so obvious to us now, but coming up with grammatical categories (nouns, verbs, prepositions, etc.) was a tremendous breakthrough.

***This glosses over a lot of the actual historical motivation, e.g. the idea of a universal grammar.
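To give point 4 above a concrete face, here is a minimal, purely illustrative sketch of a formal grammar and a recognizer for it (the grammar and code are my own example, not from the answer):

    # A tiny formal grammar for balanced parentheses, written in BNF style:
    #
    #     S -> "(" S ")" S | ε
    #
    # and a recursive-descent recognizer for it.

    def parse_s(s, i=0):
        # Match nonterminal S starting at index i; return the index reached.
        if i < len(s) and s[i] == "(":
            j = parse_s(s, i + 1)          # inner S
            if j < len(s) and s[j] == ")":
                return parse_s(s, j + 1)   # trailing S
            return i                       # no ")": fall back to the empty production
        return i                           # empty production

    def recognises(s):
        return parse_s(s) == len(s)

    print(recognises("(()())"))  # True
    print(recognises("(()"))     # False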

Answered by dwolfeu on December 26, 2021
