Logic

Logic (λόγος in Greek, logos, "thought") is the most fundamental of all the Sciences and a major branch of Philosophy. Logic is the primary "proof" and method of Mathematics and all Language, leading to Arithmetic, Geometry, Set Theory, and Computation, as well as Grammar, Linguistics, Argumentation, and Philosophy of Language. Logic is not only the study of systems of reasoning - guides for how we, and intelligent systems in general, make valid inferences, and of how Statements and Numbers are put together - it is that reasoning. The earliest distinct studies of Logic in relation to Philosophy and Mathematics can be found in virtually all Ancient Civilizations, from the Egyptians and Greeks to the Indians and Chinese.

Despite such a daunting introduction, Logic can be understood simply as the study of what makes an Argument good and what makes one bad, that is, a Fallacy. Increasingly formal descriptions of the Natural Language in which Arguments are expressed are needed to tie these concepts together. In short, the fundamental nature of any example of reasoning can be described by Logic, and this is what it means for something to be "logical". Logic is a wide-ranging, fundamental Science, but it can be approached through two broad divisions: Deduction, the study of what, if anything, follows logically from a set of given premises, and Induction, the study of how we can infer a reliable generalization from some number of observed events. As the basis for all Science, Logic defines the structure of numbers, statements (or propositions), and arguments built of numbers and statements, and it devises formulas to codify how they are manipulated. Logic is very powerful, indeed.
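As an illustrative sketch (not part of the original article), deductive validity over a handful of propositions can be checked by brute force in Python: a conclusion follows from a set of premises exactly when no truth assignment makes all the premises true and the conclusion false. The `entails` helper is a name invented here for illustration.

```python
from itertools import product

def entails(premises, conclusion, variables):
    """Classical deductive validity: the conclusion holds in every
    truth assignment that makes all of the premises true."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # found a counterexample assignment
    return True

# Modus ponens: from "p implies q" and "p", infer "q".
premises = [lambda e: (not e["p"]) or e["q"], lambda e: e["p"]]
conclusion = lambda e: e["q"]
print(entails(premises, conclusion, ["p", "q"]))  # True
```

The same helper rejects a fallacy such as affirming the consequent (from "p implies q" and "q", inferring "p"), since the assignment p=False, q=True is a counterexample.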

History of Logic

While many ancient cultures employed intricate systems of reasoning and Math, Logic as an explicit analysis of the methods of reasoning received sustained development in three regions: India in the 6th century BC, China in the 5th century BC, and Greece between the 4th and 1st centuries BC. Our Modern, formal treatment of Logic descends from that early Greek tradition, and mainly from one philosopher - Aristotle. Unfortunately, ancient Indian and Chinese Logic did not survive into the Modern era. The Qin Dynasty in China repressed the tradition of scholarly investigation into Logic, following the Legalism of Han Feizi. In India, innovations in the Scholastic School, called "Nyaya", continued into the early 18th century AD, but did not survive the British Colonial Period. In the Islamic World, the rise of the Asharite School suppressed original work in Logic, despite the Arabs having transmitted the Hindu-Arabic numerals (0, 1, 2, 3, etc.) used today.

During the Medieval Period, after it was shown that Aristotle's ideas were largely compatible with Faith, a greater emphasis was placed on Aristotle's Logic. Logic became a central focus of Medieval Philosophy, which developed ideas on how to engage in critical logical analyses of philosophical arguments. Still, the influence of the three original regions on Logic wasn't lost, and Hermann Weyl (1929) said:

"Occidental mathematics has in past centuries broken away from the Greek view and followed a course which seems to have originated in India and which has been transmitted, with additions, to us by the Arabs; in it the concept of number appears as logically prior to the concepts of geometry."

Aristotelian, Syllogistic Logic

The Organon was Aristotle's body of work on Logic, with the Prior Analytics constituting the first explicit work in Formal Logic and introducing the Syllogism. Also known as "Term Logic", Syllogistic Logic expresses inferences as a major premise, a minor premise, and a conclusion.

every virtue is laudable
kindness is a virtue
therefore kindness is laudable
Aristotle's work was regarded, even in classical times, and certainly throughout the Middle Ages in Europe and the Middle East, as the very picture of a fully worked out system. The only major rival was the Stoic system of Propositional Logic studied by medieval logicians. Today, Aristotle's system is sometimes seen as little more than a historical study, as the more recent Sentential Logic and Predicate Calculus are more widely employed. Others use Aristotle in Argumentation Theory to help develop and critically question argumentation schemes used in Artificial Intelligence and even legal arguments.
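The syllogism above can be modeled as a toy sketch in Python, treating terms as sets: the major premise says the virtues are contained among the laudable things, the minor premise puts kindness among the virtues, and the conclusion follows by set inclusion. The particular set members are invented here for illustration.

```python
# Major premise: every virtue is laudable (virtues is a subset of laudable).
virtues = {"kindness", "honesty", "courage"}
laudable = virtues | {"diligence"}

assert virtues <= laudable      # major premise: all virtues are laudable
assert "kindness" in virtues    # minor premise: kindness is a virtue
assert "kindness" in laudable   # conclusion: therefore kindness is laudable
```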

Modal Logic

In languages, Modality means that subparts of a sentence can have their Semantics modified by special verbs, or modal particles. For example, "We edit the Wiki" can be modified to "We should edit the Wiki", "We can edit the Wiki", or "We will edit the Wiki". More abstractly, we might say modality affects the circumstances in which we take an assertion to be satisfied. The connection with Logic again dates back to Aristotle, who was concerned with the Alethic modalities of Necessity and Possibility, which he observed to be dualistic.

  • □p ≡ ¬◇¬p
    (necessarily p is equivalent to not possibly not p)
  • ◇p ≡ ¬□¬p
    (possibly p is equivalent to not necessarily not p)

and is related to De Morgan's Laws:
  • "It is not possible that X" is logically equivalent to "It is necessary that not X"
  • "It is not necessary that X" is logically equivalent to "It is possible that not X"
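This duality can be verified mechanically on a small, made-up Kripke frame (anticipating the frame semantics discussed below), sketched here in Python: necessity means the proposition holds in every accessible world, possibility means it holds in some accessible world. The worlds, accessibility relation, and valuation are invented for illustration.

```python
# A tiny Kripke frame: three worlds and an accessibility relation.
worlds = {"w1", "w2", "w3"}
access = {"w1": {"w2", "w3"}, "w2": {"w3"}, "w3": {"w3"}}
p = {"w2", "w3"}  # the worlds at which p holds

def necessarily(prop, w):
    """Box: prop holds at every world accessible from w."""
    return all(v in prop for v in access[w])

def possibly(prop, w):
    """Diamond: prop holds at some world accessible from w."""
    return any(v in prop for v in access[w])

# Duality check at every world: Box p <=> not Diamond not-p, and vice versa.
not_p = worlds - p
for w in worlds:
    assert necessarily(p, w) == (not possibly(not_p, w))
    assert possibly(p, w) == (not necessarily(not_p, w))
```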

While the study of necessity and possibility remained important to philosophers throughout the Middle Ages, little logical innovation happened until the landmark investigations of Clarence Irving Lewis in 1910, who formulated a family of rival axiomatisations of the alethic modalities. His work unleashed a torrent of new work on the topic, expanding the kinds of modality treated to include Deontic Logic and Epistemic Logic. The seminal work of Arthur Prior applied the same formal language to treat Temporal Logic and paved the way for the marriage of the two subjects. Saul Kripke discovered (contemporaneously with rivals) his theory of Frame Semantics, which revolutionised the formal technology available to modal logicians and gave a new, Graph Theory method of looking at modality that has driven many applications in Computational Linguistics and Computer Science, such as Dynamic Logic.

Formal, Symbolic Logic

Formal, or Symbolic Logic, is concerned primarily with the structure of reasoning. Symbolic Logic deals with the relationships between concepts and provides a way to compose proofs of statements. Concepts are rigorously defined, and sentences are translated into a precise, compact, and unambiguous symbolic notation. For example, a statement which defines p as a true mathematical formula:

p: 1 + 2 = 3
Two propositions can be combined using the operations of Logical Conjunction, Disjunction or Conditional. These are called Binary Operators, and they combine propositions into Compound Propositions. For example:

p: 1 + 1 = 2 and "logic is the study of reasoning"
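A minimal Python rendering of these binary operators, with the propositions and variable names chosen here for illustration:

```python
p = (1 + 1 == 2)              # a true mathematical proposition
q = True                      # "logic is the study of reasoning", taken as true

conjunction = p and q         # p AND q
disjunction = p or q          # p OR q
conditional = (not p) or q    # the conditional "if p then q"

print(conjunction, disjunction, conditional)  # True True True
```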
In Mathematics and Computer Science, one may want to state a proposition depending on some variables:

p: n is an odd integer
This proposition can be either true or false depending on the variable n; because of this Free Variable, p is called a Propositional Function with "Domain of Discourse" D. To form an actual statement, one uses Quantifiers, such as "for every n" (Universal) or "for some n" (Existential). Statements are specified by Quantifiers, whether Universal or Existential, and so Propositional Logic is extended into Set Theory. For example:

∀ n ∈ D P(n)
(for all n in D, P(n))
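Over a finite domain of discourse, these quantifiers can be sketched in Python with the built-ins `all` (Universal) and `any` (Existential); the domain chosen here is illustrative.

```python
# Propositional function P(n): "n is an odd integer",
# quantified over the finite domain D = {1, ..., 9}.
D = range(1, 10)
P = lambda n: n % 2 == 1

universal = all(P(n) for n in D)    # "for every n in D, P(n)"
existential = any(P(n) for n in D)  # "for some n in D, P(n)"
print(universal, existential)       # False True
```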

Mathematical Logic

Mathematical Logic is the use of Symbolic Logic to study mathematical reasoning. During the early twentieth century, philosophical logicians, starting with Gottlob Frege and Bertrand Russell, proved that Mathematics was "reducible" to Logic - in other words, that Logic was the "parent" of Math, that Logic was the more primary language. This created such a firestorm among mathematicians and logicians that even now mathematicians deny it.(1) At any rate, Logic is now accepted as an accurate and fundamental way to describe mathematical reasoning, or reasoning of any kind. Further, all Number Theory behind Mathematics requires Logic for its proof.(2)

Logicism, also pioneered by Frege and Russell, was an outcome of this Reducibility, and held that mathematical theories were logical tautologies. The road was rocky though, with Frege's Grundgesetze challenged by Russell's Paradox, and Hilbert's Program by Gödel's Incompleteness Theorems. What was required was the establishment of a new area of Mathematical Logic, the application of Mathematics to Logic (and vice versa) in the form of Proof Theory. Despite the negative effect of the incompleteness theorems, the resulting Proof and Model Theory, another new application, showed that every rigorously defined mathematical theory can be exactly captured by a first-order logical theory. Frege's Proof Calculus describes the whole of Mathematics, though it is not equivalent to it. Logic is more primary to Mathematics, not its substitute, and through these developments, Mathematical Logic showed how complementary the two areas of reasoning have been.

Proof Theory and Model Theory have been the foundation of Mathematical Logic, but not the whole story, as Set Theory originated in the study of Infinite Sets by Georg Cantor. This has been the source of Cantor's Theorem, the Axiom of Choice, and the question of the independence of the Continuum Hypothesis, as well as the Modern debate on Large Cardinal Axioms. Further, Recursion Theory captured the idea of Computation in logical and arithmetical terms. The classical achievements here were the proof of the Undecidability of the Halting Problem by Alan Turing, and his presentation of the Church-Turing Thesis. Today, Recursion Theory is also concerned with Complexity Classes, in studying when a problem is Efficiently Solvable, and with the Turing Degrees of Unsolvability.

Computation and Computability

Alan Turing's work on the Halting Problem followed from Kurt Gödel's work on the Incompleteness Theorems, and the notion of general purpose computers which came from all of this was of fundamental importance to the designers of early computer machinery in the 1940s. Throughout the 1950s and 1960s, researchers predicted that once human knowledge could be expressed using Logic and mathematical notation, it would be possible to create a reasoning machine, an Artificial Intelligence. In Logic Programming, a program consists of a set of axioms and rules. Logic Programming systems, such as Prolog, compute the consequences of axioms and rules in order to answer a query. Artificial Intelligence, though, has turned out to be a bit more difficult to achieve than expected, due to the more recently discovered complexities of human reasoning.
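The idea behind Logic Programming can be sketched in Python rather than Prolog: forward chaining repeatedly applies Horn-clause rules (a set of body facts implying a head fact) until no new consequences appear. The facts and rules below are invented for illustration; a real system like Prolog also handles variables and backward-chaining queries.

```python
# Facts and Horn-clause rules of the form (body, head): body => head.
facts = {"human(socrates)"}
rules = [
    ({"human(socrates)"}, "mortal(socrates)"),
    ({"mortal(socrates)"}, "finite(socrates)"),
]

def consequences(facts, rules):
    """Forward chaining: apply rules until a fixed point is reached."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= derived and head not in derived:
                derived.add(head)
                changed = True
    return derived

print("mortal(socrates)" in consequences(facts, rules))  # True
```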

The ACM Computing Classification System is a good example of how central Logic is to Computer Science. For example, the system specifies:
  • Section F.3 on Logics and Meanings of Programs, and F.4 on Mathematical Logic and Formal Languages, as part of the theory of Computer Science (this work covers the Formal Semantics of Programming Languages, as well as Formal Methods)
  • Boolean Logic as fundamental to computer hardware, particularly in the system's section B.2 on Arithmetic and Logic Structures
  • Many fundamental Logical Formalisms essential to section I.2 on Artificial Intelligence, for example Modal Logic and Default Logic in Knowledge Representation Formalisms and Methods, Horn Clauses in Logic Programming, and Description Logic

Put simply, Computer Science, as it emerged as a discipline, was unthinkable without Logic. Not only can proofs by humans be computer-assisted, but automated theorem provers can find and check proofs as they are generated. Computers can also represent and verify proofs too lengthy to be written out by hand.

Predicate Logic

Gottlob Frege, in his Begriffsschrift, discovered a way to rearrange many sentences to make their logical form clear, to show how sentences relate to one another in certain respects. Frege's work started contemporary Symbolic Logic. Prior to Frege, Symbolic Logic had not been successful beyond the level of Sentential Logic. It could represent the structure of sentences composed of other sentences using such words as "and", "or", and "not," but it could not break sentences down into smaller parts. Without "predicates", Symbolic Logic could not show how "cows are animals" entails "parts of cows are parts of animals."

Frege expanded Sentential Logic to include words such as "all", "some", and "none", and showed how we can introduce Variables and "Quantifiers" to rearrange sentences:
  • "all humans are mortal" becomes "all things x are such that, if x is a human then x is mortal"

∀ x (H(x)⇒ M(x))
  • "some humans are vegetarian" becomes "there exists some (at least one) thing x such that x is human and x is vegetarian"

∃ x (H(x) ∧ V(x))
Frege treated simple sentences without subject nouns as predicates, and applied them to "dummy objects" (x). The logical structure in discourse about objects can then be operated on according to the rules of Sentential Logic, with some additional details for adding and removing quantifiers. So, Frege added (1) the vocabulary of Quantifiers (∀, ∃) and Variables (x, y, etc), (2) a Semantics to explain that Variables denote individual objects and Quantifiers have the force of "all" or "some" Sets in relation to those objects, and (3) methods for using these in Language. To introduce ∀, you assume an arbitrary variable, x, prove something that must hold true of it, and then prove that it did not matter which variable you chose, for the proposition would have held true for all of them. A ∀ quantifier can be removed by applying the sentence to any particular object at all. ∃ can be added to a sentence true of any object at all, or removed, in favor of a term about which you are not already presupposing any information. In other words, we can claim properties true of all objects in our set, or show one or more exceptions.
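These quantified sentences can be checked mechanically over a small universe of objects, sketched here in Python; the universe and its members are invented for illustration.

```python
# A tiny universe of objects with the predicates used above.
universe = [
    {"name": "alice", "human": True,  "mortal": True, "vegetarian": True},
    {"name": "rex",   "human": False, "mortal": True, "vegetarian": False},
]
H = lambda x: x["human"]       # H(x): x is a human
M = lambda x: x["mortal"]      # M(x): x is mortal
V = lambda x: x["vegetarian"]  # V(x): x is vegetarian

# "all humans are mortal": for all x, H(x) implies M(x)
all_humans_mortal = all((not H(x)) or M(x) for x in universe)
# "some humans are vegetarian": there exists x with H(x) and V(x)
some_humans_vegetarian = any(H(x) and V(x) for x in universe)

print(all_humans_mortal, some_humans_vegetarian)  # True True
```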

Philosophy of Logic

Philosophical Logic (or the Philosophy of Logic) is closely related to Mathematical Logic, and is concerned with the elucidation of ideas such as Reference, Predication, Identity, Truth, Quantification, Existence, and others. Philosophical Logic, though, has a much greater concern with the connection between Natural Language and Logic. Such close connections are similarly found with Mentation, or how we reason, studied in the Philosophy of Mind, Cognitive Psychology and Neuroscience.

The bulk of "normal" reasoning in which we engage can be captured by Logic, if one can only find the right method for translating our Ordinary Language into Logic. As a result, philosophical logicians have contributed a great deal to the development of non-standard "Logics" (e.g., Free Logic, Tense Logic, and so on), as well as various extensions of Classical Logic (e.g., Modal Logic), and non-standard Semantics for such Logics (e.g., the technique of Supervaluations in the Semantics of Logic).(3)

Consistency, Completeness, Soundness

There are three valuable properties Formal Systems can have:
  • Consistency: No two theorems of the system contradict one another.
  • Soundness: The system's rules of derivation will never let you infer anything false, so long as you start with only true premises. If a system is sound (and its axioms, if any, are true), then its theorems are Truths. All of the theorems of a system which has no axioms are its Truths, sometimes called "Logical Truths". A corollary: a system that is not consistent cannot be sound.
  • Completeness: There are no true sentences in the system that cannot, at least in principle, be proved using the derivation rules (and axioms, if any) of the system.

Not all systems, whether thinking machines, natural languages, or other constructs, achieve all three virtues. It has been proven by Kurt Gödel that a system with enough axioms and/or rules of derivation to derive the principles of arithmetic cannot be both consistent and complete at the same time (see Gödel's Incompleteness Theorems).

Multi-Valued Logic(s)

The Logics discussed above are all "bivalent" or "two-valued" Logics. The Semantics for each of them will assign to every sentence either the value "True" or the value "False". Systems which do not always rely on this distinction are known as Multi-Valued Logics, or sometimes "non-Aristotelian" Logics. In the early 20th century, Jan Łukasiewicz investigated the extension of the traditional true/false values to include a third value, "possible". Logics, such as Fuzzy Logic, have since been devised with an infinite number of "degrees of truth", represented by a Real Number between Zero and One.(4)
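Fuzzy degrees of truth can be sketched with one common choice of operators (Zadeh's min/max/complement); the particular degrees below are invented for illustration.

```python
# Fuzzy truth values are reals in [0, 1]:
# AND is taken as min, OR as max, and NOT as the complement.
def f_and(a, b): return min(a, b)
def f_or(a, b):  return max(a, b)
def f_not(a):    return 1.0 - a

warm, bright = 0.7, 0.4            # degrees of truth, not just True/False
print(f_and(warm, bright))         # 0.4
print(f_or(warm, bright))          # 0.7
print(round(f_not(warm), 2))       # 0.3
```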

Notes


  1. This is partially evidenced by the Wikipedia version of this page, strongly slanted toward the pro-mathematical view.
  2. 1 + 1 = 2 is a mathematical proposition which requires logical proof, despite the fact that few would question it on face value.
  3. Philosophy of Language is closely related, and has to do with the study of how our Language engages and interacts with our thinking, or vice versa.
  4. Bayesian Probability, for example, can be interpreted as a system of Logic where Probability itself is the Subjective Truth Value.

Some content may be adapted from the Wikinfo article "Logic" and Pseudopedia article "Logic" under the GNU Free Documentation License.
[ last updated: 11:07pm EDT - Thu, Jul 16 2009 ]