Default logic is a non-monotonic logic proposed by Raymond Reiter to formalize reasoning with default assumptions.
Default logic can express facts like “by default, something is true”; by contrast, standard logic can only express that something is true or that something is false. This is a problem because reasoning often involves facts that are true in the majority of cases but not always. A classical example is: “birds typically fly”. This rule can be expressed in standard logic either by “all birds fly”, which is inconsistent with the fact that penguins do not fly, or by “all birds that are not penguins and not ostriches and ... fly”, which requires all exceptions to the rule to be specified. Default logic aims at formalizing inference rules like this one without explicitly mentioning all their exceptions.
A default theory is a pair ⟨D, W⟩. W is a set of logical formulas, called the background theory, that formalize the facts that are known for sure. D is a set of default rules, each one being of the form:

Prerequisite : Justification_1, ..., Justification_n / Conclusion
According to this default, if we believe that Prerequisite is true, and each Justification_i for i = 1, ..., n is consistent with our current beliefs, we are led to believe that Conclusion is true.
The logical formulae in W and all formulae in a default were originally assumed to be first-order logic formulae, but they can potentially be formulae in an arbitrary formal logic. The case in which they are formulae in propositional logic is one of the most studied.
The default rule “birds typically fly” is formalized by the following default:

Bird(X) : Flies(X) / Flies(X)
This rule means that "if X is a bird, and it can be assumed that it flies, then we can conclude that it flies". A background theory containing some facts about birds is the following one:

W = { Bird(Condor), Bird(Penguin), ¬Flies(Penguin), Flies(Bee) }
According to this default rule, a condor flies because the precondition Bird(Condor) is true and the justification Flies(Condor) is not inconsistent with what is currently known. By contrast, Bird(Penguin) does not allow concluding Flies(Penguin): even though the precondition of the default, Bird(Penguin), is true, the justification Flies(Penguin) is inconsistent with what is known. From this background theory and this default, Bird(Bee) cannot be concluded, because the default rule only allows deriving Flies(X) from Bird(X), not vice versa. Deriving the antecedents of an inference rule from its consequences is a form of explanation of the consequences, and is the aim of abductive reasoning.
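To make the applicability test concrete, the following is a minimal Python sketch under toy assumptions: formulas are ground literals represented as strings, "entailed by the theory" is plain membership, and "consistent with the theory" means the complementary literal is absent. The helper names neg and applicable are illustrative, not part of any standard library.

def neg(literal):
    # "Flies(Condor)" <-> "not Flies(Condor)"
    return literal[4:] if literal.startswith("not ") else "not " + literal

def applicable(theory, prerequisite, justification):
    # The default prerequisite : justification / conclusion can fire when the
    # prerequisite is known and the justification does not contradict the theory.
    return prerequisite in theory and neg(justification) not in theory

W = {"Bird(Condor)", "Bird(Penguin)", "not Flies(Penguin)", "Flies(Bee)"}

# Bird(X) : Flies(X) / Flies(X), instantiated for each individual
for x in ("Condor", "Penguin", "Bee"):
    if applicable(W, f"Bird({x})", f"Flies({x})"):
        print(f"conclude Flies({x})")   # printed only for the condor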
A common default assumption is that what is not known to be true is believed to be false. This is known as the Closed-World Assumption, and is formalized in default logic using a default like the following one for every fact F:

: ¬F / ¬F
For example, the computer language Prolog uses a sort of default assumption when dealing with negation: if a negative atom cannot be proved to be true, then it is assumed to be false. Note, however, that Prolog uses the so-called negation as failure: when the interpreter has to evaluate the atom ¬F, it tries to prove that F is true, and concludes that ¬F is true if it fails. In default logic, instead, a default having ¬F as a justification can only be applied if ¬F is consistent with the current knowledge.
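The difference becomes visible with disjunctive knowledge. For example, if the background theory is { A ∨ B } and the closed-world defaults : ¬A / ¬A and : ¬B / ¬B are available, applying the first one yields the theory { A ∨ B, ¬A }, which entails B; the justification ¬B of the second default is then no longer consistent, so it cannot be applied. One therefore ends up with either { A ∨ B, ¬A } or { A ∨ B, ¬B }, whereas naively assuming both ¬A and ¬B at once would produce an inconsistent theory.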
A default is categorical or prerequisite-free if it has no prerequisite (or, equivalently, its prerequisite is tautological). A default is normal if it has a single justification that is equivalent to its conclusion. A default is supernormal if it is both categorical and normal. A default is seminormal if all its justifications entail its conclusion. A default theory is called categorical, normal, supernormal, or seminormal if all defaults it contains are categorical, normal, supernormal, or seminormal, respectively.
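For illustration, using the same slash notation as above and letters φ, ψ, χ for arbitrary formulas: a normal default has the form φ : ψ / ψ, a supernormal default has the form : ψ / ψ, and a seminormal default has the form φ : ψ ∧ χ / χ, where the justification ψ ∧ χ entails the conclusion χ.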
A default rule can be applied to a theory if its precondition is entailed by the theory and its justifications are all consistent with the theory. The application of a default rule leads to the addition of its consequence to the theory. Other default rules may then be applied to the resulting theory. When the theory is such that no other default can be applied, the theory is called an extension of the default theory. The default rules may be applied in different orders, and this may lead to different extensions. The Nixon diamond example is a default theory with two extensions:

D = { Republican(X) : ¬Pacifist(X) / ¬Pacifist(X),   Quaker(X) : Pacifist(X) / Pacifist(X) }
W = { Republican(Nixon), Quaker(Nixon) }
Since Nixon is both a Republican and a Quaker, both defaults can be applied. However, applying the first default leads to the conclusion that Nixon is not a pacifist, which makes the second default inapplicable. In the same way, applying the second default yields that Nixon is a pacifist, thus making the first default inapplicable. This particular default theory therefore has two extensions, one in which Pacifist(Nixon) is true, and one in which Pacifist(Nixon) is false.
The original semantics of default logic was based on the fixed point of a function. The following is an equivalent algorithmic definition. If a default contains formulae with free variables, it is considered to represent the set of all defaults obtained by giving a value to all these variables. A default Prerequisite : Justification_1, ..., Justification_n / Conclusion is applicable to a propositional theory T if T entails Prerequisite and every theory T ∪ {Justification_i} is consistent. The application of this default to T leads to the theory T ∪ {Conclusion}. An extension can be generated by applying the following algorithm:
T = W   /* current theory */
A = ∅   /* set of defaults applied so far */

/* apply a sequence of defaults */
while there is a default d that is not in A and is applicable to T
    add the consequence of d to T
    add d to A

/* final consistency check */
if for every default d in A, T is consistent with all justifications of d
then output T
This algorithm is non-deterministic, as several defaults can alternatively be applied to a given theory T. In the Nixon diamond example, the application of the first default leads to a theory to which the second default cannot be applied and vice versa. As a result, two extensions are generated: one in which Nixon is a pacifist and one in which Nixon is not a pacifist.
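The procedure can be sketched in Python under the same toy assumptions used earlier: formulas are ground literals, entailment is membership, and consistency means that no literal occurs together with its complement. The names extensions and applicable, and the tuple encoding of defaults, are illustrative choices rather than a standard API; the sketch simply explores every order of default application.

def neg(lit):
    return lit[4:] if lit.startswith("not ") else "not " + lit

def consistent_with(theory, lit):
    return neg(lit) not in theory

def applicable(theory, default):
    prerequisite, justifications, _conclusion = default
    return (prerequisite is None or prerequisite in theory) and all(
        consistent_with(theory, j) for j in justifications)

def extensions(W, D):
    # Explore every order of default application; keep the theories that pass
    # the final consistency check on the justifications of all applied defaults.
    results = []

    def explore(theory, applied):
        fired = False
        for d in D:
            if d not in applied and applicable(theory, d):
                fired = True
                explore(theory | {d[2]}, applied | {d})
        if not fired:   # no further default applies: candidate extension
            if all(consistent_with(theory, j)
                   for d in applied for j in d[1]):
                results.append(frozenset(theory))

    explore(set(W), frozenset())
    return set(results)

# Nixon diamond: each default is (prerequisite, justifications, conclusion)
W = {"Republican(Nixon)", "Quaker(Nixon)"}
D = [
    ("Republican(Nixon)", ("not Pacifist(Nixon)",), "not Pacifist(Nixon)"),
    ("Quaker(Nixon)", ("Pacifist(Nixon)",), "Pacifist(Nixon)"),
]
for ext in extensions(W, D):
    print(sorted(ext))   # two extensions: one with Pacifist(Nixon), one with its negation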
The final check of consistency of the justifications of all defaults that have been applied implies that some theories do not have any extensions. In particular, this happens whenever this check fails for every possible sequence of applicable defaults. The following default theory has no extension:

D = { : A(b) / ¬A(b) }
W = ∅
Since A(b) is consistent with the background theory, the default can be applied, thus leading to the conclusion that A(b) is false. This result, however, undermines the assumption A(b) that was made in order to apply the default. Consequently, this theory has no extension.
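Running the sketch given earlier on this theory (a single default with no prerequisite, justification A(b), and conclusion ¬A(b), with an empty background theory) produces no output: the only candidate theory fails the final consistency check, which illustrates the absence of extensions.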
In a normal default theory, all defaults are normal: each default has the form φ : ψ / ψ. A normal default theory is guaranteed to have at least one extension. Furthermore, the distinct extensions of a normal default theory are mutually inconsistent, i.e., inconsistent with each other.
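For example, the "birds typically fly" default above is normal, and the Nixon diamond is a normal default theory: it has extensions as guaranteed, and its two extensions are mutually inconsistent, since one contains Pacifist(Nixon) and the other ¬Pacifist(Nixon).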
A default theory can have zero, one, or more extensions. Entailment of a formula from a default theory can be defined in two ways: skeptically, a formula is entailed by a default theory if it is entailed by all of its extensions; credulously, a formula is entailed by a default theory if it is entailed by at least one of its extensions.
Thus, the Nixon diamond example theory has two extensions, one in which Nixon is a pacifist and one in which he is not a pacifist. Consequently, neither Pacifist(Nixon) nor ¬Pacifist(Nixon) is skeptically entailed, while both of them are credulously entailed. As this example shows, the credulous consequences of a default theory may be inconsistent with each other.
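Continuing the toy Python sketch, the two entailment modes can be expressed directly over the computed extensions; here an extension is just a set of literals, and "entailed by an extension" is taken to be membership, which is an assumption of this sketch rather than the general definition.

def skeptically_entailed(formula, extension_set):
    return all(formula in ext for ext in extension_set)

def credulously_entailed(formula, extension_set):
    return any(formula in ext for ext in extension_set)

nixon_extensions = [
    {"Republican(Nixon)", "Quaker(Nixon)", "Pacifist(Nixon)"},
    {"Republican(Nixon)", "Quaker(Nixon)", "not Pacifist(Nixon)"},
]
print(skeptically_entailed("Pacifist(Nixon)", nixon_extensions))      # False
print(credulously_entailed("Pacifist(Nixon)", nixon_extensions))      # True
print(credulously_entailed("not Pacifist(Nixon)", nixon_extensions))  # True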
Several alternative inference rules for default logic have been proposed, all based on the same syntax as the original system.
The justified and constrained versions of the inference rule assign at least one extension to every default theory.
Other variants of default logic differ from the original one in both syntax and semantics.
Default theories can be translated into theories in other logics and vice versa. Several conditions on such translations have been considered, among them faithfulness, preservation of consequences, modularity, and use of the same alphabet. Translations are typically required to be faithful or at least consequence-preserving, while the conditions of modularity and same alphabet are sometimes ignored.
The translatability between propositional default logic and other logics, notably classical propositional logic and autoepistemic logic, has been studied.
Whether such translations exist depends on which conditions are imposed. Translations from propositional default logic to classical propositional logic cannot always generate a polynomially sized propositional theory, unless the polynomial hierarchy collapses. Whether translations to autoepistemic logic exist depends on whether modularity or the use of the same alphabet is required.
The computational complexity of the main reasoning problems about default logic, such as deciding whether a default theory has an extension and whether a formula is skeptically or credulously entailed, has been determined.
Four systems implementing default logic are DeReS, XRay, GADeL, and Catala.
In artificial intelligence, with implications for cognitive science, the frame problem describes an issue with using first-order logic to express facts about a robot in the world. Representing the state of a robot with traditional first-order logic requires the use of many axioms that simply imply that things in the environment do not change arbitrarily. For example, Hayes describes a "block world" with rules about stacking blocks together. In a first-order logic system, additional axioms are required to make inferences about the environment. The frame problem is the problem of finding adequate collections of axioms for a viable description of a robot environment.
Propositional calculus is a branch of logic. It is also called propositional logic, statement logic, sentential calculus, sentential logic, or sometimes zeroth-order logic. It deals with propositions and relations between propositions, including the construction of arguments based on them. Compound propositions are formed by connecting propositions by logical connectives. Propositions that contain no logical connectives are called atomic propositions.
In propositional logic and Boolean algebra, De Morgan's laws, also known as De Morgan's theorem, are a pair of transformation rules that are both valid rules of inference. They are named after Augustus De Morgan, a 19th-century British mathematician. The rules allow the expression of conjunctions and disjunctions purely in terms of each other via negation.
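Stated for propositions P and Q, the two rules are: ¬(P ∧ Q) is equivalent to (¬P) ∨ (¬Q), and ¬(P ∨ Q) is equivalent to (¬P) ∧ (¬Q).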
In Boolean logic, a formula is in conjunctive normal form (CNF) or clausal normal form if it is a conjunction of one or more clauses, where a clause is a disjunction of literals; otherwise put, it is a product of sums or an AND of ORs. As a canonical normal form, it is useful in automated theorem proving and circuit theory.
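For example, the formula (A ∨ ¬B) ∧ (¬A ∨ C ∨ D) ∧ B is in conjunctive normal form: it is a conjunction of three clauses, each of which is a disjunction of literals (the last clause containing a single literal).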
In traditional logic, a contradiction occurs when a proposition conflicts either with itself or established fact. It is often used as a tool to detect disingenuous beliefs and bias. Illustrating a general tendency in applied logic, Aristotle's law of noncontradiction states that "It is impossible that the same thing can at the same time both belong and not belong to the same object and in the same respect."
In logic, negation, also called the logical not or logical complement, is an operation that takes a proposition P to another proposition "not P", standing for "P is not true", written ¬P, −P, or ~P. It is interpreted intuitively as being true when P is false, and false when P is true. Negation is thus a unary logical connective. It may be applied as an operation on notions, propositions, truth values, or semantic values more generally. In classical logic, negation is normally identified with the truth function that takes truth to falsity. In intuitionistic logic, according to the Brouwer–Heyting–Kolmogorov interpretation, the negation of a proposition P is the proposition whose proofs are the refutations of P.
Intuitionistic logic, sometimes more generally called constructive logic, refers to systems of symbolic logic that differ from the systems used for classical logic by more closely mirroring the notion of constructive proof. In particular, systems of intuitionistic logic do not assume the law of the excluded middle and double negation elimination, which are fundamental inference rules in classical logic.
In mathematics, constructive analysis is mathematical analysis done according to some principles of constructive mathematics.
In mathematical logic, sequent calculus is a style of formal logical argumentation in which every line of a proof is a conditional tautology instead of an unconditional tautology. Each conditional tautology is inferred from other conditional tautologies on earlier lines in a formal argument according to rules and procedures of inference, giving a better approximation to the natural style of deduction used by mathematicians than to David Hilbert's earlier style of formal logic, in which every line was an unconditional tautology. More subtle distinctions may exist; for example, propositions may implicitly depend upon non-logical axioms. In that case, sequents signify conditional theorems in a first-order language rather than conditional tautologies.
A paraconsistent logic is an attempt at a logical system to deal with contradictions in a discriminating way. Alternatively, paraconsistent logic is the subfield of logic that is concerned with studying and developing "inconsistency-tolerant" systems of logic, which reject the principle of explosion.
Computation tree logic (CTL) is a branching-time logic, meaning that its model of time is a tree-like structure in which the future is not determined; there are different paths in the future, any one of which might be an actual path that is realized. It is used in formal verification of software or hardware artifacts, typically by software applications known as model checkers, which determine if a given artifact possesses safety or liveness properties. For example, CTL can specify that when some initial condition is satisfied, then all possible executions of a program avoid some undesirable condition. In this example, the safety property could be verified by a model checker that explores all possible transitions out of program states satisfying the initial condition and ensures that all such executions satisfy the property. Computation tree logic belongs to a class of temporal logics that includes linear temporal logic (LTL). Although there are properties expressible only in CTL and properties expressible only in LTL, all properties expressible in either logic can also be expressed in CTL*.
In proof theory, the semantic tableau is a decision procedure for sentential and related logics, and a proof procedure for formulae of first-order logic. An analytic tableau is a tree structure computed for a logical formula, having at each node a subformula of the original formula to be proved or refuted. Computation constructs this tree and uses it to prove or refute the whole formula. The tableau method can also determine the satisfiability of finite sets of formulas of various logics. It is the most popular proof procedure for modal logics.
Belief revision is the process of changing beliefs to take into account a new piece of information. The logical formalization of belief revision is researched in philosophy, in databases, and in artificial intelligence for the design of rational agents.
Negation as failure is a non-monotonic inference rule in logic programming, used to derive not p from failure to derive p. Note that not p can be different from the statement ¬p of the logical negation of p, depending on the completeness of the inference algorithm and thus also on the formal logic system.
The closed-world assumption (CWA), in a formal system of logic used for knowledge representation, is the presumption that a statement that is true is also known to be true. Therefore, conversely, what is not currently known to be true, is false. The same name also refers to a logical formalization of this assumption by Raymond Reiter. The opposite of the closed-world assumption is the open-world assumption (OWA), stating that lack of knowledge does not imply falsity. Decisions on CWA vs. OWA determine the understanding of the actual semantics of a conceptual expression with the same notations of concepts. A successful formalization of natural language semantics usually cannot avoid an explicit revelation of whether the implicit logical backgrounds are based on CWA or OWA.
In mathematical logic, Heyting arithmetic is an axiomatization of arithmetic in accordance with the philosophy of intuitionism. It is named after Arend Heyting, who first proposed it.
Circumscription is a non-monotonic logic created by John McCarthy to formalize the common sense assumption that things are as expected unless otherwise specified. Circumscription was later used by McCarthy in an attempt to solve the frame problem. To implement circumscription in its initial formulation, McCarthy augmented first-order logic to allow the minimization of the extension of some predicates, where the extension of a predicate is the set of tuples of values the predicate is true on. This minimization is similar to the closed-world assumption that what is not known to be true is false.
The autoepistemic logic is a formal logic for the representation and reasoning of knowledge about knowledge. While propositional logic can only express facts, autoepistemic logic can express knowledge and lack of knowledge about facts.
Epistemic modal logic is a subfield of modal logic that is concerned with reasoning about knowledge. While epistemology has a long philosophical tradition dating back to Ancient Greece, epistemic logic is a much more recent development with applications in many fields, including philosophy, theoretical computer science, artificial intelligence, economics and linguistics. While philosophers since Aristotle have discussed modal logic, and Medieval philosophers such as Avicenna, Ockham, and Duns Scotus developed many of their observations, it was C. I. Lewis who created the first symbolic and systematic approach to the topic, in 1912. It continued to mature as a field, reaching its modern form in 1963 with the work of Kripke.
Markov's principle, named after Andrey Markov Jr, is a conditional existence statement for which there are many equivalent formulations.