15-399 Constructive Logic
Overview
The material in this class can be roughly divided into three parts.
- Constructive Logic
- Proofs and Programs
- Decidable Fragments
We review the contents of each part in turn.
1. Constructive Logic
The underlying principle of our development of logic (and later type
theory) is the separation of judgments from propositions. A judgment is
an object of knowledge, while an evident judgment is something we know.
Examples of judgments are ``A is a proposition'' and ``A is true''. The
meaning of a proposition is determined by what counts as a verification
of it. These considerations lead us to introduction and elimination
rules for logical connectives, plus the notion of local reduction, which
verifies that the elimination rules are sound with respect to the
introduction rules.
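For instance, the rules for conjunction fit this pattern; the
following is a standard rendering in LaTeX notation (a sketch, not the
course's official presentation):

  \frac{A\ \mathit{true} \quad B\ \mathit{true}}{A \wedge B\ \mathit{true}}\,{\wedge}I
  \qquad
  \frac{A \wedge B\ \mathit{true}}{A\ \mathit{true}}\,{\wedge}E_1
  \qquad
  \frac{A \wedge B\ \mathit{true}}{B\ \mathit{true}}\,{\wedge}E_2

The local reduction for conjunction says that an introduction
immediately followed by an elimination can be contracted: the detour
through A ∧ B true reduces to the given deduction of A true (for ∧E_1)
or of B true (for ∧E_2), so the eliminations extract no more knowledge
than the introduction put in.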
For the purpose of guiding proof search we further introduce the
notion of normal deduction. Roughly, a proof is normal if it can be
found by using only introduction rules from below (from the goal) and
only elimination rules from above (from the assumptions). Normal proofs are
complete for the propositional fragment (every true proposition has a
normal proof). As an application, we can show the independence of
certain schematic propositions such as the so-called law of excluded
middle.
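As a small illustration (standard, and not tied to the course notes),
here is a normal deduction of B ∧ A true under the hypothesis
A ∧ B true, written in LaTeX notation:

  \frac{\frac{A \wedge B\ \mathit{true}}{B\ \mathit{true}}\,{\wedge}E_2
        \qquad
        \frac{A \wedge B\ \mathit{true}}{A\ \mathit{true}}\,{\wedge}E_1}
       {B \wedge A\ \mathit{true}}\,{\wedge}I

The eliminations work downward from the hypothesis and the single
introduction works upward from the goal; no introduction is ever
immediately followed by an elimination.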
A common alternative meaning explanation is via notational definitions.
We use this, for example, to define negation and equivalence of propositions.
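Concretely, one standard choice of notational definitions (sketched
here in LaTeX notation) is

  \neg A \;\triangleq\; A \supset \bot
  \qquad\qquad
  A \equiv B \;\triangleq\; (A \supset B) \wedge (B \supset A)

so that a verification of ¬A transforms any verification of A into a
verification of falsehood.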
As a final, purely logical concept we consider universal and
existential quantification over elements of an unspecified domain. This
yields first-order intuitionistic logic. If we also assume the law of
excluded middle (which cannot be justified using our machinery of
judgments), then we obtain classical logic in which every proposition is
assumed to be true or false. This destroys the constructive nature
of proofs, which is the subject of the second part of the course.
2. Proofs and Programs
We introduce proof terms to directly express the evidence for the
truth of a proposition. Proof terms constitute a small functional
programming language, in which propositions play the role of types.
This correspondence is known as the Curry-Howard isomorphism between
proofs and programs. The basis for computation is local reduction,
expressed as reduction rules on proof terms.
Under this interpretation, implication corresponds to function types,
conjunction to product types, and disjunction to disjoint sum types.
Thus we have strong logical guidance in the design of programming
languages and type systems, which is the first major application of
logic in computer science we consider.
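To make the correspondence concrete, here is a hedged sketch in
Haskell (one possible notation; the course's own proof term language
may look different). The type variables a and b stand for arbitrary
propositions, and the function names are purely illustrative.

  -- Implication as function type: a proof term showing that
  -- A implies (B implies A).
  constP :: a -> (b -> a)
  constP x _ = x

  -- Conjunction as product type: a proof term showing that
  -- (A and B) implies (B and A).
  swapP :: (a, b) -> (b, a)
  swapP (x, y) = (y, x)

  -- Disjunction as disjoint sum type: a proof term showing that
  -- (A or B) implies (B or A).
  commP :: Either a b -> Either b a
  commP (Left x)  = Right x
  commP (Right y) = Left y

Evaluating these functions carries out exactly the local reductions:
for example, swapP (x, y) reduces to (y, x).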
To write interesting programs we need specific data types, such as
Booleans, natural numbers, and lists. These can be introduced via
introduction and elimination rules, following precisely the same
methodology as in our development of logic. Elimination rules take the
form of schemas for definition by cases or primitive recursion
(elimination on types) and proof by cases or induction (elimination on
propositions).
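For example, natural numbers together with their elimination schema
can be sketched as follows (again in Haskell; the names Nat, natrec,
and plus are illustrative, and this renders the schema rather than the
course's official rule format).

  data Nat = Zero | Succ Nat

  -- Elimination on Nat: the schema of primitive recursion.
  natrec :: c -> (Nat -> c -> c) -> Nat -> c
  natrec base _    Zero     = base
  natrec base step (Succ n) = step n (natrec base step n)

  -- Addition, defined by primitive recursion on its first argument.
  plus :: Nat -> Nat -> Nat
  plus m n = natrec n (\_ result -> Succ result) m

The corresponding elimination on the level of propositions is
induction: to prove a property of all natural numbers, prove it for
Zero and show that it is preserved by Succ.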
We briefly investigate arithmetic (the theory of natural numbers),
but then generalize to other structured data types and discuss various
programming concepts. This includes reasoning about data representation
(exemplified by an implementation of numbers as bit strings),
general recursion (related to complete induction), and reasoning
about data structure invariants (captured by dependent types).
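As a small example of such a data representation, here is a hedged
sketch of numbers as bit strings in Haskell (constructor and function
names are hypothetical). The function toNat expresses the
representation invariant relating a bit string to the number it
denotes, which is the kind of reasoning referred to above.

  -- Bit strings, least significant bit outermost.
  data Bin = E | B0 Bin | B1 Bin

  -- Increment, defined by cases on the low bit (with carry).
  inc :: Bin -> Bin
  inc E       = B1 E
  inc (B0 bs) = B1 bs
  inc (B1 bs) = B0 (inc bs)

  -- The number denoted by a bit string.
  toNat :: Bin -> Integer
  toNat E       = 0
  toNat (B0 bs) = 2 * toNat bs
  toNat (B1 bs) = 2 * toNat bs + 1

A typical correctness statement is then that toNat (inc bs) equals
toNat bs + 1, proved by induction on bs.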
With dependent types we have essentially replicated the full structure
of propositions at the level of types. To avoid excessive
redundancy, the two levels are therefore often collapsed, which can be
justified by the correspondence between proofs and programs. In
general, we see that proofs concerning data types contain
computationally uninteresting parts, and we show how to consistently
eliminate those parts, thereby contracting proofs to programs.
3. Decidable Fragments
There is no terminating algorithm which can decide whether a given
proposition of first-order logic or arithmetic is true, even if we
restrict ourselves to a particular formal system. This does not directly
hinder our applications in programming languages, since the problem of
proof checking (or the related type-checking) is decidable by design.
However, it complicates other applications of logic where problems
are translated into logic to be solved by theorem proving.
We therefore conclude the course by considering two interesting
decidable fragments that have found numerous practical applications.
The first is the theory of Booleans. The elimination rule here is not
induction, but simply proof by cases. The theory of Booleans with
equality, where AND, OR, NOT, FORALL, EXISTS are given directly by
notational definitions, turns out to be a decidable fragment of type
theory. This includes in particular the problem SAT (Boolean
satisfiability), which is NP-complete. Nonetheless, many practical
problems can be mapped to SAT and solved efficiently, such as k-coloring
of graphs, state-space planning, or Hamiltonian circuits. The
straightforward implementation in type theory, however, is too
inefficient, so we introduce ordered binary decision diagrams (OBDDs),
which are frequently used in practice.
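The following Haskell sketch (with illustrative names, not the
course's actual definitions) shows why the fragment is decidable: over
the two-element domain the connectives and quantifiers can be defined
by cases, so the truth value of any closed formula can simply be
computed.

  notB :: Bool -> Bool
  notB True  = False
  notB False = True

  andB, orB :: Bool -> Bool -> Bool
  andB True  q = q
  andB False _ = False
  orB  True  _ = True
  orB  False q = q

  -- Quantification over the Booleans reduces to a finite conjunction
  -- or disjunction over the two elements.
  forallB, existsB :: (Bool -> Bool) -> Bool
  forallB p = andB (p True) (p False)
  existsB p = orB  (p True) (p False)

For instance, a formula in one Boolean variable is satisfiable exactly
if existsB applied to it returns True; OBDDs make this kind of
evaluation feasible for formulas with many variables by sharing common
subproblems.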
The second is the theory of temporal properties of finite state
systems. Part of this theory can be developed easily in type theory,
but for the full computation tree logic (CTL) it is an open question how
to accomplish this effectively from first principles. Again, under the
assumption of excluded middle we develop a model-checking algorithm that
decides, for a given formula and finite state system (= model), whether the
formula is true on the model. We consider this for the temporal
operators X (next), G (globally), F (eventually), and U (until), each
interpreted along all (A) or some (E) computation paths. The
model-checking algorithm can be implemented by using OBDDs, representing
both the states and transition relation by Boolean functions.
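To make the algorithmic idea concrete, here is a hedged sketch of an
explicit-state CTL model checker in Haskell. It uses the adequate
basis EX, EU, EG, from which the remaining operators are expressible
using the Boolean connectives, and it computes fixed points by
iteration over the finite state set. All names are illustrative, and a
practical implementation, as described above, would represent state
sets and the transition relation symbolically with OBDDs rather than
as lists.

  import Data.List (intersect, nub, union, (\\))

  type State = Int
  data Model = Model { states :: [State]
                     , trans  :: [(State, State)]     -- transition relation
                     , label  :: State -> [String] }  -- atomic propositions

  data CTL = Tru | Atom String | Not CTL | And CTL CTL
           | EX CTL | EU CTL CTL | EG CTL

  -- sat m f computes the set of states of m satisfying f.
  sat :: Model -> CTL -> [State]
  sat m Tru       = states m
  sat m (Atom p)  = [s | s <- states m, p `elem` label m s]
  sat m (Not f)   = states m \\ sat m f
  sat m (And f g) = sat m f `intersect` sat m g
  sat m (EX f)    = pre m (sat m f)
  sat m (EU f g)  =
    fixFrom (\z -> sat m g `union` (sat m f `intersect` pre m z)) []
  sat m (EG f)    =
    fixFrom (\z -> sat m f `intersect` pre m z) (states m)

  -- States with at least one successor in the given set.
  pre :: Model -> [State] -> [State]
  pre m zs = [s | s <- states m, or [t `elem` zs | (s', t) <- trans m, s' == s]]

  -- Iterate until the set stabilizes; the model is finite, so this
  -- terminates.  Started from the empty set it computes a least fixed
  -- point, started from the full state set a greatest fixed point.
  fixFrom :: ([State] -> [State]) -> [State] -> [State]
  fixFrom f z = let z' = nub (f z)
                in if sameSet z z' then z else fixFrom f z'

  sameSet :: [State] -> [State] -> Bool
  sameSet xs ys = null (xs \\ ys) && null (ys \\ xs)

A formula is then true of a given state of the model exactly when that
state appears in the set returned by sat.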
Frank Pfenning