1% this is both a latex file and a quintus prolog file
    2% the only difference is that the latex version comments out the following
    3% line:
    4
:- module(snark_theorist,[]).
    7
    8/* 
    9\documentstyle[12pt,makeidx]{article}
   10\pagestyle{myheadings}
   11\markright{Theorist to Prolog (th2.tex)}
   12\makeindex
   13\newtheorem{example}{Example}
   14\newtheorem{algorithm}{Algorithm}
   15\newtheorem{theorem}{Theorem}
   16\newtheorem{lemma}[theorem]{Lemma}
   17\newtheorem{definition}{Definition}
   18\newtheorem{corollary}[theorem]{Corollary}
   19\newenvironment{proof}{\begin{quote} {\bf Proof: }}{$\Box$\end{quote}}
   20\newenvironment{prolog}{\begin{tabbing} \hbox{2cm}\=\hbox{1cm}\=\kill
   21    }{\end{tabbing}}
   22\newcommand{\IF}{\ $:-$\\\>}
   23\newcommand{\AND}{,\\\>}
   24\title{A Theorist to Prolog Compiler (th2.tex)\thanks{Copyright \copyright 1990
   25David Poole. All rights reserved.}}
   26\author{David Poole\\
   27Department of Computer Science,\\
   28University of British Columbia,\\
   29Vancouver, B.C. Canada V6T 1W5
   30(604) 228-6254\\
   31poole@cs.ubc.ca}
   32\begin{document}
   33\maketitle
   34\begin{abstract}
   35Artificial intelligence researchers have been designing representation
   36systems for default and abductive reasoning.
   37Logic Programming researchers have been working on techniques to improve
   38the efficiency of Horn Clause deduction systems.
   39This paper describes how {\em Theorist\/} can be
   40translated into Quintus Prolog.
   41The verbatim code is the actual running code.
   42
   43This should not be seen as {\em The Theorist System}, but rather
   44as one implementation of the Theorist framework. Theorist should be
   45seen as the idea that much reasoning can be done by theory formation
from a fixed set of possible hypotheses.
   47This implementation is based on a complete linear resolution theorem prover
   48which does not multiply out subterms. It also carries out incremental
   49consistency checking.
   50Note that there is nothing in this document which forces the control strategy
   51of Prolog. This is a compiler from Theorist to a Horn-clause reasoner
   52with negation as failure; nothing precludes any other search strategy
   53(e.g., dependency directed backtracking, constraint propagation).
   54This is intended to be a runnable specification, which runs fast
   55(e.g., for the common intersection between Theorist and Prolog (i.e., Horn
   56clauses) Theorist code runs about half the speed of compiled Quintus
   57Prolog code).
   58
   59This code is available electronically from the author.
   60\end{abstract}
   61\tableofcontents
   62\section{Introduction}
   63Many people in Artificial Intelligence have been working on default reasoning
   64and abductive diagnosis systems 
   65\cite{reiter80,mccarthy86,cox87,poole:lf}. The systems implemented so far
   66(eg., \cite{brewka86,lifschitz85,ginsberg87,pga}) are only prototypes or
have been developed in ways that cannot take full advantage of the
advances in logic programming implementation technology.
   69
   70Many people are working on making logic programming systems more efficient.
   71These systems, however, usually assume that the input is in the form of
   72Horn clauses with negation as failure. This paper shows how to implement
   73the default reasoning system Theorist \cite{poole:lf,pga}
   74by compiling its input into Horn clauses with negation as failure, thereby
allowing direct use of
the advances in logic programming implementation technology.
   77Both the compiler and the compiled code can take advantage of 
   78these improvements.
   79
   80We have been running this implementation on standard
   81Prolog compilers (in particular Quintus Prolog) and it outperforms
   82all other default reasoning systems that the author is aware of.
   83It is, however, not restricted to the control structure of Prolog. There is
   84nothing in the compiled code which forces it to use Prolog's
   85search strategy.
   86Logic programming and other researchers are working on alternate
   87control structures which seem very appropriate for default 
   88and abductive reasoning.
   89Advances in parallel inference (e.g.,\ \cite{pie}),
   90constraint satisfaction \cite{dincbas,vanh} and dependency directed backtracking
   91\cite{dekleer86,doyle79,cox82} 
should be directly applicable to the code produced by this compiler.
   93
   94We are thus effecting a clear
   95distinction between the control and logic of our default reasoning systems
   96\cite{kowalski}. We can let the control people concentrate on efficiency
   97of Horn clause systems, and these will then be directly applicable to
   98those of us building richer representation systems.
   99The Theorist system has been designed to allow maximum flexibility in
control strategies while still giving us the power of assumption-based
reasoning that is required for default and abductive reasoning.
  102
This is a step towards having representation and reasoning systems
which are designed for correctness while still being able to use the most
efficient of control
strategies, so we can have the best of expressibility and efficiency.
  107\section{Theorist Framework} \label{theorist}
  108
  109Theorist \cite{poole:lf,pga} is a logical reasoning system for default reasoning
  110and diagnosis. It is based on the idea of theory formation from a fixed
set of possible hypotheses.
  112
  113This implementation is of the version of Theorist described in \cite{poole:lf}.
  114The user provides three sets of first order formulae
  115\begin{itemize}
\item[${\cal F}$] is a set of closed formulae called the {\em facts\/}.
  117These are intended to be true in the world being modelled.
\item[$\Delta$] is a set of formulae which
  119act as {\em possible hypotheses}, any ground instance of which
  120can be used in an explanation if consistent.
\item[$\cal C$] is a set of closed formulae taken as constraints.
  122The constraints restrict what can be hypothesised.
  123\end{itemize}
  124
  125We assume that ${\cal F}\cup C$ is consistent.
  126
  127\begin{definition} \em
A {\bf scenario} of ${\cal F},\Delta$ is a set $D \cup {\cal F}$ where
$D$ is a set of ground instances of elements
  130of $\Delta$ such that $D\cup {\cal F} \cup C$ is consistent.
  131\end{definition}
  132
  133\begin{definition} \em
  134If $g$ is a closed formula then an {\bf explanation} of $g$ from ${\cal F},\Delta$
  135is a  scenario of ${\cal F},\Delta$ which implies $g$.
  136\end{definition}
That is, $g$ is explainable from ${\cal F},\Delta$ if there is a set
  138$D$ of ground instances of elements of $\Delta$ such that
  139\begin{quote}
  140${\cal F} \cup D \models g$ and\\
  141${\cal F} \cup D \cup C$ is consistent
  142\end{quote}
  143${\cal F} \cup D$ is an explanation of $g$.
  144
  145In other papers we have described how this can be the basis of
  146default and abductive reasoning systems \cite{pga,poole:lf,poole:dc,poole:dd}.
  147If we are using this for prediction then possible hypotheses can be seen
  148as defaults. \cite{poole:lf} describes how this formalism can account
  149for default reasoning. This is also a framework for abductive reasoning
  150where the possible hypotheses are the base causes we are prepared
  151to accept as to why some observation was made \cite{pga}.
  152We will refer to possible hypotheses as defaults.
  153
  154One restriction that can be made with no loss of expressive power
  155is to restrict possible hypotheses to just atomic forms with no
  156structure \cite{poole:lf}. This is done by naming the defaults.
  157\subsection{Syntax} \label{syntax}
  158The syntax of Theorist is designed to be of maximum flexibility.
  159Virtually any syntax is appropriate for wffs; the formulae are translated
  160into Prolog clauses without mapping out subterms. The theorem
  161prover implemented in the Compiler can be seen as a non-clausal theorem
  162prover \cite{poole:clausal}.
  163
  164A {\em wff\/} is a well formed formula made up of arbitrary combination of
  165equivalence (``=='', ``$equiv$''),
  166implication (``$=>$'', ``$<-$''), disjunction (``$or$'', ``;''),
  167conjunction (``$and$'', ``$\&$'', ``,'') and negation (``$not$'', ``\~{}'')
  168of atomic symbols. Variables follow the Prolog convention
  169of being in upper case. There is no explicit quantification.
  170
  171A {\em name\/} is an atomic symbol with only free variables as arguments.
  172
  173The following gives the syntax of the Theorist code:
  174\begin{description}
  175\item[\bf fact]
  176$w.$\\
  177where $w$ is a wff,
  178means that $(\forall w) \in {\cal F}$; i.e., the universal closure of $w$ (all
  179variables universally quantified) is a fact.
  180\item[\bf default]
  181$d.$\\
  182where $d$ is a name,
  183means that $d\in \Delta$; i.e., $d$ is a default (a possible hypothesis).
  184\item[\bf default]
  185$d:w.$\\
  186where $d$ is a name and $w$ is a wff means $w$, with name $d$ can
  187be used in a scenario if it is consistent.
  188Formally it means $d\in  \Delta$ and
  189$(\forall d\Rightarrow w) \in {\cal F}$.
  190\item[\bf constraint]
  191$w.$\\
  192where $w$ is a wff means $\forall w\in \cal C$.
  193\item[\bf prolog]
  194$p.$\\
  195where $p$ is an atomic symbol means any Theorist call to $p$ should
  196be proven in Prolog. This allows us to use built-in predicates of pure Prolog.
  197One should not expect Prolog's control predicates to work.
  198\item[\bf explain]
  199$w.$\\
  200where $w$ is an arbitrary wff,
  201gives all explanations of $\exists w$.
  202\item[\bf predict]
  203$w.$\\
where $w$ is an arbitrary ground wff,
  205returns ``yes'' if $w$ is in every extension of the defaults
  206and ``no'' otherwise.
If it returns ``yes'', a set of explanations is returned; if
it returns ``no'' then a scenario from which $w$ cannot be explained is
returned (this follows the framework of \cite{poole:dc}).
  210
  211\end{description}
  212
  213In this compiler these are interpreted as commands to Prolog.
  214The interface will thus be the Prolog interface with some predefined
  215commands.
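
For example (the predicate and constant names here are invented purely for
illustration; these commands are not part of the compiler source), a small
Theorist knowledge base might be written as:
\begin{verbatim}
fact flies(X) <- bird(X) and birdsfly(X).
fact bird(tweety).
fact bird(polly).
fact emu(polly).
constraint not birdsfly(X) <- emu(X).
default birdsfly(X).
explain flies(X).
\end{verbatim}
Here the {\em explain} command should give the explanation
$\{birdsfly(tweety)\}$ (with answer $X=tweety$); the constraint rules out
the corresponding assumption for $polly$.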
  216
  217\subsection{Compiler Directives}
  218The following are compiler directives:
  219\begin{description}
  220\item[\bf thconsult] {\em filename.}\\
  221reads commands from {\em filename}, and asserts and/or executes them.
  222\item[\bf thtrans] {\em filename.}\\
  223reads commands from {\em filename} and translates them into Prolog
  224code in the file {\em filename.pl}.
  225\item[\bf thcompile] {\em filename.}\\
  226reads commands from {\em filename}, translates them into the file
  227{\em filename.pl} and then compiles this file. ``explain'' commands in
  228the file are not interpreted.
  229\item[\bf dyn] {\em atom.}\\
  230should be in a file and declares that anything matching the atom
  231is allowed to be asked or added to. This should appear before any
  232use of the atom. This corresponds to the ``dynamic'' declaration of
  233Quintus Prolog. This is ignored except when compiling a file.
  234\end{description}
There are some other commands which allow one to set flags. See section
  236\ref{flags} for more detail on setting checking and resetting flags.
  237
  238\section{Overview of Implementation}
  239In this section we assume that we have a deduction system 
  240(denoted $\vdash$) which has the 
  241following properties (such a deduction system will be defined in the
  242next section):
  243\begin{enumerate}
  244\item It is sound (i.e., if $A\vdash g$ then $A\models g$).
  245\item It is complete in the sense that if $g$ follows from a consistent
set of formulae, then $g$ can be proven. I.e., if $A$ is consistent and
  247$A\models g$ then $A\vdash g$.
  248\item If $A\vdash g$ then $A\cup B\vdash g$; i.e., adding in extra facts will
  249not prevent the system from finding a proof which previously existed.
  250\item It can return instances of predicates which were used in the proof.
  251\end{enumerate}
  252
  253The basic idea of the implementation follows the definition on explainability:
  254\begin{algorithm}\em \label{basic-alg}
  255to explain $g$
  256\begin{enumerate}
  257\item try to prove $g$ from ${\cal F}\cup \Delta$. If no proof exists, then
$g$ is not explainable. If there is a proof, let $D$ be the set of instances of
  259elements of $\Delta$ used in the proof. We then know
  260\[{\cal F}\cup D \models g\]
  261by the soundness of our proof procedure.
  262\item show $D$ is consistent with ${\cal F}\cup C$
  263by failing to prove it is inconsistent.
  264\end{enumerate}
  265\end{algorithm}
  266
  267\subsection{Consistency Checking}
  268The following two theorems are important for implementing the consistency
  269check:
  270\begin{lemma} \label{incremantal}
If $A$ is a consistent set of formulae and
$D$ is a finite set of ground instances of possible hypotheses, then
  273if we impose arbitrary ordering on the elements of $D=\{d_1,...,d_n\}$
  274\begin{center}
  275$A\cup D$ is inconsistent\\if and only if\\
  276there is some $i$, $1\leq i \leq n$ such that
  277$A\cup \{d_1,...,d_{i-1}\}$ is consistent and\\
  278$A\cup \{d_1,...,d_{i-1}\}\models \neg d_i$.
  279\end{center}
  280\end{lemma}
  281\begin{proof}
  282If $A \cup D $ is inconsistent there is some least $i$ such
  283that $A\cup \{d_1,...,d_i\}$ is inconsistent. Then we must have
  284$A\cup \{d_1,...,d_{i-1}\}$ is consistent (as $i$ is minimal) and
  285$A\cup \{d_1,...,d_{i-1}\}\models \neg d_i$ (by inconsistency).
  286\end{proof}
  287
  288This lemma says that we can show that ${\cal F}\cup C \cup \{d_1,...,d_n\}$ is 
  289consistent if we can show that for all $i$, $1\leq i \leq n$,
  290${\cal F}\cup C\cup \{d_1,...,d_{i-1}\}\not\vdash \neg d_i$.
  291If our theorem prover can show there is no proof of all of
  292the $\neg d_i$, then we have consistency.
  293
  294This lemma indicates that we can implement Theorist by incrementally failing to
  295prove inconsistency. We need to try to prove the negation of the
  296default in the context of all previously assumed defaults.
  297Note that this ordering is arbitrary.
  298
  299The following theorem expands on how explainability can be computed:
  300\begin{theorem} \label{consisthm}
  301If ${\cal F} \cup C$ is consistent,
  302$g$ is explainable from ${\cal F},\Delta$ if and only if there is a ground
  303proof of $g$ from ${\cal F}\cup D$ where $D=\{d_1,...,d_n\}$
is a set of ground instances
  305of elements of $\Delta$ such that
${\cal F} \cup C \cup \{d_1,...,d_{i-1}\}\not\vdash \neg d_i$
  307for all $i,1\leq i \leq n$.
  308\end{theorem}
  309\begin{proof}
If $g$ is explainable from ${\cal F},\Delta$, there is a set $D$ of ground instances
  311of elements of $\Delta$ such that ${\cal F}\cup D \models g$ and ${\cal F} \cup D \cup C$
  312is consistent. So there is a ground proof of $g$ from ${\cal F} \cup D$.
  313By the preceding lemma
  314${\cal F}\cup D \cup C$ is consistent so there can be no sound proof
  315of inconsistency. That is, we cannot prove
${\cal F} \cup C \cup \{d_1,...,d_{i-1}\}\vdash \neg d_i$ for any $i$.
  317\end{proof}
  318
  319This leads us to the refinement of algorithm \ref{basic-alg}:
  320\begin{algorithm} \em
  321to explain $g$ from ${\cal F},\Delta$
  322\begin{enumerate}
  323\item Build a ground proof of $g$ from ${\cal F}\cup \Delta$. Make $D$ 
the set of instances of elements of $\Delta$ used in the proof.
\item For each $i$, try to prove $\neg d_i$ from ${\cal F} \cup C
\cup \{d_1,...,d_{i-1}\}$. If all
  327such proofs fail, $D$ is an explanation for $g$.
  328\end{enumerate}
  329\end{algorithm}
  330
  331Note that the ordering imposed on the $D$ is arbitrary. A sensible one is
  332the order in which the elements of $D$ were generated. Thus when
  333a new hypothesis is used in the proof, we try to prove its negation from
  334the facts and the previously used hypotheses. These proofs are independent
  335of the original proof and can be done as they are generated
  336as in negation as failure (see section \ref{incremental}), or can be done
  337concurrently.
  338
  339\subsection{Variables}
  340Notice that theorem \ref{consisthm} says that $g$ is explainable
if there is a ground proof. There is a problem that arises
in translating the preceding algorithm (which assumes ground proofs)
into an algorithm which does not build ground proofs (e.g., a standard
resolution theorem prover), as we may have variables in the formulae
whose negation we are trying to prove.
  346
  347A problem arises
  348when there are variables in the $D$ generated.
  349 Consider the following example:
  350\begin{example}\em
  351Let $ \Delta = \{p(X)\}$. That is, any instance of $p$ can be used if it is
  352consistent.
  353Let ${\cal F} = \{ \forall Y (p(Y) \Rightarrow g), \neg p(a)\}$. That is, $g$ is
  354true if there is a $Y$ such that $p(Y)$.
  355
  356According to our semantics,
  357$g$ is explainable with the explanation $\{p(b)\}$,
  358which is consistent with ${\cal F}$ (consider the interpretation $I=\{\neg p(a),p(b)\}$
  359on the domain $\{a,b\}$), and implies $g$.
  360
  361However,
  362if we try to prove $g$, we generate $D=\{p(Y)\}$ where $Y$ is free
  363(implicitly a universally quantified variable).
  364The existence of the fact $\neg p(a)$ should not make it
  365inconsistent, as we want $g$ to be explainable.
  366\end{example}
  367\begin{theorem}
  368It is not adequate to only consider interpretations in the
  369Herbrand universe of ${\cal F}\cup \Delta \cup C$ to determine explainability.
  370\end{theorem}
  371\begin{proof}
Consider the example above; the Herbrand universe is just
the set $\{a\}$. Within this domain there is no consistent
  374explanation to explain $g$.
  375\end{proof}
  376
  377This shows that Herbrand's theorem is not applicable to the whole system.
  378It is, however, applicable to each of the deduction steps \cite{chang}.
  379
  380So we need to generate a ground proof of $g$. This leads us to:
  381
  382\begin{algorithm} \em \label{det-alg}
  383To determine if $g$ is explainable from ${\cal F},\Delta$
  384\begin{enumerate}
  385\item generate a proof of $g$ using elements of ${\cal F}$ and $ \Delta$ as axioms.
Make $D_0$ the set of instances of $\Delta$ used in the proof;
  387\item form $D_1$ by replacing free variables in $D_0$ with unique constants;
  388\item add $D_1$ to ${\cal F}$ and try to prove an inconsistency (as in the
  389previous case). If a
  390complete search for a proof fails, $g$ is explainable.
  391\end{enumerate}
  392\end{algorithm}
  393
  394This algorithm can now be directly implemented by a resolution theorem prover.
  395
  396\begin{example}\em
  397Consider ${\cal F}$ and $\Delta$ as in example 1 above.
  398If we try to prove $g$, we use the hypothesis instance $p(Y)$.
  399This means that $g$ is provable from any instance of $p(Y)$.
  400To show $g$ cannot be explained, we must show that all of the instances
  401are inconsistent. The above algorithm says
  402we replace $Y$ with a constant $\beta$.
  403$p(\beta)$ is consistent with the facts.
  404Thus we can show $g$ is explainable.
  405\end{example}
  406
  407\subsection{Incremental Consistency Checking} \label{incremental}
  408Algorithm \ref{det-alg} assumed that we only check consistency at the end.
  409We cannot replace free variables by a unique constant until the end
  410of the computation.
  411This algorithm can be further refined by considering cases
  412where we can check consistency at the time the hypothesis is generated.
  413
Lemma \ref{incremantal} shows that we can check consistency incrementally
in whatever order we like. The problem is to determine whether we have
generated the final version of a set of hypotheses.
If there are no variables in our set of hypotheses, then we can check
  418consistency as soon as they are generated.
  419If there are variables in a hypothesis, then we cannot guarantee that the
  420form generated will be the final form of the hypothesis.
  421\begin{example}\em
Consider the two alternate sets of facts:
  423\begin{eqnarray*}
  424\Delta&=\{&p(X)\ \}\\
  425{\cal F}_1&=\{&\forall X \ p(X) \wedge q(X) \Rightarrow g,\\
  426&&\neg p(a),\\
  427&&q(b) \ \}\\
  428{\cal F}_2&=\{&\forall X \ p(X) \wedge q(X) \Rightarrow g,\\
  429&&\neg p(a),\\
  430&&q(a) \ \}
  431\end{eqnarray*}
  432Suppose we try to explain $g$ by first explaining $p$ and then explaining $q$.
Once we have generated the hypothesis $p(X)$, we do not have enough information to
determine whether the consistency check should succeed or fail.
The consistency check for ${\cal F}_1$ should succeed (i.e., we should conclude
  436with a consistent instance, namely $X=b$), but the 
  437consistency check for ${\cal F}_2$ should fail (there is no consistent value
  438for the $X$ which satisfies $p$ and $q$).
  439We can only determine the consistency after we have proven $q$.
  440\end{example}
  441
There seem to be two obvious solutions to this problem.
The first is to allow the consistency check to return constraints on the
values (e.g., \cite{edmonson}). The alternate (and simpler) solution is
  445to delay the evaluation of the consistency check until all of the variables
  446are bound (as in \cite{naish86}), or until we know that the variables
  447cannot be bound any more. In particular we know that a variable cannot be
  448bound any more at the end of the computation.
  449
  450The implementation described in this paper
  451does the simplest form of incremental consistency checking, namely it computes
  452consistency immediately for those hypotheses with no variables and delays
  453consistency checking until the end for hypotheses containing variables
  454at the time they are generated.
  455\section{The Deduction System} \label{deduction}
  456
  457This implementation is based on linear resolution \cite{chang,loveland78}.
  458This is complete in the
sense that if $g$ logically follows from some {\em consistent} set of
  460clauses $A$, then there is a linear resolution proof of $g$ from $A$.
  461
  462SLD resolution of Prolog \cite{lloyd} can be seen as linear resolution with
  463the contrapositive and ancestor search removed.
  464
  465To implement linear resolution in Prolog, we add two things
  466\begin{enumerate}
  467\item we use the contrapositive of our clauses. If we have the clause
  468\begin{verse}
  469$L_1 \vee L_2 \vee ... \vee L_n$
  470\end{verse}
  471then we create the $n$ rules
  472\begin{verse}
  473$L_1 \leftarrow \neg L_2 \wedge ... \wedge \neg L_n$\\
  474$L_2 \leftarrow \neg L_1 \wedge \neg L_3 \wedge ... \wedge \neg L_n$\\
  475$...$\\
  476$L_n \leftarrow \neg L_1 \wedge ... \wedge \neg L_{n-1}$
  477\end{verse}
  478as rules. Each of these can then be used to prove the left hand literal
  479if we know the other literals are false. Here we are treating the negation
  480of an atom as a different Prolog atom (i.e.,\ we treat $\neg p(\overline{X})$
  481as an atom $notp(\overline{X})$).
  482\item the ancestor cancellation rule. While trying to prove $L$ we can assume
  483$\neg L$. We have a subgoal proven if it unifies with the negation of
  484an ancestor in the proof tree.
  485This is an instance of proof by contradiction. We can see this as assuming
  486$\neg L$ and then when we have proven $L$ we can discharge the assumption.
  487\end{enumerate}
  488
  489One property of the deduction system that we want is the ability to
  490implement definite clauses with a constant factor overhead over
  491using Prolog. One way to do this is to keep two lists of ancestors
  492of any node: $P$ the positive (non negated atoms) ancestors
  493and $N$ the negated ancestors. Thus for a positive subgoal we
  494only need to search for membership in $N$ and for a negated subgoal we only
  495need to search $P$.
  496When we have definite clauses, there are no negated subgoals, and so 
  497$N$ is always empty. Thus the ancestor search always consists
of checking for membership in an empty list. The alternate
contrapositive forms of the clauses are never used.
  500
  501Alternate, more complicated ways to do ancestor search
  502have been investigated \cite{poole:grace}, but this implementation
  503uses the very simple form given above.
  504Another tempting possibility is to use the near-Horn resolution
  505of \cite{loveland87}. More work needs to be done on this area.
  506\subsection{Disjunctive Answers}
  507For the compiler to work properly we need to be able to return
  508disjunctive answers. We need disjunctive answers for the case
  509that we can prove only a disjunctive form of the query.
  510
  511For example, if we can prove $p(a)\vee p(b)$ for the
  512query $?p(X)$, then we want the answer $X= a$ or $b$.
  513This can be seen as ``if the answer is not $a$ then the
  514answer is $b$''.
  515
  516To have the answer $a_1\vee...\vee a_n$, we need to have a proof
  517of ``If the answer is not $a_1$ and not $a_2$ and ... and not $a_{n-1}$
  518then the answer is $a_n$''.
  519We collect up conditions on the proof of
  520alternate answers that we are assuming are not true in order to have
  521the disjunctive answer.
  522
  523This is implemented by being able to assume the negation of the top level
goal as long as we add it to the set of answers. To do this we carry a list
  525of the alternate disjuncts that we are assuming in proving the top level goal.
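
In Theorist source syntax, the $p(a)\vee p(b)$ example above corresponds to
something like (an illustrative fragment, not part of the compiler source):
\begin{verbatim}
fact p(a) or p(b).
explain p(X).
\end{verbatim}
for which the expected answer is the disjunction $X=a$ or $b$: assuming the
answer is not $a$, the remaining disjunct gives $p(b)$.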
  526\subsection{Conversion to Clausal Form}
  527It is desirable that we can convert an
  528arbitrary well formed formula into clausal (or rule) form
without multiplying out subterms. Instead of distributing, we do this by
  530creating a new term to refer to the disjunct.
  531
  532Once a formula is in negation normal form, then the normal way 
  533to convert to clausal form \cite{chang} is to
  534convert something of the form
  535\[\alpha \vee (\beta \wedge \gamma)\]
  536by distribution into
  537\[(\alpha \vee \beta) \wedge (\alpha \vee \gamma)\]
and so multiplying out subterms.
  539
  540The alternate \cite{poole:clausal} is to create a new relation $p$ parameterised
with the variables in common between $\alpha$ and $\beta \wedge \gamma$.
  542We can then replace $\beta \wedge \gamma$ by $p$ and then add
  543\[(\neg p \vee \beta)\wedge (\neg p \vee \gamma)\]
to the set of formulae.
  545
  546This can be embedded into the compiler by using
  547Prolog ``or'' instead of actually building the $p$. 
  548(Alternatively we can define ``or'' by defining the
  549clause $(p;q)\leftarrow p$ and $(p;q)\leftarrow q$.)
  550We build up the clauses so that the computation runs
  551without any multiplying out of subterms.
  552This is an instance of the general procedure of making clausal
  553theorem provers into non-clausal theorem provers \cite{poole:clausal}.
  554\section{Details of the Compiler}
  555In this section we give actual code which converts
  556Theorist code into Prolog code.
  557The compiler is described here in a bottom up fashion; from the
  558construction of the atoms to compilation of general formulae.
  559
  560The compiler is written in Prolog and the
  561target code for the compiler is Prolog code (in particular Horn
  562clauses with negation as failure). There are no ``cuts'' or other
  563non-logical ``features'' of Prolog which depend on Prolog's
  564search strategy in the compiled code.
Each Theorist wff gets locally translated into a set of
  566Prolog clauses.
  567\subsection{Target Atoms}
  568For each Theorist predicate symbol $r$ there are 4 target predicate
  569symbols, with the following informal meanings:
  570\begin{description}
  571\item[prove\_r] meaning $r$ can be proven from the facts and the constraints.
  572\item[prove\_not\_r] meaning $\neg r$ can be proven from the facts 
  573and the constraints.
  574\item[ex\_r] meaning $r$ can be explained from ${\cal F},\Delta$.
  575\item[ex\_not\_r] meaning $\neg r$ can be explained from ${\cal F},\Delta$.
  576\end{description}
  577
The arguments to these target predicate symbols will contain all
  579of the information that we need to prove or explain instances of the source
  580predicates.
  581\subsubsection{Proving}
  582For relation $r(-args-)$ in the source code we want to produce object
  583code which says that $r(-args-)$ (or its negation)
  584can be proven from the facts and constraints and the current th_set
  585of assumed hypotheses.
  586
  587For the source relation
  588\[r( - args -)\]
  589(which is to mean that $r$ is a relation with $-args-$ being the
  590sequence of its arguments),
  591we have the corresponding target relations
  592\[prove\_r( - args - , Ths, Anc)\]
  593\[prove\_not\_r( - args - , Ths, Anc)\]
  594which are to mean that $r$ (or $\neg r$) can be proven
from the facts and ground hypotheses
  596$Ths$ with ancestor structure $Anc$.
  597These extra arguments are:
  598
  599\begin{description}
  600\item $Ths$ is a list of ground defaults.
  601These are the defaults we have already assumed and so define the context in
  602which to prove $r(-args-)$.
  603\item $Anc$ is a structure of the form $anc(P,N)$ where $P$ and $N$ are
  604lists of source atoms. Interpreted procedurally,
  605$P$ is the list of positive (not negated) ancestors of
  606the goal in a proof and $N$ is the list of negated ancestors
  607in a proof. As described in section \ref{deduction} we conclude some goal
  608if it unifies with its negated ancestors.
  609\end{description}
  610
  611Declaratively,
  612\[prove\_r( - args - , Ths, anc(P,N))\]
  613means
  614\[{\cal F}\cup Ths \cup \neg P \cup N \models r(-args-)\]
  615
  616\subsubsection{Explaining}
  617There are two target relations for explaining associated with
  618each source relation $r$. These are $ex\_r$ and $ex\_not\_r$.
  619
  620For the source relation:
  621\[r( - args -)\]
  622we have two target new relations for explaining $r$:
  623\[ex\_r( - args - , Ths, Anc, Ans)\]
  624\[ex\_not\_r( - args - , Ths, Anc, Ans)\]
  625These mean that $r(-args-)$ (or $\neg r(-args-)$) can be explained, with
  626\begin{description}
  627\item[$Ths$] is the structure of the incrementally built hypotheses
used in explaining $r$. There are two statuses of hypotheses we
use: one is the defaults that are ground and so can be proven
consistent at the time of generation;
the other is the hypotheses with free variables at the time they
  632are needed in the proof, for which we defer consistency
  633checking (in case the free variables get instantiated later in the proof).
  634$Ths$ is essentially
  635two difference lists, one of the ground defaults already
  636proven consistent and one of the
  637deferred defaults. $Ths$ is of the form
  638\[ths(T_1,T_2,D_1,D_2)\]
which is to mean that $T_1$ is the list of consistent hypotheses before
we try to explain $r$,
and $T_2$ is the list of consistent hypotheses which includes
  642$T_1$ and those hypotheses assumed to explain $r$.
  643Similarly, $D_1$ is the list of deferred hypotheses before we consider the goal
  644and $D_2$ is the list of resulting deferred hypotheses used in explaining $r$.
  645
  646\item[$Anc$] contains the ancestors of the goal. As in the previous case,
  647this is a pair of the form
  648$anc(P,N)$ where $P$ is the list of positive ancestors of the goal,
  649and $N$ is the list of negated ancestors of the goal.
  650
  651\item[$Ans$] contains the answers we are considering in difference list form
  652$ans(A_1,A_2)$, where $A_1$ is the answers before
  653proving the goal, and $A_2$ is the answers after proving the goal.
  654\end{description}
  655
  656The semantics of
  657\[ex\_r(-args-,ths(T_1,T_2,D_1,D_2),anc(P,N),ans(A_1,A_2))\]
  658is defined by
  659\[{\cal F}\cup T_2 \cup D_2 \cup \neg P \cup N \cup A_2 \models r(-args-) \]
  660where $T_1\subseteq T_2$, $D_1\subseteq D_2$ and $A_1\subseteq A_2$, and
  661such that
  662\[{\cal F}\cup T_2 \hbox{ is consistent}\]
  663
  664\subsubsection{Building Atoms}
  665The procedure {\em new\_lit$($Prefix, Reln, Newargs, Newreln\/}$)$ constructs
  666a new atom, $Newreln$, with predicate symbol made up of
  667$Prefix$ prepended to the
  668predicate symbol of $Reln$, and taking as arguments the arguments of $Reln$
  669together with $Newargs$.
  670For example,
  671\begin{quote}
?- new\_lit("ex\_",reln(a,b,c),[T,A,B],N).
  673\end{quote}
  674yields
  675\begin{quote}
  676N = ex\_reln(a,b,c,T,A,B)
  677\end{quote}
  678
  679The procedure is defined as follows\footnote{The verbatim code
  680is the actual implementation code given in standard Edinburgh notation.
  681I assume that the reader is familiar with such notation.}:
  682\index{new\_lit}
  683\index{add\_prefix}
  684\begin{verbatim} */
  685
  686
  687new_lit(Prefix, Reln, _, NewReln) :-
  688   th_flag((features,off)),!,
  689   Reln =.. [Pred | Args],
  690   add_prefix(Prefix,Pred,NewPred),
  691   NewReln =.. [NewPred | Args].
  692
  693new_lit(Prefix, Reln, NewArgs, NewReln) :-
  694   Reln =.. [Pred | Args],
  695   add_prefix(Prefix,Pred,NewPred),
  696   append(Args, NewArgs, AllArgs),
  697   NewReln =.. [NewPred | AllArgs].
  698
  699add_prefix(Prefix,Pred,NewPred) :-
  700   string_codes(Prefix,PrefixC),
  701   name(Pred,PredName),
  702   append(PrefixC, PredName, NewPredName),
  703   name(NewPred,NewPredName).
  704
  705
  706/* \end{verbatim}
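
For instance, the auxiliary relation {\em add\_prefix} simply glues the
prefix onto the predicate name, so one would expect the query
\begin{verbatim}
?- add_prefix("ex_", father, P).
\end{verbatim}
to succeed with $P = ex\_father$.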
  707\subsection{Compiling Rules}
  708The next simplest compilation form we consider is the intermediate form
  709called a ``rule''.
  710Rules are statements of how to conclude
  711the value of some relation. Each Theorist fact corresponds to a number
  712of rules (one for each literal in the fact).
  713Each rule gets translated into Prolog rules to explain
  714and prove the head of the rule. 
  715
  716Rules use the intermediate form called a ``literal''.
  717A literal is either an atomic symbol or of the form $n(A)$ where $A$ is
  718an atomic symbol.
A rule is either a literal or
  720of the form {\em H $\leftarrow$ Body} (written ``{\tt if(H,Body)}'')
  721where $H$ is a literal
  722and $Body$ is a conjunction and disjunction of literals.
  723
  724We translate rules of the form
\[h(-x-) \leftarrow b_1(-x_1-), b_2(-x_2-), ... ,b_n(-x_n-)\]
  726into the internal form (assuming that $h$ is not negated)
  727\begin{prolog}
  728$ex\_h(-x-,ths(T_0,T_n,D_0,D_n), anc(P,N), ans(A_0,A_n)) $\IF
  729$ex\_b_1(-x_1-,ths(T_0,T_1,D_0,D_1), anc([h(-x-)|P],N), ans(A_0,A_1))$\AND
  730$ex\_b_2(-x_2-,ths(T_1,T_2,D_1,D_2), anc([h(-x-)|P],N), ans(A_1,A_2))$\AND
  731$...$\AND
  732$ex\_b_n(-x_n-,ths(T_{n-1},T_n,D_{n-1},D_n), anc([h(-x-)|P],N),
  733ans(A_{n-1},A_n)).$
  734\end{prolog}
  735That is, we explain $h$ if we explain each of the $b_i$,
  736accumulating the explanations and the answers.
  737Note that if $h$ is negated, then the head of the clause will be of
  738the form $ex\_not\_h$, and the ancestor form will be
  739$anc(P,[h(-x-)|N])$. If any of the $b_i$ are negated, then the
  740corresponding predicate will be $ex\_not\_b_i$.
  741
  742\begin{example}\em
  743the rule
  744\begin{quote}
  745$gr(X,Y) \leftarrow f(X,Z), p(Z,Y)$
  746\end{quote}
  747gets translated into
  748\begin{prolog}
  749$ex\_gr(X,Y,ths(D,E,F,G),anc(H,I),ans(J,K))$\IF
  750$ex\_f(X,Z,ths(D,M,F,N),anc([gr(X,Y)|H],I),ans(J,O))$\AND
  751$ex\_p(Z,Y,ths(M,E,N,G),anc([gr(X,Y)|H],I),ans(O,K)).$
  752\end{prolog}
  753To explain $gr$ we explain both $f$ and $p$.
  754The initial assumptions for $f$ should be the initial assumptions for
  755$gr$, and the initial assumptions for $p$ should be the initial assumptions
plus those made to explain $f$. The resulting assumptions after proving $p$
are the assumptions made in explaining $gr$.
  758\end{example}
  759
  760\begin{example} \em the fact
  761\begin{quote}
  762$father(randy,jodi)$
  763\end{quote}
  764gets translated into
  765\begin{quote}
  766$ex\_father(randy,jodi,ths(T,T,D,D),\_,ans(A,A)).$
  767\end{quote}
  768We can explain $father(randy,jodi)$ independently of the ancestors;
  769we need no extra assumptions, and we create no extra answers.
  770\end{example}
  771
  772Similarly we translate rules of the form
  773\[h(-x-) \leftarrow b_1(-x_1-), b_2(-x_2-), ... ,b_N(-x_n-);\]
  774into
  775\begin{prolog}
  776$prove\_h(-x-, T, anc(P,N))$\IF
  777$prove\_b_1(-x_1-,T,anc([h(-x-)|P],N))$\AND
  778$...$\AND
  779$prove\_b_n(-x_n-,T,anc([h(-x-)|P],N)).$
  780\end{prolog}
  781
  782\begin{example} \em the rule
  783\begin{quote}
  784$gr(X,Y) \leftarrow f(X,Z), p(Z,Y)$
  785\end{quote}
  786gets translated into
  787\begin{prolog}
  788$prove\_gr(X,Y,D,anc(H,I))$\IF
  789$prove\_f(X,Z,D,anc([gr(X,Y)|H],I))$\AND
  790$prove\_p(Z,Y,D,anc([gr(X,Y)|H],I)).$
  791\end{prolog}
  792That is, we can prove $gr$ if we can prove $f$ and $p$.
  793Having $gr(X,Y)$ in the ancestors means that we can prove $gr(X,Y)$
  794by assuming that $\neg gr(X,Y)$ and then proving $gr(X,Y)$.
  795\end{example}
  796
  797\begin{example} \em the fact
  798\begin{quote}
  799$father(randy,jodi)$
  800\end{quote}
  801gets translated into
  802\begin{quote}
  803$prove\_father(randy,jodi,\_,\_).$
  804\end{quote}
  805Thus we can prove $father(randy,jodi)$ for any explanation and
  806for any ancestors.
  807\end{example}
  808
  809Disjuncts in the source body (;) get mapped into Prolog's disjunction.
  810The answers and assumed hypotheses should be accumulated from
  811whichever branch was taken.
This is then executed without multiplying out subterms.
  813\begin{example} \em
  814The rule
  815\begin{quote}
  816$p(A) \leftarrow q(A),(r(A),s(A);t(A)),m(A).$
  817\end{quote}
  818gets translated into
  819\begin{tabbing}\hbox{2cm}\=\hbox{1cm}\=\kill
  820$prove\_p(A,B,anc(C,D)):-$\\
  821\>$prove\_q(A,B,anc([p(A)|C],D)),$\\
  822\>(\>$prove\_r(A,B,anc([p(A)|C],D)),$\\
  823\>\>$prove\_s(A,B,anc([p(A)|C],D))$\\
  824\>;\>$prove\_t(A,B,anc([p(A)|C],D))),$\\
  825\>$prove\_m(A,B,anc([p(A)|C],D)).$\\\\
  826
  827$ex\_p(A,ths(B,C,D,E),anc(F,G),ans(H,I)):-$\\
  828\>$ex\_q(A,ths(B,J,D,K),anc([p(A)|F],G),ans(H,L)),$\\
  829\>(\>$ex\_r(A,ths(J,M,K,N),anc([p(A)|F],G),ans(L,O)),$\\
  830\>\>$ex\_s(A,ths(M,P,N,Q),anc([p(A)|F],G),ans(O,R))$\\
  831\>;\>$ex\_t(A,ths(J,P,K,Q),anc([p(A)|F],G),ans(L,R))),$\\
  832\>$ex\_m(A,ths(P,C,Q,E),anc([p(A)|F],G),ans(R,I))$
  833\end{tabbing}
  834Note that $P$ is the resulting explanation from either executing
  835$r$ and $s$ or executing $t$ from the explanation $J$.
  836\end{example}
  837
  838\subsubsection{The Code to Compile Rules}
  839The following relation builds the desired structure for the bodies:
  840\[make\_bodies(B,T,[Ths,Anc,Ans],ProveB,ExB)\]
  841where $B$ is a disjunct/conjunct of literals (the body
  842of the rule), $T$ is a theory for the proving,
  843$Ths$ is a theory structure for explaining,
  844$Anc$ is an ancestor
  845structure (of form $anc(P,N)$), $Ans$ is an answer structure
  846(of form $ans(A0,A1)$). This procedure
  847makes $ProveB$ the body of forms $prove\_b_i$ (and $prove\_not\_b_i$),
  848and $ExB$ a body of the forms $ex\_b_i$.
  849
  850\index{make\_bodies}
  851\begin{verbatim} */
  852
  853make_bodies(A0, T, [Ths,Anc,Ans], ProveA, ExA) :- quietly(var_or_atomic(A0)),!,
  854   A = true_t(A0),
  855   !,
  856   new_lit("proven_", A, [T,Anc], ProveA),
  857   new_lit("ex_", A, [Ths,Anc,Ans], ExA).
  858
  859make_bodies((H,B), T, [ths(T1,T3,D1,D3), Anc, ans(A1,A3)],
  860                    (ProveH,ProveB), (ExH,ExB)) :-
  861   !,
  862   make_bodies(H,T,[ths(T1,T2,D1,D2),Anc,ans(A1,A2)],ProveH,ExH),
  863   make_bodies(B,T,[ths(T2,T3,D2,D3),Anc,ans(A2,A3)],ProveB,ExB).
  864
  865make_bodies((H;B),T,Ths,(ProveH;ProveB),(ExH;ExB)) :-
  866   !,
  867   make_bodies(H,T,Ths,ProveH,ExH),
  868   make_bodies(B,T,Ths,ProveB,ExB).
  869
  870make_bodies(n(A), T, [Ths,Anc,Ans], ProveA, ExA) :-
  871   !,
  872   new_lit("proven_not_", A, [T,Anc], ProveA),
  873   new_lit("ex_not_", A, [Ths,Anc,Ans], ExA).
  874
  875make_bodies(true,_,[ths(T,T,D,D),_,ans(A,A)],true,true) :- !.
  876make_bodies(unif(X,Y),_,[ths(T,T,D,D),_,ans(A,A)],
  877   unif(X,Y),unif(X,Y)) :-!.
  878make_bodies(A, T, [Ths,Anc,Ans], ProveA, ExA) :-
  879   !,
  880   new_lit("proven_", A, [T,Anc], ProveA),
  881   new_lit("ex_", A, [Ths,Anc,Ans], ExA).
  882
  883
  884/* \end{verbatim}
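
As a sketch of what {\em make\_bodies} produces (assuming the {\em features}
flag is not off, so that {\em new\_lit} keeps the extra arguments), a query
such as
\begin{verbatim}
?- make_bodies((f(X,Z),p(Z,Y)), T,
               [ths(T1,T2,D1,D2), anc(P,N), ans(A1,A2)],
               ProveB, ExB).
\end{verbatim}
would be expected to bind
\begin{verbatim}
ProveB = (proven_f(X,Z,T,anc(P,N)),
          proven_p(Z,Y,T,anc(P,N))),
ExB    = (ex_f(X,Z,ths(T1,T3,D1,D3),anc(P,N),ans(A1,A3)),
          ex_p(Z,Y,ths(T3,T2,D3,D2),anc(P,N),ans(A3,A2)))
\end{verbatim}
(with $T3$, $D3$ and $A3$ fresh intermediate variables), threading the theory
and answer structures through the conjunction in the way illustrated by the
$gr$ example above.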
  885
  886The procedure $rule(F,R)$ declares $R$ to be a fact
  887or constraint rule (depending on the value of $F$).
  888Constraints can only be used for proving;
  889facts can be used for explaining as well as proving.
  890$R$ is either a literal or of the form $if(H,B)$ where $H$ is a literal
  891and $B$ is a body.
  892
  893This $rule$ first checks to see whether we want sound unification and
then uses $drule(F,R)$ to declare the rule.
  895
  896$prolog\_cl(C)$ is a way of asserting to Prolog the clause $C$.
  897This can either be asserted or written to a file to be consulted
  898or compiled. The simplest form is to just assert $C$.
  899
  900$make\_anc(H)$ is a procedure which ensures that the ancestor search
  901is th_set up properly for $H$. It is described in section \ref{anc-section},
  902and can be ignored on first reading.
  903
  904\index{rule}
  905\index{drule}
  906\begin{verbatim} */
  907
  908
  909rule(F,R) :-
  910   th_flag((sound_unification,on)),!,
  911   make_sound(R,S),
  912   drule(F,S).
  913rule(F,R) :-
  914   drule(F,R).
  915
  916drule(F,if(H,B)) :-
  917   !,
  918   make_anc(H),
  919   make_bodies(H,T,[Ths,Anc,Ans],ProveH,ExH),
  920   form_anc(H,Anc,Newanc),
  921   make_bodies(B,T,[Ths,Newanc,Ans],ProveB,ExB),
  922   prolog_cl(proven,(ProveH:-ProveB)),
  923   ( F= (fact),
  924     prolog_cl(thr,(ExH:-ExB))
  925   ; F=constraint).
  926
  927drule(F,H) :-
  928   make_anc(H),
  929   make_bodies(H,T,[ths(T,T,D,D),_,ans(A,A)],ProveH,ExH),
  930   prolog_cl(proven,ProveH),
  931   ( F= 'fact',
  932     prolog_cl(thr,ExH)
  933   ; F='constraint').
  934
  935
  936/* \end{verbatim}
  937
  938$form\_anc(L,A1,A2)$ means that $A2$ is the ancestor form for
  939subgoal $L$ with previous ancestor form $A1$.
  940
  941\index{form\_anc}
  942\begin{verbatim} */
  943
  944
  945form_anc(n(G), anc(P,N), anc(P,[G|N])) :- !.
  946form_anc(G, anc(P,N), anc([G|P],N)).
  947
  948
  949/* \end{verbatim}
  950\subsection{Forming Contrapositives}
  951For both facts and constraints we convert the user
  952syntax into negation normal
  953form (section \ref{th_nnf}), form the contrapositives,
  954and declare these as rules.
  955
  956Note that here we choose an arbitrary ordering for the clauses
  957in the bodies of the contrapositive forms of the facts. No
  958attempt has been made to optimise this, although it is noted that some
  959orderings are more efficient than others (see for example \cite{smith86}
  960for a discussion of such issues).
  961
  962The declarations are as follows:
  963\index{declare\_fact}
  964\index{declare\_constraint}
  965\begin{verbatim} */
  966
  967
  968declare_fact(F) :-
  969   th_nnf(F,even,N),
  970   wdmsgl(th_nnf=N),
  971   rulify(fact,N).
  972
  973declare_constraint(C) :-
  974   th_nnf(C,even,N),
  975   rulify(constraint,N).
  976
  977
  978/* \end{verbatim}
  979
  980{\em th_nnf\/$($Wff,Parity,Nnf\/$)$} (section \ref{th_nnf})
  981means that {\em Nnf\/} is the negation normal form
  982of {\em Wff} if {\em Parity=even} and of $\neg${\em Wff}
  983if {\em Parity=odd}. Note that we {\em rulify} the normal form
  984of the negation of the formula.
  985
  986{\em rulify\/}$(H,N)$ where $H$ is
  987either ``{\em fact\/}'' or ``{\em constraint\/}''
  988and $N$ is the negation of a fact or constraint
  989in negation normal form (see section \ref{th_nnf}),
  990means that all rules which can be formed from $N$ (by allowing each
  991atom in $N$ being the head of some rule) should be declared as such.
  992\index{rulify}
  993\begin{verbatim} */
  994
  995
  996rulify(H,A,OUT):-
  997  th_set((th_asserts,on)),
  998  th_set((features,on)),
  999  b_setval(th_asserts,[]),
 1000  rulify(H,A),
 1001  th_reset((th_asserts)),
 1002  th_reset((features)),!,
 1003  b_getval(th_asserts,OUT).
 1004  
 1005
 1006rulify(H,A) :- quietly(var_or_atomic(A)),!,
 1007   rule(H,n(A)).
 1008
 1009rulify(H,(A,B)) :- !,
 1010   contrapos(H,B,A),
 1011   contrapos(H,A,B).
 1012
 1013rulify(H,(A;B)) :- !,
 1014   rulify(H,A),
 1015   rulify(H,B).
 1016
 1017rulify(H,n(A)) :- !,
 1018   rule(H,A).
 1019
 1020rulify(H,A) :-
 1021   rule(H,n(A)).
 1022
 1023
 1024/* \end{verbatim}
 1025
 1026$contrapos(H,D,T)$ where $H$ is either ``{\em fact\/}'' 
 1027or ``{\em constraint\/}'', and $(D,T)$ is (the negation of)
 1028a formula in negation normal form means that all rules
 1029which can be formed from $(D,T)$ with head of the rule coming from $T$
 1030should be formed.
 1031Think of $D$ as the literals for which the rules with them as heads
have been formed, and $T$ as those which remain to be used as the head of
 1033some rule.
 1034\index{contrapos}
 1035\begin{verbatim} */
 1036
 1037
 1038contrapos(H,D,A) :- quietly(var_or_atomic(A)),!,
 1039   rule(H,if(n(A),D)).
 1040
 1041contrapos(H,D, (L,R)) :- !,
 1042   contrapos(H,(R,D),L),
 1043   contrapos(H,(L,D),R).
 1044
 1045contrapos(H,D,(L;R)) :- !,
 1046   contrapos(H,D,L),
 1047   contrapos(H,D,R).
 1048
 1049contrapos(H,D,n(A)) :- !,
 1050   rule(H,if(A,D)).
 1051
 1052contrapos(H,D,A) :-
 1053   rule(H,if(n(A),D)).
 1054
 1055
 1056/* \end{verbatim}
 1057\begin{example} \em
 1058if we are to {\em rulify} the negation normal form
 1059\begin{quote}
 1060$n(p(A)),q(A),(r(A),s(A);t(A)),m(A)$
 1061\end{quote}
 1062we generate the following rule forms, which can then be given to {\em rule}
 1063\begin{quote}
 1064$p(A)\leftarrow q(A),(r(A),s(A);t(A)),m(A)$\\
 1065$n(q(A))\leftarrow (r(A),s(A);t(A)),m(A),n(p(A))$\\
 1066$n(r(A))\leftarrow s(A),m(A),q(A),n(p(A))$\\
 1067$n(s(A))\leftarrow r(A),m(A),q(A),n(p(A))$\\
 1068$n(t(A))\leftarrow m(A),q(A),n(p(A))$\\
 1069$n(m(A))\leftarrow (r(A),s(A);t(A)),q(A),n(p(A))$
 1070\end{quote}
 1071\end{example}
 1072\subsection{Sound Unification}
Sound unification works by checking for repeated variables in the left
hand side of a rule and then unifying them by hand. This idea was stolen from
 1075Stickel's implementation.
 1076
 1077\index{make\_sound}
 1078\index{rms}
 1079\begin{verbatim} */
 1080
 1081
 1082make_sound(if(H,B),if(NH,NB)) :- !,
 1083   rms(H,NH,[],_,B,NB).
 1084
 1085make_sound(H,NR) :-
 1086   rms(H,NH,[],_,true,NB),
 1087   (NB=true,NR=H;
 1088    NR=if(NH,NB)),!.
 1089
 1090rms(V,V1,L,L,B,(unif(V,V1),B)) :-
 1091   var(V),
 1092   id_member(V,L),!.
 1093rms(V,V,L,[V|L],B,B) :-
 1094   var(V),!.
 1095rms([H|T],[H1|T1],L1,L3,B1,B3) :- !,
 1096   rms(H,H1,L1,L2,B1,B2),
 1097   rms(T,T1,L2,L3,B2,B3).
 1098rms(A,A,L,L,B,B) :-
 1099   atomic(A),!.
 1100rms(S,S2,L1,L2,B1,B2) :-
 1101   S =.. L,
 1102   rms(L,LR,L1,L2,B1,B2),
 1103   S2 =.. LR.
 1104
 1105
 1106/* \end{verbatim}
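
For example, a rule whose head contains a repeated variable is rewritten so
that the second occurrence is unified explicitly (the predicate names are
invented; $X1$ stands for the fresh variable introduced):
\begin{verbatim}
?- make_sound(if(p(X,X), q(X)), S).
S = if(p(X,X1), (unif(X,X1), q(X)))
\end{verbatim}
so the repeated occurrence is checked with {\em unif}, which unifies with the
occurs check.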
 1107
 1108\index{unif}
 1109\index{appears\_in}
 1110\begin{verbatim} */
 1111
 1112unif(X,Y) :- unify_with_occurs_check(X,Y).
 1113/*
 1114unif(X,Y) :-
 1115   var(X), var(Y), X=Y,!.
 1116unif(X,Y) :-
 1117   var(X),!,
 1118   \+ appears_in(X,Y),
 1119   X=Y.
 1120unif(X,Y) :-
 1121   var(Y),!,
 1122   \+ appears_in(Y,X),
 1123   X=Y.
 1124unif(X,Y) :-
 1125   atomic(X),!,X=Y.
 1126unif([H1|T1],[H2|T2]) :- !,
 1127   unif(H1,H2),
 1128   unif(T1,T2).
 1129unif(X,Y) :-
 1130   \+ atomic(Y),
 1131   X=..XS,
 1132   Y=..YS,
 1133   unif(XS,YS).
 1134
 1135appears_in(X,Y) :-
 1136   var(Y),!,X==Y.
 1137appears_in(X,[H|T]) :- !,
 1138   (appears_in(X,H); appears_in(X,T)).
 1139appears_in(X,S) :-
 1140   \+ atomic(S),
 1141   S =.. L,
 1142   appears_in(X,L).
 1143*/
 1144
 1145/* \end{verbatim}
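
Since {\em unif} is defined in terms of {\em unify\_with\_occurs\_check},
a cyclic binding is rejected; for instance one would expect
\begin{verbatim}
?- unif(X, f(X)).
false.

?- unif(f(X,a), f(b,Y)).
X = b,
Y = a.
\end{verbatim}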
 1146\subsection{Possible Hypotheses}
 1147The other class of things we have to worry about is the class
 1148of possible hypotheses. As described in \cite{poole:lf}
 1149and outlined in section \ref{theorist},
 1150we only need worry about atomic possible hypotheses.
 1151
 1152If $d(-args-)$ is a possible hypothesis (default),
 1153then we want to form the target code as follows:
 1154
 1155\begin{enumerate}
 1156\item We can only prove a hypothesis if we have already assumed it:
 1157\begin{prolog}
 1158$prove\_d(-args-,Ths,Anc) $\IF
 1159$member(d(-args-),Ths).$
 1160\end{prolog}
 1161\item We can explain a default if we have already assumed it:
 1162\begin{prolog}
 1163$ex\_d(-args-,ths(T,T,D,D),Anc,ans(A,A)) $\IF
 1164$member(d(-args-),T).$
 1165\end{prolog}
 1166\item We can explain a hypothesis by assuming it,
 1167if it has no free variables, we have not
 1168already assumed it and it is consistent with everything assumed before:
 1169\begin{prolog} \em
 1170$ex\_d(-args-,ths(T,[d(-args-)|T],D,D),Anc,ans(A,A)) $\IF
 1171variable\_free$(d(-args-))$\AND
 1172$\backslash+(member(d(-args-),T))$\AND
 1173$\backslash+(prove\_not\_d(-args-,[d(-args-)|T],anc([],[]))).$
 1174\end{prolog}
 1175\item 
 1176If a hypothesis has free variables, it can be explained
 1177by adding it to the deferred defaults list (making no assumptions about
 1178its consistency; this will be checked at the end of the explanation phase):
 1179\begin{prolog}
$ex\_d(-args-,ths(T,T,D,[d(-args-)|D]),Anc,ans(A,A)) $\IF
 1181$\backslash+($variable\_free$(d(-args-))).$
 1182\end{prolog}
 1183\end{enumerate}
 1184
 1185The following compiles directly into such code:
 1186\index{declare\_default}
\begin{verbatim} */
 1188
 1189
 1190declare_default(D) :-
 1191   make_anc(D),
 1192   new_lit("proven_",D,[T,_],Pr_D),
 1193   prolog_cl(thr,(Pr_D :- member(D,T))),
 1194   new_lit("ex_",D, [ths(T,T,Defer,Defer), _, ans(A,A)], ExD),
 1195   prolog_cl(thr,(ExD :- member(D, T))),
 1196   new_lit("ex_",D, [ths(T,[D|T],Defer,Defer), _, ans(A,A)], ExDass),
 1197   new_lit("proven_not_",D, [[D|T],anc([],[])],Pr_not_D),
 1198   prolog_cl(thr,(ExDass :- variable_free(D), \+member(D,T), \+Pr_not_D)),
 1199   new_lit("ex_",D, [ths(T,T,Defer,[D|Defer]), _, ans(A,A)], ExDefer),
 1200   prolog_cl(thr,(ExDefer :- \+ variable_free(D))).
 1201
 1202
 1203/* \end{verbatim}
 1204
 1205\begin{example}\em
 1206The default
 1207\begin{quote} \em
 1208birdsfly$(A)$
 1209\end{quote}
 1210gets translated into \em
 1211\begin{prolog}
 1212prove\_birdsfly$(A,B,C):-$\\
 1213\>member$(\hbox{birdsfly}(A),B)$\\
 1214ex\_birdsfly$(A,ths(B,B,C,C),D,ans(E,E)):-$\\
 1215\>member$(\hbox{birdsfly}(A),B)$\\
 1216ex\_birdsfly$(A,ths(B,[\hbox{birdsfly}(A)|B],C,C),D,ans(E,E)):-$\\
 1217\>variable\_free$(\hbox{birdsfly}(A)) ,$\\
 1218\>$\backslash+$member$(\hbox{birdsfly}(A),B),$\\
 1219\>$\backslash+$prove\_not\_birdsfly$(A,[\hbox{birdsfly}(A)|B],anc([],[]))$\\
 1220ex\_birdsfly$(A,ths(B,B,C,[\hbox{birdsfly}(A)|C]),D,ans(E,E)):- $\\
 1221\>$\backslash+$variable\_free$(\hbox{birdsfly}( A))$
 1222\end{prolog}
 1223\end{example}
 1224\subsection{Relations defined in Prolog}
 1225We can define some relations to be executed in Prolog.
 1226This means that we can prove the $prove$ and $ex$ forms by calling 
 1227the appropriate Prolog definition.
 1228\index{declare\_prolog}
 1229\begin{verbatim} */
 1230            
 1231
 1232declare_prolog(G) :-
 1233   new_lit("ex_",G, [ths(T,T,D,D), _, ans(A,A)], ExG),
 1234   prolog_cl(thr,(ExG :- G)),
 1235   new_lit("proven_",G,[_,_],PrG),
 1236   prolog_cl(proven,(PrG :- G)).
 1237
 1238
 1239/* \end{verbatim}
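
For instance, a call such as {\em declare\_prolog}$(length(L,N))$ (an
invented example; any pure Prolog relation would do) would be expected to add
clauses of roughly the form
\begin{verbatim}
ex_length(L,N,ths(T,T,D,D),_,ans(A,A)) :- length(L,N).
proven_length(L,N,_,_) :- length(L,N).
\end{verbatim}
so that both proving and explaining $length(L,N)$ simply call the Prolog
definition, assuming no hypotheses and adding no answers.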
 1240
 1241\subsection{Explaining Observations}
 1242$expl(G,T0,T1,A)$ means that $T1$ is an explanation of $G$ or $A$ ($A$ being
 1243the alternate answers) from the facts given $T0$ is already assumed.
 1244$G$ is an arbitrary wff.
 1245\index{expl}
 1246\begin{verbatim} */
 1247
 1248
 1249expl(G,T0,T1,Ans) :-
 1250   make_ground(N),
 1251   declare_fact('<-'(newans(N,G) , G)),
 1252   ex_newans(N,G,ths(T0,T,[],D),anc([],[]),ans([],Ans)),
 1253   make_ground(D),
 1254   check_consis(D,T,T1).
 1255
 1256
 1257/* \end{verbatim}
 1258
 1259\index{check\_consis}
 1260\begin{verbatim} */
 1261
 1262
 1263check_consis([],T,T).
 1264check_consis([H|D],T1,T) :-
 1265   new_lit("proven_not_",H, [T1,anc([],[])], Pr_n_H),
 1266   \+ Pr_n_H,
 1267   check_consis(D,[H|T1],T).
 1268
 1269
 1270/* \end{verbatim}
 1271To obtain disjunctive answers we have to know if the negation of the top
 1272level goal is called. This is done by declaring the fact
$newans(N,G) \leftarrow G$ (where $N$ is a unique tag), and if we ever try to
 1274prove the negation of a top level goal, we add that instance to the
 1275list of alternate answers. In this implementation we also check
 1276that $G$ is not identical to a higher level goal. This removes most cases
 1277where we have a redundant assumption (i.e., when we are not gaining a new
 1278answer, but only adding redundant information).
 1279\index{ex\_not\_newans}
 1280\index{id\_anc}
 1281\begin{verbatim} */
 1282
 1283
:- dynamic ex_not_newans/5.
:- dynamic ex_newans/5.
:- dynamic proven_not_newans/4.

ex_not_newans(N,G,ths(T,T,D,D),anc(Pos,Neg),ans(A,[G|A])) :-
 1288   member(newans(N,_),Pos),
 1289   \+ id_anc(G,anc(Pos,Neg)).
 1290
 1291id_anc(n(G),anc(_,N)) :- !, id_member(G,N).
 1292id_anc(G,anc(P,_)) :- id_member(G,P).
 1293
 1294
 1295/* \end{verbatim}
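
As a sketch of how this is used (the goal and default names are invented for
illustration; the top level driver that reads the {\em explain} command is not
shown in this section), a call such as
\begin{verbatim}
?- expl(flies(tweety), [], Expl, Ans).
\end{verbatim}
would be expected to bind $Expl$ to the list of assumed defaults, for example
$[birdsfly(tweety)]$, and $Ans$ to any alternate disjunctive answers collected
(often the empty list).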
 1296
 1297\subsection{Ancestor Search} \label{anc-section}
 1298Our linear resolution
 1299theorem prover must recognise that a goal has been proven if
it unifies with the negation of an ancestor in the search tree. To do this, it keeps
 1301two lists of ancestors, one containing the positive (non negated)
 1302ancestors and the other the negated ancestors.
 1303When the ancestor search rules for predicate $p$ are defined, we assert
 1304{\em ancestor\_recorded(p)}, so that we do not attempt to redefine the
 1305ancestor search rules.
 1306\index{make\_ex\_anc}
 1307\index{th_flag,ancestor\_search}
 1308\index{th_flag,loop\_check}
 1309\begin{verbatim} */
 1310
 1311
:- dynamic ancestor_recorded/1.

make_anc(_) :-
 1314   th_flag((ancestor_search,off)),
 1315   th_flag((loop_check,off)),
 1316   th_flag((depth_bound,off)),
 1317   !.
 1318make_anc(Name) :-
 1319   call_u(ancestor_recorded(Name)),
 1320   !.
 1321make_anc(n(Goal)) :-
 1322   !,
 1323   make_anc(Goal).
 1324make_anc(Goal) :-
 1325   Goal =.. [Pred|Args],
 1326   same_length(Args,Nargs),
 1327   NG =.. [Pred|Nargs],
 1328   make_bodies(NG,_,[ths(T,T,D,D),anc(P,N),ans(A,A)],ProveG,ExG),
 1329   make_bodies(n(NG),_,[ths(T,T,D,D),anc(P,N),ans(A,A)],ProvenG,ExnG),
 1330   ( th_flag((loop_check,off))
 1331   ;
 1332     prolog_cl(proven,(ProveG :- id_member(NG,P),!,fail)),
 1333     prolog_cl(proven,(ProvenG :- id_member(NG,N),!,fail)),
 1334     prolog_cl(thr,(ExG :- id_member(NG,P),!,fail)),
 1335     prolog_cl(thr,(ExnG :- id_member(NG,N),!,fail))
 1336   ),
 1337   ( th_flag((ancestor_search,off))
 1338   ;
 1339     prolog_cl(proven,(ProveG :- member(NG,N))),
 1340     prolog_cl(proven,(ProvenG :- member(NG,P))),
 1341     prolog_cl(thr,(ExG :- member(NG,N))),
 1342     prolog_cl(thr,(ExnG :- member(NG,P)))
 1343   ),
 1344   ( th_flag((depth_bound,off)), !
 1345   ;
 1346     prolog_cl(thr,(ProveG :- (th_flag((depth_bound,MD))),
 1347            number(MD),length(P,LP),length(N,LN),LP+LN>MD,!,fail)),
 1348     prolog_cl(thr,(ProvenG :- (th_flag((depth_bound,MD))),
 1349            number(MD),length(P,LP),length(N,LN),LP+LN>MD,!,fail)),
 1350     prolog_cl(thr,(ExG :- (th_flag((depth_bound,MD))),
 1351            number(MD),length(P,LP),length(N,LN),LP+LN>MD,!,fail)),
 1352      prolog_cl(thr,(ExnG :- (th_flag((depth_bound,MD))),
 1353            number(MD),length(P,LP),length(N,LN),LP+LN>MD,!,fail))
 1354   ),
 1355   ain(ancestor_recorded(NG)),
 1356   !.
 1357
 1358
 1359/* \end{verbatim}
 1360
 1361\begin{example} \em
 1362If we do a call
 1363\begin{quote}
 1364make\_anc(gr(A,B))
 1365\end{quote}
 1366we create the Prolog clauses
 1367\begin{prolog}
 1368prove\_gr(A,B,C,anc(D,E))\IF
 1369member(gr(A,B),E).\\
 1370prove\_not\_gr(A,B,C,anc(D,E))\IF
 1371member(gr(A,B),D).\\
 1372ex\_gr(A,B,ths(C,C,D,D),anc(E,F),ans(G,G))\IF
 1373member(gr(A,B),F).\\
 1374ex\_not\_gr(A,B,ths(C,C,D,D),anc(E,F),ans(G,G))\IF
 1375member(gr(A,B),E).
 1376\end{prolog}
 1377This is only done once for the $gr$ relation.
 1378\end{example}
 1379
 1380\section{Interface}
 1381In this section a minimal interface is given. We try to give
 1382enough so that we can understand the conversion of the wff form
 1383into negation normal form and
 1384the parsing of facts and defaults. There is, of course,
 1385much more in any usable interface than described here.
 1386\subsection{Syntax Declarations}
 1387All of the declarations we use will be defined as operators.
1388This will allow us to use prefix and infix forms in our wffs, for extra readability.
1389Here we use standard Edinburgh-style operator declarations, which are
1390given in the spirit of being enough to make the rest of the description
1391self-contained.
 1392\begin{verbatim} */
 1393
 1394     
1395:- thread_local(t_l:th_flag_0/1).
1396:- op(1150,fx,'default').
1397:- op(1150,fx,'fact').
1398:- op(1150,fx,constraint).
1399:- op(1150,fx,prolog).
1400:- op(1150,fx,explain).
1401:- op(1150,fx,predict).
1402:- op(1150,fx,define).
1403:- op(1150,fx,th_set).
1404:- op(1150,fx,th_flag).
1405:- op(1150,fx,th_reset).
1406:- op(1150,fy,h).
1407:- op(1150,fx,thconsult).
1408:- op(1150,fx,thtrans).
1409:- op(1150,fx,thcompile).
1410:- op(1130,xfx,:).
1411
1412:- op(1120,xfx,==).
1413:- op(1120,xfx,equiv).
1414:- op(1110,xfx,<-).
1415:- op(1110,xfx,=>).
1416:- op(1100,xfy,or).
1417:- op(1000,xfy,and).
1418:- op(1000,xfy,&).
1419:- op(950,fy,~).
1420:- op(950,fy,not).
1421
 1422
 1423/* \end{verbatim}
 1424
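As a quick check (an illustrative query only, not part of the compiler), these
declarations make a wff read as an ordinary Prolog term whose principal functor is
its outermost connective; for instance {\tt and} binds tighter than {\tt =>}:
\begin{verbatim}
?- X = (a and b => c), X =.. [F|Args].
% F = (=>), Args = [a and b, c]
\end{verbatim}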
 1425
 1426\subsection{Converting to Negation Normal Form} \label{th_nnf}
 1427We want to convert an arbitrarily complex formula into a standard form
 1428called {\em negation normal form\/}. Negation normal form of a formula is
 1429an equivalent formula consisting of conjunctions and disjunctions of
 1430literals (either an atom or of the form $n(A)$ where $A$ is an atom).
 1431The relation defined here puts formulae into negation normal form
1432without multiplying out subterms.
 1433Usually we want to find the negation normal form of the negation of the
 1434formula, as this is the form suitable for use in the body of a rule.
 1435
 1436The predicate used is of the form
1437\[th\_nnf(Fla,Parity,Body)\]
 1438where
 1439\begin{description}
 1440\item[$Fla$] is a formula with input syntax
 1441\item[$Parity$] is either $odd$ or $even$ and denotes whether $Fla$ is
 1442in the context of an odd or even number of negations.
 1443\item[$Body$] is a tuple which represents the negation normal form
 1444of the negation of $Fla$
 1445if parity is even and the negation normal form of $Fla$ if parity is odd. 
 1446\end{description}
 1447\index{th_nnf}
 1448\begin{verbatim} */
 1449
 1450
 1451th_nnf_atom(F,odd,F).
 1452th_nnf_atom(F,even,n(F)).
 1453
 1454th_nnf(A, P, AA) :- (is_ftVar(A) ; \+ compound(A)),!,th_nnf_atom(A,P,AA).
 1455
 1456th_nnf((Y <- X), P,B) :- !,
 1457   th_nnf((X => Y), P,B).
 1458th_nnf((X == Y), P,B) :- \+ var(X), \+ var(Y),!,
 1459   th_nnf((X equiv Y), P,B).
 1460
 1461th_nnf('<=>'(X , Y), P,B) :- !,
 1462   th_nnf((X equiv Y), P,B).
 1463th_nnf((X & Y), P,B) :- !,
 1464   th_nnf((X and Y),P,B).
 1465th_nnf((X , Y), P,B) :- !,
 1466   th_nnf((X and Y),P,B).
 1467th_nnf((X ; Y), P,B) :- !,
 1468   th_nnf((X or Y),P,B).
 1469th_nnf(v(X , Y), P,B) :- !,
 1470   th_nnf((X or Y),P,B).
 1471th_nnf((~ X),P,B) :- !,
 1472   th_nnf((not X),P,B).
 1473th_nnf((- X),P,B) :- !,
 1474   th_nnf((not X),P,B).
 1475
 1476th_nnf((X => Y), P,B) :- !,
 1477   th_nnf((Y or not X),P,B).
 1478th_nnf((X equiv Y), P,B) :- !,
 1479   th_nnf(((Y or not X) and (X or not Y)),P,B).
 1480th_nnf((X and Y),P,B) :- !,
 1481   opposite_parity(P,OP),
 1482   th_nnf((not X or not Y),OP,B).
 1483th_nnf((X or Y),even,(XB,YB)) :- !,
 1484   th_nnf(X,even,XB),
 1485   th_nnf(Y,even,YB).
 1486th_nnf((X or Y),odd,(XB;YB)) :- !,
 1487   th_nnf(X,odd,XB),
 1488   th_nnf(Y,odd,YB).
 1489th_nnf((not X),P,B) :- !,
 1490   opposite_parity(P,OP),
 1491   th_nnf(X,OP,B).
 1492
 1493th_nnf(F,odd,F).
 1494th_nnf(n(F),even,F) :- !.
 1495th_nnf(F,even,n(F)).
 1496
 1497
 1498/* \end{verbatim}
 1499\index{opposite\_parity}
 1500\begin{verbatim} */
 1501
 1502
 1503opposite_parity(even,odd).
 1504opposite_parity(odd,even).
 1505
 1506
 1507/* \end{verbatim}
 1508
 1509\begin{example} \em
 1510the wff
 1511\begin{quote} \em
 1512(a or not b) and c $\Rightarrow$ d and (not e or f)
 1513\end{quote}
 1514with parity {\em odd} gets translated into
 1515\begin{quote}
1516$d,(n(e);f);n(a),b;n(c)$
 1517\end{quote}
 1518the same wff with parity {\em even} (i.e., the negation of the wff)
 1519has negation normal form:
 1520\begin{quote}
 1521$(n(d);e,n(f)),(a;n(b)),c$
 1522\end{quote}
 1523\end{example}
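The same example can be posed directly as a query (a sketch only; the exact
bracketing Prolog prints for the answer may differ):
\begin{verbatim}
?- th_nnf(((a or not b) and c => d and (not e or f)), odd, B).
% B = (d,(n(e);f) ; n(a),b ; n(c))

?- th_nnf(((a or not b) and c => d and (not e or f)), even, B).
% B = ((n(d) ; e,n(f)), (a;n(b)), c)
\end{verbatim}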
 1524
 1525\subsection{Theorist Declarations}
1526The following clauses define the Theorist declarations.
1527Essentially these operators just call the appropriate compiler
1528instruction. Retrying these commands does not produce further answers;
1529instead, the compiler assertions will be undone on backtracking.
 1530\index{fact}
 1531\begin{verbatim} */
 1532
 1533
 1534fact(F) :- declare_fact(F),!.
 1535
 1536
 1537/* \end{verbatim}
 1538
 1539The $default$ declaration makes the appropriate equivalences between the
 1540named defaults and the unnamed defaults.
 1541\index{default}
 1542\begin{verbatim} */
 1543
 1544
 1545default((N : H)):-
 1546   !,
 1547   declare_default(N),
 1548   declare_fact((H <-N)),
 1549   !.
 1550default( N ):-
 1551   declare_default(N),
 1552   !.
 1553
 1554
 1555/* \end{verbatim}
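For example, the usual Theorist ``birds fly'' knowledge base could be entered with
these declarations (a hypothetical example, relying on the {\em declare\_fact\/} and
{\em declare\_default\/} machinery defined earlier; it is not part of the compiler itself):
\begin{verbatim}
default birdsfly(X).
fact flies(X) <- bird(X) and birdsfly(X).
fact bird(tweety).
\end{verbatim}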
 1556\index{default}
 1557\begin{verbatim} */
 1558
 1559
 1560constraint(C) :-
 1561   declare_constraint(C),
 1562   !.
 1563
 1564                       
 1565/* \end{verbatim}
 1566The $prolog$ command says that the atom $G$ should be proven in the
 1567Prolog system. The argument of the $define$ statement is a Prolog
1568definition which should be asserted (N.B.\ it should be enclosed in
1569parentheses if it contains a ``{\tt :-}'').
 1570\index{prolog}
1571\begin{verbatim} */
 1574
 1575                                   
 1576prolog( G ) :-
 1577   declare_prolog(G).
 1578
 1579
 1580/* \end{verbatim}
 1581\index{define}
 1582\begin{verbatim} */
 1583
 1584
 1585define( G ):-
 1586   prolog_cl(proven,G).
 1587
 1588
 1589/* \end{verbatim}
 1590
1591The $explain$ command writes out all of the explanations found.
1592It finds one explanation, writes the answer, and then fails back into the
1593search so that the next answer is found; this failure-driven loop also leaves
1594the computation in an appropriate state when it terminates.
 1595\index{explain}
 1596\begin{verbatim} */
 1597
 1598
 1599% :- dynamic statistics/2.
 1600
 1601'explain'(G) :-
 1602   (th_flag((timing,on))),
 1603    statistics(runtime,_),
 1604    expl(G,[],D,A),
 1605    statistics(runtime,[_,Time]),
 1606    writeans(G,D,A),
 1607    format('took ~3d sec.~n~n',[Time]),
 1608    fail
 1609  ;
 1610    expl(G,[],D,A),
 1611    writeans(G,D,A),
 1612    fail.
 1613
 1614
 1615/* \end{verbatim}
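Continuing the hypothetical birds example above, a query such as the following
should print each explanation as it is found (a sketch only; the output is whatever
{\em writeans\/}, defined below, produces):
\begin{verbatim}
?- explain flies(tweety).
% Answer is flies(tweety)
% Theory is [birdsfly(tweety)]
\end{verbatim}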
 1616\index{writeans}
 1617\index{writedisj}
 1618\begin{verbatim} */
 1619
 1620
 1621writeans(G,D,A) :-
 1622   format('~nAnswer is ~p', [G]),
 1623   writedisj(A),
 1624   format('~nTheory is ~p~n', [D]),
 1625   !.
 1626
 1627writedisj([]).
 1628writedisj([H|T]) :-
 1629   writedisj(T),
 1630   format(' or ~p',[H]).
 1631
 1632
 1633/* \end{verbatim}
 1634
 1635\subsection{Prediction}
1636In \cite{poole:lf} we present a sceptical view of prediction
1637and argue that one should predict what is in every extension.
 1638The following theorem proved in \cite{poole:lf} gives us a hint
 1639as to how it should be implemented.
 1640\begin{theorem} \label{everythm}
 1641$g$ is not in every extension iff there exists a scenario $S$ such
 1642that $g$ is not explainable from $S$.
 1643\end{theorem}
 1644
 1645The intuition is that
 1646if $g$ is not in every extension then there is no reason to rule out
 1647$S$ (based on the information given) and so we should not predict $g$.
 1648                      
 1649We can use theorem \ref{everythm} to consider another way to view membership
 1650in every extension. Consider two antagonistic agents $Y$ and $N$ trying to
 1651determine whether $g$ should be predicted or not. $Y$ comes
 1652up with explanations of $g$, and $N$ tries to find where these explanations
 1653fall down (i.e., tries to find a scenario $S$ which is inconsistent with
1654all of $Y$'s explanations). $Y$ then tries to find an explanation of $g$
 1655given $S$.
 1656
 1657If $N$ cannot find a defeater for $Y$'s explanations then
 1658$g$ is in every extension, and if $Y$ cannot find an explanation from
 1659some $S$ constructed by $N$ then $g$ is not in every extension.
 1660                                                                       
1661The following code implements this, but (as we cannot implement
1662the coroutining needed above in Prolog) it may generate more
1663explanations of the goal than are needed. What we really want is for the
 1664first ``bagof'' to generate the explanations in a demand-driven fashion,
 1665and then just print the explanations needed.
 1666
 1667\index{predict}
 1668\begin{verbatim} */
 1669                     
 1670
 1671predict(( G )):-
 1672  bagof(E,expl(G,[],E,[]),Es),
 1673  predct(G,Es). 
 1674 
 1675predct(G,Es) :- 
 1676  simplify_expls(Es,SEs),
 1677    ( find_counter(SEs,[],S),
 1678      !,
 1679      format('No, ~q is not explainable from ~q.~n',[G,S])
 1680    ; format('Yes, ~q is in all extensions.~nExplanations are:~n',[G]),
 1681      list_scens(1,SEs)).
 1682
 1683
 1684/* \end{verbatim}
 1685
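With the same hypothetical birds knowledge base, and nothing from which
$\neg birdsfly(tweety)$ could be explained, prediction should go roughly as follows
(a sketch, not verified output):
\begin{verbatim}
?- predict flies(tweety).
% Yes, flies(tweety) is in all extensions.
% Explanations are:
% 1: [birdsfly(tweety)]
\end{verbatim}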
 1686\index{find\_counter}
 1687\begin{verbatim} */
 1688
 1689
 1690find_counter([],S,S).
 1691find_counter([E|R],S0,S2) :-
 1692   member(D,E),
 1693   expl2not(D,S0,S1),
 1694   find_counter(R,S1,S2).
 1695
 1696
 1697/* \end{verbatim}
 1698
 1699\index{list\_scens}
 1700\begin{verbatim} */
 1701
 1702
 1703list_scens(_,[]).
 1704list_scens(N,[H|T]) :-
 1705   format('~q: ~q.~n',[N,H]),
 1706   N1 is N+1,
 1707   list_scens(N1,T).
 1708
 1709
 1710/* \end{verbatim}
 1711
 1712$expl2not(G,T0,T1)$ is true if ground $\neg G$ is explainable starting from
 1713scenario $T0$, with resulting explanation $T1$. No disjunctive answers are
 1714formed.
1715\index{expl2not}
 1716\begin{verbatim} */
 1717
 1718
 1719expl2not(G,T0,T1) :-
 1720   new_lit("ex_not_",G,[ths(T0,T,[],D),anc([],[]),ans([],[])],ExG),
 1721   ExG,
 1722   make_ground(D),
 1723   check_consis(D,T,T1).
 1724
 1725
 1726/* \end{verbatim}
 1727
 1728\subsection{Simplifying Explanations}
1729\index{simplify\_expls}
 1730\begin{verbatim} */
 1731                                   
 1732
 1733simplify_expls([S],[S]).
 1734
 1735simplify_expls([H|T], S) :-
 1736   simplify_expls(T, TS),
 1737   mergeinto(H,TS,S).
 1738
 1739
 1740/* \end{verbatim}
 1741\index{mergeinto}
 1742\begin{verbatim} */
 1743
 1744
 1745mergeinto(L,[],[L]).
 1746
 1747mergeinto(L,[A|R],[A|R]) :-
 1748   instance_of(A,L),
 1749   !.
 1750
 1751mergeinto(L,[A|R],N) :-
 1752   instance_of(L,A),
 1753   !,
 1754   mergeinto(L,R,N).
 1755
 1756mergeinto(L,[A|R],[A|N]) :-
 1757   mergeinto(L,R,N).
 1758
 1759                         
 1760/* \end{verbatim}
 1761
 1762\index{instance\_of}
 1763\begin{verbatim} */
 1764
 1765
 1766instance_of(D,S) :-
 1767   remove_all(D,S,_).
 1768                           
 1769
 1770/* \end{verbatim}
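For example (illustrative queries only), an explanation is dropped when a subset of
it is already present, since the smaller explanation is the more general one:
\begin{verbatim}
?- simplify_expls([[a,b],[a]], S).
% S = [[a]]

?- instance_of([a], [a,b]).
% succeeds: every element of [a] occurs in [a,b]
\end{verbatim}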
 1771
 1772\subsection{File Handling}
1773To consult a Theorist file, use the command
 1774\begin{verse}
 1775{\bf thconsult} \em filename.
 1776\end{verse}
1777The following is the definition of {\em thconsult}. Basically we just
1778keep reading the file and executing the commands in it until end of file.
 1779\index{thconsult}
 1780\begin{verbatim} */
 1781
 1782th_read(T):- read_term(T,[module(snark_theorist),variable_names(Vs)]),b_setval('$variable_names',Vs).
 1783
 1784thconsult(( File )):-
 1785   current_input(OldFile),
 1786   open(File,read,Input),
 1787   set_input(Input),
 1788   th_read(T),
 1789   read_all(T),
 1790   set_input(OldFile).
 1791
 1792
 1793/* \end{verbatim}
 1794\index{read\_all}
 1795\begin{verbatim} */
 1796
 1797
1798:- meta_predicate(read_all(*)).
1799read_all(end_of_file) :- !.
 1800
 1801read_all(T) :-
 1802   ((th_flag(( asserting,on))),!; format('~n% ~p.~n',[T])),
 1803   (call(T) *-> true ; format('% Warning: ~p failed~n',[T])),
 1804   th_read(T2),
 1805   read_all(T2).
 1806                 
 1807
 1808/* \end{verbatim}
 1809
 1810{\em thtrans} is like the previous version, but the generated code is written
1811to a file. This code is neither loaded nor compiled.
 1812\index{thtrans}
 1813\begin{verbatim} */
 1814
 1815
 1816thtrans(( File )):-
 1817   string_codes(File, Fname),
 1818   append(Fname,`.pl`,Plfname),
 1819   name(Plfile, Plfname),
 1820   current_output(Oldoutput),
 1821   open(Plfile,write,Output),
 1822   set_output(Output),
 1823   thtrans2out(File),
 1824   close(Output),
 1825   set_output(Oldoutput),!.
 1826   
 1827
 1828thtrans2out(File):-
 1829   current_input(Oldinput),
 1830   open(File,read,Input),
 1831   set_input(Input),
 1832   format(':- dynamic contrapos_recorded/1.~n',[]),
 1833   format(':- style_check(- singleton).~n',[]),
 1834   format(':- style_check(- discontiguous).~n',[]),
 1835   (th_set((asserting,off))),
 1836   th_read(T),
 1837   read_all(T),
 1838   set_input(Oldinput),
 1839   th_reset(( asserting)),!.
 1840
 1841/* \end{verbatim}
1842To compile a Theorist file, use the command
1843\begin{verse}
1844{\bf thcompile} \em filename.
1845\end{verse}
 1846
1847{\em thcompile} translates the file to Prolog,
1848and the resulting Prolog code is then compiled using the Prolog compiler.
 1851\index{thcompile}
 1852\begin{verbatim} */
 1853
 1854
 1855thcompile(( File )):-
 1856   (thtrans(( File))),
 1857%   no_style_check(all),
 1858   compile(File).  
 1859
 1860
 1861/* \end{verbatim}
 1862
 1863
 1864\subsection{Flag Setting} \label{flags}
1865There are a number of Theorist options which can be set by flag declarations.
1866Flags which have not been set explicitly are treated as {\tt off}.
1867To set the flag $f$ to value $v$ you can issue the command
 1868\begin{verse}
 1869\bf th_set $f,v.$
 1870\end{verse}
1871To find out the value of the flag $f$, issue the command
 1872\begin{verse}
 1873\bf th_flag $f,V.$
 1874\end{verse}
1875You can reset the value of the flag $f$ to its old value by
 1876\begin{verse}
 1877\bf th_reset $f.$
 1878\end{verse}
 1879The list of all flags is given by the command
 1880\begin{verse}
 1881\bf flags.
 1882\end{verse}
 1883
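For example (illustrative queries only), to turn the loop check off, inspect its
value, and then restore the previous setting:
\begin{verbatim}
?- th_set((loop_check,off)).
?- th_flag((loop_check,V)).
% V = off
?- th_reset(loop_check).
\end{verbatim}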
1884The following are the definitions of these commands.
 1885\index{th_set}
 1886\begin{verbatim} */
 1887
 1888
 1889th_set((F,V)):-
 1890   call(call,assert((t_l:th_flag_0((F,V1)):- !,V=V1))),!.
 1891
 1892
 1893/* \end{verbatim}
 1894\index{th_flag}
 1895\begin{verbatim} */
 1896
 1897th_flag(NV):-t_l:th_flag_0(NV).
 1898th_flag((_,off)).
 1899
 1900
 1901/* \end{verbatim}
 1902\index{th_reset}
 1903\begin{verbatim} */
 1904
 1905
 1906th_reset(F) :-
 1907   ignore(call(call,retract((t_l:th_flag_0((F,_)) :- !,_=_)))).
 1908
 1909
 1910/* \end{verbatim}
 1911\index{flags}
 1912\begin{verbatim} */
 1913
 1914
 1915flags :- list_flags([asserting,ancestor_search,loop_check,
 1916                     depth_bound,sound_unification,timing]).
 1917list_flags([]).
 1918list_flags([H|T]) :-
 1919   (th_flag((H,V))),
 1920   format('th_flag((~w,~w)).~n',[H,V]),
 1921   list_flags(T).
 1922                          
 1923
 1924/* \end{verbatim}
 1925\subsection{Compiler Directives}
 1926There are some compiler directives which need to be added to Theorist
 1927code so that the Prolog to assembly language compiler can work
 1928(these are translated into the appropriate Quintus compiler directives).
 1929
1930So that the Quintus compiler can correctly compile the code,
1931we should declare as dynamic all predicates for which we can assert the goal
1932or its negation; this is done by the command
 1933\begin{verse}
 1934\bf dyn $n.$
 1935\end{verse}
 1936This need only be given in files,
 1937and should be given before the atomic symbol $n$ is ever used.
 1938
 1939The following gives the appropriate translation.
 1940Essentially we then must say that the appropriate Prolog code is dynamic.
1941\index{dyn}
 1942\begin{verbatim} */
 1943
 1944
1945:- op(1150,fx,(dyn)).
1946dyn(_) :-
 1947   (th_flag((asserting, on))),
 1948   !.
 1949dyn(n(G)) :-
 1950   !,
 1951   (dyn(G)).
 1952dyn(G):-
 1953   G =.. [R|Args],
 1954   add_prefix("ex_not_",R,ExNR),
 1955   add_prefix("ex_",R,ExR),
 1956   length(Args,NA),
 1957   ExL is NA + 3,
 1958   format(':- dynamic ~a/~d.~n',[ExNR,ExL]),
 1959   format(':- dynamic ~a/~d.~n',[ExR,ExL]),
 1960   add_prefix("proven_not_",R,PrNR),
 1961   add_prefix("proven_",R,PrR),
 1962   PrL is NA + 2,
 1963   format(':- dynamic ~a/~d.~n',[PrNR,PrL]),
 1964   format(':- dynamic ~a/~d.~n',[PrR,PrL]).
 1965
 1966                          
 1967/* \end{verbatim}
 1968
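For instance, when a file containing the binary relation $gr$ is being translated
(so that the {\em asserting\/} flag is off), the declaration {\tt dyn gr(\_,\_).}
should write the following directives to the generated file (a sketch, assuming
{\em add\_prefix\/} simply prefixes the relation name, as its name suggests):
\begin{verbatim}
:- dynamic ex_not_gr/5.
:- dynamic ex_gr/5.
:- dynamic proven_not_gr/4.
:- dynamic proven_gr/4.
\end{verbatim}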
 1969\subsection{Using the Compiled Rules}
1970The use of conditional asserting (prolog\_cl) is twofold.
1971The first is to write the clause to a file if that is desired.
1972The second is to act as a backtrackable assert otherwise.
 1973\index{prolog\_cl}
 1974\index{th_flag,asserting}
 1975\begin{verbatim} */
 1976
 1977contains_prove(Body):- \+ compound(Body),!,fail.
 1978contains_prove((_ :- Body)):- !,contains_prove(Body). 
 1979contains_prove((Body,_)):-!,contains_prove(Body).
 1980contains_prove((Body;_)):-!,contains_prove(Body).
 1981contains_prove(PBody):- PBody =..[_,Body],!,\+ contains_prove(Body),!. 
 1982contains_prove(Body):- functor(Body,Was,_),atom_concat('prove',_,Was).
 1983
 1984% print_clause(Body) :- \+ contains_prove(Body),!. 
 1985print_clause(pfclog(_)) :-!.
 1986print_clause(boxlog(C)) :-!, print_clause(C).
 1987print_clause(C) :- fmt9(C).
 1988/*
 1989print_clause(C) :- 
 1990   \+ \+ (    
 1991     tnumbervars(C,0,_),
 1992     writeq(C),
 1993     write('.'),
 1994     nl).
 1995*/
 1996
 1997prolog_cl(proven,C):-!,prolog_cl(C).
 1998prolog_cl(thr,C):-!,prolog_cl(C).
 1999prolog_cl(_,C):-!,prolog_cl(C).
 2000
 2001prolog_cl(C) :- 
 2002  %  th_flag((th_asserts,on)),!,
 2003   b_getval(th_asserts,List),
 2004   b_setval(th_asserts,[C|List]),!.
 2005
 2006prolog_cl(C) :-  
 2007   print_clause(C),th_flag((asserting,off)), \+ th_flag((th_asserts,on)),!.
 2008prolog_cl(C) :- 
 2009   mpred_ain(C).
 2010prolog_cl(C) :-
 2011   mpred_retract(C),
 2012   fail.
 2013
 2014
 2015/* \end{verbatim}
2016$prolog\_decl$ is like the above predicate, but the clause is both
2017written to the file and asserted.
 2018\index{prolog\_decl}
 2019\begin{verbatim} */
 2020                     
 2021
 2022prolog_decl(proven,C):-!,prolog_decl(boxlog(C)).
 2023prolog_decl(thr,C):-!,prolog_decl(pfclog(C)).
 2024prolog_decl(C) :-
 2025   th_flag((asserting,off)),
 2026   trace,print_clause(C),
 2027   fail.
 2028prolog_decl(C) :-
 2029   aina(C).
 2030prolog_decl(C) :-
 2031   mpred_retract(C),
 2032   fail.          
 2033              
 2034
 2035/* \end{verbatim}
 2036\subsection{Saving Theorist}
2037The command ``save'' automagically saves the state of the Theorist code
2038as the command ``theorist''. This is normally done straight after compiling this
 2039file.
 2040\index{save}
 2041\begin{verbatim} */
 2042
 2043
 2044save :-
 2045   call(call,quintus:save_program(th,
 2046   format('~nWelcome to THEORIST 1.1.1  (4 December 89 version)
 2047For help type ``h.''.
 2048Any Problems see David Poole (poole@cs.ubc.ca)~n',
 2049  []))).
 2050
 2051
 2052/* \end{verbatim}
 2053\section{Utility Functions}
 2054\subsection{List Predicates}
 2055$append(X,Y,Z)$ is the normal append function
 2056\index{append}
 2057\begin{verbatim} 
 2058append([],L,L).
 2059
 2060append([H|X],Y,[H|Z]) :-
 2061   append(X,Y,Z).
 2062\end{verbatim}
 2063
 2064\index{member}
 2065\begin{verbatim} */
 2066
 2067/*
 2068member(A,[A|_]).
 2069member(A,[_|R]) :-
 2070   member(A,R).
 2071*/
 2072
 2073/* \end{verbatim}
 2074
 2075$id\_member(X,L)$ is true if $X$ is identical to some member of list $L$.
 2076\index{id\_member}
 2077\begin{verbatim} */
 2078
 2079
 2080id_member(A,[B|R]) :-
 2081   (A==B->!;id_member(A,R)).
 2082
 2083
 2084/* \end{verbatim}
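Unlike {\em member\/}, {\em id\_member\/} does not bind the element it finds
(illustrative queries only):
\begin{verbatim}
?- id_member(f(X), [f(a), f(X)]).   % succeeds on the second element; X stays unbound
?- id_member(f(a), [f(X)]).         % fails: f(a) and f(X) unify but are not identical
\end{verbatim}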
 2085
 2086\index{same\_length}
 2087\begin{verbatim} */
 2088
 2089
 2090/*same_length([],[]).
 2091same_length([_|L1],[_|L2]) :-
 2092   same_length(L1,L2).
 2093*/
 2094
 2095
 2096/* \end{verbatim}
 2097
 2098\index{remove}
 2099\begin{verbatim} */
 2100
 2101
 2102remove(A,[A|B],B).
 2103remove(A,[H|T],[H|R]) :-
 2104   remove(A,T,R).
 2105
 2106
 2107/* \end{verbatim}
 2108
 2109\index{remove\_all}
 2110\begin{verbatim} */
 2111
 2112
 2113remove_all([],L,L).
 2114remove_all([H|T],L,L2) :-
 2115   remove(H,L,L1),
 2116   remove_all(T,L1,L2).
 2117
 2118
 2119/* \end{verbatim}
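For example (illustrative queries only), {\em remove\/} deletes one occurrence of an
element, and {\em remove\_all\/} deletes one occurrence of each element of its first
argument:
\begin{verbatim}
?- remove(b, [a,b,c], R).
% R = [a,c]

?- remove_all([a,c], [a,b,c], R).
% R = [b]
\end{verbatim}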
 2120
 2121\subsection{Looking at Terms}
 2122\index{variable\_free}
 2123\begin{verbatim} */
 2124
 2125variable_free(X) :- ground(X).
 2126/*
 2127variable_free(X) :-
 2128   atomic(X),
 2129   !.
 2130variable_free(X) :-
 2131   var(X),
 2132   !,
 2133   fail.
 2134variable_free([H|T]) :-
 2135   !,
 2136   variable_free(H),
 2137   variable_free(T).
 2138variable_free(X) :-
 2139   X =.. Y,
 2140   variable_free(Y).
 2141*/
 2142
 2143/* \end{verbatim}
 2144
 2145\index{make\_ground}
 2146\begin{verbatim} */
 2147
 2148
 2149make_ground(X) :-
 2150   flag(groundseed,N,N),
 2151   tnumbervars(X,N,NN),
 2152   flag(groundseed,N,NN).
 2153
 2154/*
 2155:- dynamic groundseed/1.
 2156groundseed(26).
 2157*/
2158:- flag(groundseed,_,26).
 2160
 2161/* \end{verbatim}
 2162
 2163\index{reverse}
 2164\begin{verbatim} */
 2165
 2166
 2167reverse([],T,T).
 2168reverse([H|T],A,B) :-
 2169   reverse(T,A,[H|B]).
 2170
 2171
 2172/* \end{verbatim}
 2173                                                     
 2174\subsection{Help Commands}
 2175\index{h}
 2176\begin{verbatim} */
 2177
 2178
 2179(h) :- format('This is Theorist 1.1 (all complaints to David Poole)
 2180For more details issue the command:
 2181   h H.
 2182where H is one of:~n',
 2183[]),
 2184   unix(system('ls /faculty/poole/theorist/help')).
 2185
 2186(h(( H))) :- !,
 2187   add_prefix("more /faculty/poole/theorist/help/",H,Cmd),
 2188   unix(system(Cmd)).
 2189                              
 2190
 2191
 2192/* \end{verbatim}
 2193
 2194\subsection{Runtime Considerations}
 2195What is given here is the core part of our current implementation of
 2196Theorist.
 2197This code has been used with Waterloo Unix Prolog, Quintus Prolog,
 2198C-prolog and Mac-Prolog.
2199For those Prologs with compilers we can actually compile the resulting
2200code from this translator as we could any other Prolog code;
2201this makes it very fast indeed.
 2202
2203When the Theorist code is in the form of definite clauses
2204(the only case where a comparison makes sense,
2205as this is what the two systems have in common), the resulting code runs at about a quarter
2206of the speed
 2207of the corresponding interpreted or compiled code of the underlying
 2208Prolog system. About half of this extra cost is
 2209for the extra arguments to unify,
 2210and the other factor is for one membership
 2211of an empty list for each procedure call.
 2212For each procedure call we do one extra Prolog call which immediately fails.
 2213For the definite clause case, the contrapositive of the clauses are never used.
 2214\section{Conclusion}
 2215This paper has described in detail how we can translate Theorist code into
2216Prolog so that we can use the advances in Prolog implementation technology.
 2217
 2218As far as this compiler is concerned there are a few issues which
 2219arise:
 2220\begin{itemize}                      
 2221\item Is there a more efficient way to determine that a goal can succeed because
2222it unifies with the complement of an ancestor \cite{poole:grace,loveland87}?
 2223\item Can we incorporate a cycle check that has a low overhead?
 2224A simple, but expensive, version is implemented in some versions of
 2225our compiler which checks for identical ancestors.
2226\item Are there optimal orderings in which we can put the compiled
2227clauses so that we get answers most quickly \cite{smith86}?
 2228At the moment the compiler just puts the elements of the bodies
 2229in an arbitrary ordering. The optimal ordering depends, of course,
 2230on the underlying control structure.
 2231\item Are there better ways to do the consistency checking when there are
 2232variables in the hypotheses?
 2233\end{itemize}
 2234
 2235
 2236We are currently working on many applications of default and abductive
 2237reasoning.
 2238Hopefully with compilers based on the ideas presented in this paper
 2239we will be able to take full advantage of
 2240advances in Prolog implementation technology while still allowing
 2241flexibility in specification of the problems to be solved.
 2242\section*{Acknowledgements}
 2243This work could not have been done without the ideas,
 2244criticism and feedback from Randy Goebel, Eric Neufeld,
 2245Paul Van Arragon, Scott Goodwin and Denis Gagn\'e.
 2246Thanks to Brenda Parsons and Amar Shan for valuable comments on
 2247a previous version of this paper.
 2248This research was supported under NSERC grant A6260.
 2249\begin{thebibliography}{McDer80}
 2250\bibitem[Brewka86]{brewka86}
 2251G.\ Brewka,
 2252``Tweety -- Still Flying: Some Remarks on Abnormal Birds, Applicable Rules
 2253and a Default Prover'',
 2254{\em Proc.\ AAAI-86}, pp.\ 8-12.
 2255
 2256\bibitem[Chang73]{chang}
 2257C-L.\ Chang and R.\ C-T.\ Lee,
 2258{\em Symbolic Logic and Mechanical Theorem Proving},
 2259Academic Press, 1973.
 2260
 2261\bibitem[Cox82]{cox82}
 2262P.\ T.\ Cox, {\em Dependency-directed backtracking for Prolog Programs}.
 2263
 2264\bibitem[Cox87]{cox87}
 2265P.\ T.\ Cox and T.\ Pietrzykowski, {\em General Diagnosis by Abductive
 2266Inference}, Technical report CS8701, School of Computer Science,
 2267Technical University of Nova Scotia, April 1987.
 2268
 2269\bibitem[Dincbas87]{dincbas}
 2270M.~Dincbas, H.~Simonis and P.~Van Hentenryck,
 2271{\em Solving Large Combinatorial Problems in Logic Programming\/},
 2272ECRC Technical Report, TR-LP-21, June 1987.
 2273
 2274\bibitem[Doyle79]{doyle79}
 2275J.\ Doyle,
 2276``A Truth Maintenance System'',
 2277{\em Artificial Intelligence},
 2278Vol.\ 12, pp 231-273.
 2279
 2280\bibitem[de Kleer86]{dekleer86}
 2281J.\ de Kleer,
 2282``An Assumption-based TMS'',
 2283{\em Artificial Intelligence, Vol.\ 28, No.\ 2}, pp.\ 127-162.
 2284
 2285\bibitem[Edmonson87]{edmonson}
 2286R.~Edmonson, ????
 2287
 2288\bibitem[Enderton72]{enderton}
 2289H.\ B.\ Enderton, {\em A Mathematical Introduction to Logic},
 2290Academic Press, Orlando.
 2291
 2292\bibitem[Genesereth87]{genesereth87}
 2293M.\ Genesereth and N.\ Nilsson,
 2294{\em Logical Foundations of Artificial Intelligence},
 2295Morgan-Kaufmann, Los Altos, California.
 2296
 2297\bibitem[Ginsberg87]{ginsberg87}
 2298M.~L.~Ginsberg, {\em Computing Circumscription\/},
 2299Stanford Logic Group Report Logic-87-8, June 1987.
 2300
 2301\bibitem[Goebel87]{goebel87}
 2302R.\ G.\ Goebel and S.\ D.\ Goodwin,
 2303``Applying theory formation to the planning problem''
 2304in F.\ M.\ Brown (Ed.),
 2305{\em Proceedings of the 1987 Workshop on The Frame Problem in Artificial
 2306Intelligence}, Morgan Kaufmann, pp.\ 207-232.
 2307
 2308\bibitem[Kowalski79]{kowalski}
 2309R.~Kowalski, ``Algorithm = Logic + Control'',
 2310{\em Comm.~A.C.M.} Vol 22, No 7, pp.~424-436.
 2311
 2312\bibitem[Lifschitz85]{lifschitz85}
 2313V.~Lifschitz, ``Computing Circumscription'',
 2314{\em Proc.~IJCAI85}, pp.~121-127.
 2315
 2316\bibitem[Lloyd87]{lloyd}
 2317J.~Lloyd, {\em Foundations of Logic Programming},
 2318Springer-Verlag, 2nd Edition.
 2319
 2320\bibitem[Loveland78]{loveland78}
 2321D.~W.~Loveland, {\em Automated Theorem Proving: a logical basis},
 2322North-Holland, Amsterdam.
 2323
 2324\bibitem[Loveland87]{loveland87}
 2325D.~W.~Loveland, ``Near-Horn Logic Programming'',
 2326{\em Proc. 6th International Logic Programming Conference}.
 2327
 2328\bibitem[McCarthy86]{mccarthy86}
 2329J.\ McCarthy, ``Applications of Circumscription to Formalising
 2330Common Sense Knowledge'', {\em Artificial Intelligence}, Vol.\ 28, No.\ 1,
 2331pp.\ 89-116.
 2332
 2333\bibitem[Moto-Oka84]{pie}
 2334T.~Moto-Oka, H.~Tanaka, H.~Aida, k.~Hirata and T.~Maruyama,
 2335``The Architecture of a Parallel Inference Engine --- PIE'',
 2336{\em Proc.\ Int.\ Conf.\ on Fifth Generation Computing Systems},
 2337pp.~479-488.
 2338
 2339\bibitem[Naish86]{naish86}
 2340L.~Naish, ``Negation and Quantifiers in NU-PROLOG'',
 2341{\em Proc.~3rd Int.\ Conf.\ on Logic Programming},
 2342Springer-Verlag, pp.~624-634.
 2343
 2344\bibitem[Neufeld87]{neufeld87}
 2345E.\ M.\ Neufeld and D.\ Poole,
 2346``Towards solving the multiple extension problem:
 2347combining defaults and probabilities'',
 2348{\em Proc.\ Third AAAI Workshop on Reasoning with Uncertainty},
 2349Seattle, pp.\ 305-312.
 2350
 2351\bibitem[Poole84]{poole:clausal}
 2352D.\ L.\ Poole,
 2353``Making Clausal theorem provers Non-clausal'',
 2354{\em Proc.\ CSCSI-84}. pp.~124-125.
 2355
 2356\bibitem[Poole86]{poole:grace}
 2357D.\ L.\ Poole,
 2358``Gracefully adding Negation to Prolog'',
 2359{\em Proc.~Fifth International Logic Programming Conference},
 2360London, pp.~635-641.
 2361
 2362\bibitem[Poole86]{poole:dd}
 2363D.\ L.\ Poole,
 2364``Default Reasoning and Diagnosis as Theory Formation'',
 2365Technical Report, CS-86-08, Department of Computer Science,
 2366University of Waterloo, March 1986.
 2367
 2368\bibitem[Poole87a]{poole:vars}
 2369D.\ L.\ Poole,
 2370``Variables in Hypotheses'', 
 2371{\em Proc.~IJCAI-87}, pp.\ 905-908.
 2372
 2373\bibitem[Poole87b]{poole:dc}
 2374D.\ L.\ Poole,
 2375{\em Defaults and Conjectures: Hypothetical Reasoning for Explanation and
 2376Prediction}, Research Report CS-87-54, Department of
 2377Computer Science, University of Waterloo, October 1987, 49 pages.
 2378
 2379\bibitem[Poole88]{poole:lf}
 2380D.\ L.\ Poole,
 2381{\it A Logical Framework for Default Reasoning},
 2382to appear {\em Artificial Intelligence}, Spring 1987.
 2383
 2384\bibitem[PGA87]{pga}
 2385D.\ L.\ Poole, R.\ G.\ Goebel and R.\ Aleliunas,
 2386``Theorist: A Logical Reasoning System for Defaults and Diagnosis'',
 2387in N. Cercone and G. McCalla (Eds.)
 2388{\it The Knowledge Frontier: Essays in the Representation of
 2389Knowledge},
2390Springer-Verlag, New York, 1987, pp.\ 331-352.
 2391
 2392\bibitem[Reiter80]{reiter80}
 2393R.\ Reiter,
 2394``A Logic for Default Reasoning'',
 2395{\em Artificial Intelligence},
 2396Vol.\ 13, pp 81-132.
 2397                      
 2398\bibitem[Smith86]{smith86}
 2399D.~Smith and M.~Genesereth,
 2400``Ordering Conjunctive Queries'',
 2401{\em Artificial Intelligence}.
 2402
 2403\bibitem[Van Hentenryck87]{vanh}
 2404P.\ Van Hentenryck,
2405``A Framework for Consistency Techniques in Logic Programming'',
2406{\em Proc.\ IJCAI-87}, Milan, Italy.
 2407\end{thebibliography}
 2408\printindex
 2409\end{document}
 2410*/
 2411
 2412tnumbervars(Term, N, M):- numbervars(Term, N, M).
 2413/*
 2414tnumbervars(Term, N, Nplus1) :-
 2415  var(Term), !,
 2416  Term = var/N,
 2417  Nplus1 is N + 1.
 2418tnumbervars(Term, N, M) :-
 2419  Term =.. [_|Args],
 2420  numberargs(Args,N,M).
 2421
 2422numberargs([],N,N) :- !.
 2423numberargs([X|L], N, M) :-
 2424  numberargs(X, N, N1),
 2425  numberargs(L, N1, M).      
 2426*/
 2427
2428:- th_set((asserting,off)).
2429:- th_set((_,off)).
2430:- fixup_exports.