First-order logic


Boilerplate abounds in programs that manipulate syntax trees. Consider a function transforming a specific kind of leaf node. With a typical tree data type, we must add recursive calls for every recursive data constructor. If we later add a recursive data constructor, we must update the function.

Or consider annotating a syntax tree. The most obvious way is to copy the syntax tree then add an extra annotation field for each data constructor. For example, compare the definitions of Expr and AnnExpr in GHC’s source code.

Paul Hai Liu showed me how to avoid such code duplication. The trick is to use recursion schemes along with certain helper functions. With GHC’s pattern synonyms extension, our code resembles ordinary recursion.

We demonstrate by building classic theorem provers for first-order logic, taking a whirlwind tour through chapters 2 and 3 of John Harrison, Handbook of Practical Logic and Automated Reasoning.

▶ Toggle extensions and imports

Recursion Schemes

We represent terms with the usual recursive data type. Terms consist of variables, constants, and functions. Constants are functions that take zero arguments.

data Term = Var String | Fun String [Term] deriving (Eq, Ord)

We use a recursion scheme for the formulas of first-order predicate logic. These are like propositional logic formulas, except:

  • An atomic proposition is a predicate: a string constant accompanied by a list of terms.

  • Subformulas may be quantified by a universal (Forall) or existential (Exists) quantifier. A quantifier binds a variable in the same manner as a lambda.

data Quantifier = Forall | Exists deriving (Eq, Ord)
data Formula a = FTop | FBot | FAtom String [Term]
  | FNot a | FAnd a a | FOr a a | FImp a a | FIff a a
  | FQua Quantifier String a
  deriving (Eq, Ord, Functor, Foldable, Traversable)

A data type akin to the fixpoint combinator powers the recursion:

data FO = FO (Formula FO) deriving (Eq, Ord)

Setting up pattern synonyms is worth the boilerplate:

pattern Atom s ts = FO (FAtom s ts)
pattern Top = FO FTop
pattern Bot = FO FBot
pattern Not p = FO (FNot p)
pattern p :/\ q = FO (FAnd p q)
pattern p :\/ q = FO (FOr p q)
pattern p :==> q = FO (FImp p q)
pattern p :<=> q = FO (FIff p q)
pattern Qua q x p = FO (FQua q x p)

Next, functions to aid recursion. I chose the name unmap because its type seems to be the inverse of fmap.

bifix :: (a -> b) -> (b -> a) -> a
bifix f g = g $ f $ bifix f g
ffix :: Functor f => ((f a -> f b) -> a -> b) -> a -> b
ffix = bifix fmap
unmap :: (Formula FO -> Formula FO) -> FO -> FO
unmap h (FO t) = FO (h t)
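
To see the machinery in isolation, here is a minimal standalone sketch of the same trick on a toy arithmetic functor. The names Expr, E, unmapE, eval, and example are ours, invented for this illustration; only bifix and ffix come from the article.

```haskell
{-# LANGUAGE DeriveFunctor, DeriveFoldable, DeriveTraversable, LambdaCase #-}

-- A tiny functor playing the role of Formula, and its fixpoint E.
data Expr a = Lit Int | Add a a | Mul a a
  deriving (Functor, Foldable, Traversable)

newtype E = E (Expr E)

bifix :: (a -> b) -> (b -> a) -> a
bifix f g = g $ f $ bifix f g

ffix :: Functor f => ((f a -> f b) -> a -> b) -> a -> b
ffix = bifix fmap

-- The analogue of unmap for this functor.
unmapE :: (Expr E -> Expr E) -> E -> E
unmapE h (E t) = E (h t)

-- 'h' evaluates all children, so each case only combines results;
-- adding a new binary operator needs just one more line here.
eval :: E -> Int
eval = ffix $ \h -> \case
  E t -> case h t of
    Lit n -> n
    Add x y -> x + y
    Mul x y -> x * y

-- 2 + 3 * 4
example :: E
example = E (Add (E (Lit 2)) (E (Mul (E (Lit 3)) (E (Lit 4)))))
```

Here `eval example` yields 14: laziness makes `bifix` well-defined, since `fmap eval` is only forced when `h` is applied to a node's children.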

Parsing and pretty-printing

Variables start with lowercase letters, while constants, functions, and predicates start with uppercase letters. We treat (<=) as an infix binary predicate for the sake of some of our examples below.

▶ Toggle parser and pretty-printer

Free variables

To find the free variables of a Term, we must handle every data constructor and make explicit recursive function calls. Fortunately, this data type only has two constructors: Var and Fun.

fvt :: Term -> [String]
fvt = \case
  Var x -> [x]
  Fun _ xs -> foldr union [] $ fvt <$> xs

Contrast this with the FO version. Thanks to our recursion scheme, we write two special cases for Atom and Qua, then a terse catch-all expression does the obvious thing for all other cases.

This includes, for example, recursively descending into both arguments of an AND operator. Furthermore, if we add more operators to Formula, this code handles them automatically.

fv :: FO -> [String]
fv = ffix \h -> \case
  Atom _ ts -> foldr union [] $ fvt <$> ts
  Qua _ x p -> delete x $ fv p
  FO t -> foldr union [] $ h t

Simplification

Thanks to pattern synonyms, recursion schemes are as easy as regular recursive data types.

Again, we write special cases for the formulas we care about, along with something perfunctory to deal with all other cases.

We unmap h before attempting rewrites because we desire bottom-up behaviour. For example, the inner subformula in \(\neg(x\wedge\bot)\) should first be rewritten to yield \(\neg\bot\) so that another rewrite rule can simplify this to \(\top\).

simplify :: FO -> FO
simplify = ffix \h fo -> case unmap h fo of
  Not (Bot) -> Top
  Not (Top) -> Bot
  Not (Not p) -> p
  Bot :/\ _ -> Bot
  _ :/\ Bot -> Bot
  Top :/\ p -> p
  p :/\ Top -> p
  Top :\/ _ -> Top
  _ :\/ Top -> Top
  Bot :\/ p -> p
  p :\/ Bot -> p
  _ :==> Top -> Top
  Bot :==> _ -> Top
  Top :==> p -> p
  p :==> Bot -> Not p
  p :<=> Top -> p
  Top :<=> p -> p
  Bot :<=> Bot -> Top
  p :<=> Bot -> Not p
  Bot :<=> p -> Not p
  Qua _ x p | x `notElem` fv p -> p
  t -> t

Negation normal form

A handful of laws change a simplified formula to negation normal form (NNF), namely, the formula consists only of literals (atoms or negated atoms), conjunctions, disjunctions, and quantifiers.

This time, the recursion is top-down. We unmap h after the rewrite.

nnf :: FO -> FO
nnf = ffix \h -> unmap h . \case
  p :==> q -> Not p :\/ q
  p :<=> q -> (p :/\ q) :\/ (Not p :/\ Not q)
  Not (Not p) -> p
  Not (p :/\ q) -> Not p :\/ Not q
  Not (p :\/ q) -> Not p :/\ Not q
  Not (p :==> q) -> p :/\ Not q
  Not (p :<=> q) -> (p :/\ Not q) :\/ (Not p :/\ q)
  Not (Qua Forall x p) -> Qua Exists x (Not p)
  Not (Qua Exists x p) -> Qua Forall x (Not p)
  t -> t

Substitution

Again we pit recursion schemes against plain old data structures. As before, the Term version must handle each case and its recursive calls are explicitly spelled out, while the FO version only handles the case it cares about, provides a generic catch-all case, and relies on ffix and unmap to recurse. They are about the same size despite FO having many more data constructors.

This time, for variety, we unmap h in the catch-all case. We could also place it just inside or outside the case expression as above. It is irrelevant whether the recursion is top-down or bottom-up because only leaves are affected.

tsubst :: (String -> Maybe Term) -> Term -> Term
tsubst f t = case t of
  Var x -> maybe t id $ f x
  Fun s as -> Fun s $ tsubst f <$> as

subst :: (String -> Maybe Term) -> FO -> FO
subst f = ffix \h -> \case
  Atom s ts -> Atom s $ tsubst f <$> ts
  t -> unmap h t

Skolemization

Skolemization transforms an NNF formula to an equisatisfiable formula with no existential quantifiers, that is, the output is satisfiable if and only if the input is. Skolemization is "lossy" because validity might not be preserved.

We may need to mint new function names along the way. To avoid name clashes, the functions helper returns all functions present in a given formula. It also returns the arity of each function because we need this later to enumerate ground terms.

It is possible to Skolemize a non-NNF formula, but since negations can go anywhere, we may as well remove existential quantifiers by converting them to universal quantifiers via duality, which preserves logical equivalence.

functions :: FO -> [(String, Int)]
functions = ffix \h -> \case
  Atom _ ts -> foldr union [] $ funcs <$> ts
  FO t -> foldr union [] $ h t
  where
  funcs = \case
    Var x -> []
    Fun f xs -> foldr union [(f, length xs)] $ funcs <$> xs

skolemize :: FO -> FO
skolemize t = evalState (skolem' $ nnf $ simplify t) (fst <$> functions t) where
  skolem' :: FO -> State [String] FO
  skolem' fo = case fo of
    Qua Exists x p -> do
      fns <- get
      let
        xs = fv fo
        f = variant ((if null xs then "C_" else "F_") <> x) fns
        fx = Fun f $ Var <$> xs
      put $ f:fns
      skolem' $ subst (`lookup` [(x, fx)]) p
    FO t -> FO <$> mapM skolem' t

Prenex normal form

We can pull all the quantifiers of an NNF formula to the front by generating new variable names. This is known as prenex normal form (PNF).

variant :: String -> [String] -> String
variant s vs
  | s `elem` vs = variant (s <> "'") vs
  | otherwise   = s

prenex :: FO -> FO
prenex = ffix \h t -> let
  recursed = unmap h t
  f qua s g = Qua qua z $ prenex $ g \x -> subst (`lookup` [(x, Var z)])
    where z = variant s $ fv recursed
  in case recursed of
  Qua Forall x p :/\ Qua Forall y q -> f Forall x \r -> r x p :/\ r y q
  Qua Exists x p :\/ Qua Exists y q -> f Exists x \r -> r x p :\/ r y q
  Qua qua x p :/\ q -> f qua x \r -> r x p :/\ q
  p :/\ Qua qua y q -> f qua y \r -> p :/\ r y q
  Qua qua x p :\/ q -> f qua x \r -> r x p :\/ q
  p :\/ Qua qua y q -> f qua y \r -> p :\/ r y q
  t -> t

pnf :: FO -> FO
pnf = prenex . nnf . simplify

Quantifier-free formulas

A quantifier-free formula is one where every variable is free. Each variable is implicitly universally quantified, that is, for each variable x, we behave as if forall x. had been prepended to the formula.

We can remove all quantifiers from a skolemized NNF formula by pulling all the universal quantifiers to the front and then dropping them.

Our specialize helper appears exactly as it would if FO were an ordinary recursive data structure. Plain recursion suits this function because we only want to transform the top of the tree.

deQuantify :: FO -> FO
deQuantify = specialize . pnf where
  specialize = \case
    Qua Forall x p -> specialize p
    t -> t

Ground terms

A ground term is a term containing no variables, that is, a term exclusively built from constants and functions.

We describe how to enumerate all possible terms given a set of constants and functions. For example, given X, Y, F(_,_), G(_), we want to generate something like:

X, Y, F(X,X), G(X), F(X,Y), F(Y,X), F(Y,Y), G(Y), F(G(X),G(Y)), ...

In general, there are infinitely many ground terms, but we can enumerate them in an order that guarantees any given term will eventually appear: start with the terms with no functions, namely constant terms, then those that contain exactly one function call, then those that contain exactly two function calls, and so on.

groundTerms cons funs n
  | n == 0 = cons
  | otherwise = concatMap
    (\(f, m) -> Fun f <$> groundTuples cons funs m (n - 1)) funs

groundTuples cons funs m n
  | m == 0 = if n == 0 then [[]] else []
  | otherwise = [h:t | k <- [0..n], h <- groundTerms cons funs k,
                 t <- groundTuples cons funs (m - 1) (n - k)]
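
As a standalone sanity check of the enumeration order, the sketch below counts terms over the constants X, Y and functions F (arity 2) and G (arity 1); the names cs and fs are ours. Terms with exactly one function call should number six: four F(_,_) combinations plus G(X) and G(Y).

```haskell
-- Minimal Term type and the enumeration functions from the article.
data Term = Var String | Fun String [Term] deriving (Eq, Show)

-- All ground terms containing exactly n function calls.
groundTerms :: [Term] -> [(String, Int)] -> Int -> [Term]
groundTerms cons funs n
  | n == 0 = cons
  | otherwise = concatMap
    (\(f, m) -> Fun f <$> groundTuples cons funs m (n - 1)) funs

-- All m-tuples of ground terms containing exactly n function calls in total.
groundTuples :: [Term] -> [(String, Int)] -> Int -> Int -> [[Term]]
groundTuples cons funs m n
  | m == 0 = if n == 0 then [[]] else []
  | otherwise = [h:t | k <- [0..n], h <- groundTerms cons funs k,
                 t <- groundTuples cons funs (m - 1) (n - k)]

cs :: [Term]
cs = [Fun "X" [], Fun "Y" []]

fs :: [(String, Int)]
fs = [("F", 2), ("G", 1)]
```

With these inputs, `groundTerms cs fs 0` has 2 elements and `groundTerms cs fs 1` has 6.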

Herbrand universe

The Herbrand universe of a formula is the set of ground terms made from all the constants and functions that appear in the formula, with one special case: if no constants appear, then we invent one to avoid an empty universe.

For example, the Herbrand universe of:

\[(P(y) \implies Q(F(z))) \wedge (P(G(x)) \vee Q(x))\]

is:

C, F(C), G(C), F(F(C)), F(G(C)), G(F(C)), G(G(C)), ...

We add the constant C because there was no constant to start with. Since P and Q are predicates and not functions, they are not part of the Herbrand universe.

herbTuples :: Int -> FO -> [[Term]]
herbTuples m fo
  | null funs = groundTuples cons funs m 0
  | otherwise = concatMap (reverse . groundTuples cons funs m) [0..]
  where
  (cs, funs) = partition ((0 ==) . snd) $ functions fo
  cons | null cs   = [Fun "C" []]
       | otherwise = flip Fun [] . fst <$> cs

We reverse the output of groundTuples because it happens to work better on a few test cases.

Automated Theorem Proving

It can be shown that a quantifier-free formula is satisfiable if and only if it is satisfiable under a Herbrand interpretation. Loosely speaking, we treat terms like the abstract syntax trees that represent them; if a theorem holds under some interpretation, then it also holds for syntax trees.

Why? Intuitively, given a formula and an interpretation where it holds, we can define a syntax tree based on the constants and functions of the formula, and rig predicates on these trees to behave enough like their counterparts in the interpretation.

For example, the formula \(\forall x . x + 0 = x\) holds under many familiar interpretations. Here’s a Herbrand interpretation:

data Term = Plus Term Term | Zero
eq :: (Term, Term) -> Bool
eq _ = True

For our next example we take a formula that holds under interpretations such as integer arithmetic:

\[\forall x . Odd(x) \iff \neg Odd(Succ(x))\]

Here’s a Herbrand interpretation:

data Term = Succ Term | C
odd :: Term -> Bool
odd = \case
  C -> False
  Succ x -> not $ odd x

This vital result suggests a strategy to prove any first-order formula f. As a preprocessing step, we prepend explicit universal quantifiers for each free variable:

generalize fo = foldr (Qua Forall) fo $ fv fo

Then:

  1. Negate \(f\) because validity and satisfiability are dual: the formula \(f\) is valid if and only if \(\neg f\) is unsatisfiable.

  2. Transform \(\neg f\) to an equisatisfiable quantifier-free formula \(t\). Let \(m\) be the number of variables in \(t\). Initialize \(h\) to \(\top\).

  3. Choose \(m\) elements from the Herbrand universe of \(t\).

  4. Let \(t'\) be the result of substituting the variables of \(t\) with these \(m\) elements. Compute \(h \leftarrow h \wedge t'\).

  5. If \(h\) is unsatisfiable, then \(t\) is unsatisfiable under any interpretation, hence \(f\) is valid. Otherwise, go to step 3.

We have moved from first-order logic to propositional logic; the formula \(h\) only contains ground terms which act as propositional variables when determining satisfiability. In other words, we have the classic SAT problem.

If the given formula is valid, then this algorithm eventually finds a proof, provided the method we use to choose ground terms eventually selects any given possibility. This is the case for our groundTuples function.

Gilmore

It remains to detect unsatisfiability. One of the earliest approaches (Gilmore 1960) transforms a given formula to disjunctive normal form (DNF):

\[\bigvee_i \bigwedge_j x_{ij}\]

where the \(x_{ij}\) are literals. For example: \((\neg a\wedge b\wedge c) \vee (d \wedge \neg e) \vee (f)\).

We represent a DNF formula as a set of sets of literals. Given an NNF formula, the function pureDNF builds an equivalent DNF formula:

distrib s1 s2 = S.map (uncurry S.union) $ S.cartesianProduct s1 s2

pureDNF = \case
  p :/\ q -> distrib (pureDNF p) (pureDNF q)
  p :\/ q -> S.union (pureDNF p) (pureDNF q)
  t -> S.singleton $ S.singleton t

Next, we remove conjunctions containing \(\bot\) or the positive and negative versions of the same literal, such as P and ~P. The formula is unsatisfiable if and only if nothing remains.

To reduce the formula size, we replace terms containing \(\top\) with the empty clause (the empty conjunction is \(\top\)), and drop clauses that are supersets of other clauses.

nono = \case
  Not p -> p
  p -> Not p

isPositive = \case
  Not _ -> False
  _ -> True

nontrivial lits = S.null $ S.intersection pos $ S.map nono neg
  where (pos, neg) = S.partition isPositive lits

simpDNF = \case
  Bot -> S.empty
  Top -> S.singleton S.empty
  fo -> let djs = S.filter nontrivial $ pureDNF $ nnf fo in
    S.filter (\d -> not $ any (`S.isProperSubsetOf` d) djs) djs
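
The refutation idea can be checked in a propositional miniature. Below, literals are nonzero Ints with negate as negation (our own encoding, not the article's FO type); distrib and nontrivial follow the definitions above, and the hypothetical `refuted` shows that (1 OR 2) AND (NOT 1) AND (NOT 2) leaves no nontrivial conjunction.

```haskell
import qualified Data.Set as S

-- A literal is a nonzero Int; negation is negate.
nono :: Int -> Int
nono = negate

-- A conjunction of literals is nontrivial when no literal appears
-- with both signs.
nontrivial :: S.Set Int -> Bool
nontrivial lits = S.null $ S.intersection pos $ S.map nono neg
  where (pos, neg) = S.partition (> 0) lits

-- Distribute: AND of two DNF formulas, each a set of sets of literals.
distrib :: Ord a => S.Set (S.Set a) -> S.Set (S.Set a) -> S.Set (S.Set a)
distrib s1 s2 = S.map (uncurry S.union) $ S.cartesianProduct s1 s2

-- (1 OR 2) AND (NOT 1 AND NOT 2): every resulting conjunction contains
-- a clashing pair, so filtering leaves nothing, i.e. unsatisfiable.
refuted :: Bool
refuted = S.null $ S.filter nontrivial
  $ distrib (S.fromList [S.fromList [1], S.fromList [2]])
            (S.fromList [S.fromList [-1, -2]])
```

Note `S.cartesianProduct` needs containers 0.5.11 or later.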

Now we fill in the other steps. Our main loop takes in 3 functions so we can later try out different approaches to detecting unsatisfiable formulas.


skno :: FO -> FO
skno = skolemize . nono . generalize

type Loggy = Writer ([String] -> [String])

output :: Show a => a -> IO ()
#ifdef ASTERIUS
output s = do
  out <- getElem "out"
  appendValue out $ show s <> "\n"
runThen cont wr = do
  let (a, w) = runWriter wr
  cb <- makeHaskellCallback $ stream cont (a, w [])
  js_setTimeout cb 0
foreign import javascript "wrapper" makeHaskellCallback :: IO () -> IO JSFunction
foreign import javascript "wrapper" makeHaskellCallback1 :: (JSObject -> IO ()) -> IO JSFunction
#else
output = print
runThen cont wr = do
  let (a, w) = runWriter wr
  mapM_ putStrLn $ w []
  cont a
#endif

herbrand conjSub refute uni fo = runThen output $ herbLoop (uni Top) [] herbiverse where
  qff = deQuantify . skno $ fo
  fvs = fv qff
  herbiverse = herbTuples (length fvs) qff
  t = uni qff
  herbLoop :: S.Set (S.Set FO) -> [[Term]] -> [[Term]] -> Loggy [[Term]]
  herbLoop h tried = \case
    [] -> error "invalid formula"
    (tup:tups) -> do
      tell (concat
        [ show $ length tried, " ground instances tried; "
        , show $ length h, " items in list"
        ]:)
      let h' = conjSub t (subst (`M.lookup` (M.fromList $ zip fvs tup))) h
      if refute h' then pure $ tup:tried else herbLoop h' (tup:tried) tups

gilmore = herbrand conjDNF S.null simpDNF where
  conjDNF djs0 sub djs = S.filter nontrivial (distrib (S.map (S.map sub) djs0) djs)

Davis-Putnam

The DPLL algorithm uses conjunctive normal form (CNF), which is the dual of DNF:

\[\bigwedge_i \bigvee_j x_{ij}\]
pureCNF = S.map (S.map nono) . pureDNF . nnf . nono

Constructing a CNF formula in this manner is potentially expensive, but at least we only pay the fee once. The main loop just piles on more conjunctions.

As with DNF, we simplify:

simpCNF = \case
  Bot -> S.singleton S.empty
  Top -> S.empty
  fo -> let cjs = S.filter nontrivial $ pureCNF fo in
    S.filter (\c -> not $ any (`S.isProperSubsetOf` c) cjs) cjs

We write DPLL functions and pass them to herbrand:

oneLiteral clauses = do
  u <- S.findMin <$> find ((1 ==) . S.size) (S.toList clauses)
  Just $ S.map (S.delete (nono u)) $ S.filter (u `S.notMember`) clauses

affirmativeNegative clauses
  | S.null oneSided = Nothing
  | otherwise       = Just $ S.filter (S.disjoint oneSided) clauses
  where
  (pos, neg') = S.partition isPositive $ S.unions clauses
  neg = S.map nono neg'
  posOnly = pos S.\\ neg
  negOnly = neg S.\\ pos
  oneSided = posOnly `S.union` S.map nono negOnly

dpll clauses
  | S.null clauses = True
  | S.empty `S.member` clauses = False
  | otherwise = rule1
  where
  rule1 = maybe rule2 dpll $ oneLiteral clauses
  rule2 = maybe rule3 dpll $ affirmativeNegative clauses
  rule3 = dpll (S.insert (S.singleton p) clauses)
       || dpll (S.insert (S.singleton $ nono p) clauses)
  pvs = S.filter isPositive $ S.unions clauses
  p = maximumBy (comparing posnegCount) $ S.toList pvs
  posnegCount lit = S.size (S.filter (lit `elem`) clauses)
                  + S.size (S.filter (nono lit `elem`) clauses)
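
A self-contained propositional sketch of the same three rules, again encoding literals as nonzero Ints with negate as negation (our own test encoding), makes the algorithm easy to exercise:

```haskell
import qualified Data.Set as S
import Data.List (find, maximumBy)
import Data.Ord (comparing)

type Clause = S.Set Int
type CNF = S.Set Clause

nono :: Int -> Int
nono = negate

-- Unit propagation: a singleton clause {u} forces u, so drop clauses
-- containing u and delete (nono u) from the rest.
oneLiteral :: CNF -> Maybe CNF
oneLiteral clauses = do
  u <- S.findMin <$> find ((1 ==) . S.size) (S.toList clauses)
  Just $ S.map (S.delete (nono u)) $ S.filter (u `S.notMember`) clauses

-- Pure-literal rule: satisfy literals that only ever occur with one sign.
affirmativeNegative :: CNF -> Maybe CNF
affirmativeNegative clauses
  | S.null oneSided = Nothing
  | otherwise       = Just $ S.filter (S.disjoint oneSided) clauses
  where
  (pos, neg') = S.partition (> 0) $ S.unions clauses
  neg = S.map nono neg'
  posOnly = pos S.\\ neg
  negOnly = neg S.\\ pos
  oneSided = posOnly `S.union` S.map nono negOnly

dpll :: CNF -> Bool
dpll clauses
  | S.null clauses = True
  | S.empty `S.member` clauses = False
  | otherwise = rule1
  where
  rule1 = maybe rule2 dpll $ oneLiteral clauses
  rule2 = maybe rule3 dpll $ affirmativeNegative clauses
  rule3 = dpll (S.insert (S.singleton p) clauses)
       || dpll (S.insert (S.singleton (nono p)) clauses)
  p = maximumBy (comparing posnegCount) $ S.toList $ S.map abs $ S.unions clauses
  posnegCount lit = length (S.filter (lit `S.member`) clauses)
                  + length (S.filter (nono lit `S.member`) clauses)
```

For instance, the clauses {1,2}, {-1}, {-2} are unsatisfiable while {1,2}, {-1,2} are satisfiable.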

davisPutnam = herbrand conjCNF (not . dpll) simpCNF where
  conjCNF cjs0 sub cjs = S.union (S.map (S.map sub) cjs0) cjs

Definitional CNF

We can efficiently translate any formula to an equisatisfiable CNF formula with a definitional approach. Logical equivalence may not be preserved, but only satisfiability matters, and in any case Skolemization may not preserve equivalence.

We need a variant of NNF that preserves equivalence:

nenf :: FO -> FO
nenf = nenf' . simplify where
  nenf' = ffix \h -> unmap h . \case
    p :==> q -> Not p :\/ q
    Not (Not p) -> p
    Not (p :/\ q) -> Not p :\/ Not q
    Not (p :\/ q) -> Not p :/\ Not q
    Not (p :==> q) -> p :/\ Not q
    Not (p :<=> q) -> p :<=> Not q
    Not (Qua Forall x p) -> Qua Exists x (Not p)
    Not (Qua Exists x p) -> Qua Forall x (Not p)
    t -> t

Then, for each node with two children, we mint a 0-ary predicate that acts as its definition:

satCNF fo = S.unions $ simpCNF p
  : map (simpCNF . uncurry (:<=>)) (M.assocs ds)
  where
  (p, (ds, _)) = runState (sat' $ nenf fo) (mempty, 0)
  sat' :: FO -> State (M.Map FO FO, Int) FO
  sat' = \case
    p :/\ q  -> def =<< (:/\)  <$> sat' p <*> sat' q
    p :\/ q  -> def =<< (:\/)  <$> sat' p <*> sat' q
    p :<=> q -> def =<< (:<=>) <$> sat' p <*> sat' q
    p        -> pure p
  def :: FO -> State (M.Map FO FO, Int) FO
  def t = do
    (ds, n) <- get
    case M.lookup t ds of
      Nothing -> do
        let v = Atom ("*" <> show n) []
        put (M.insert t v ds, n + 1)
        pure v
      Just v -> pure v

We define another DPLL prover using this definitional CNF algorithm:

davisPutnam2 = herbrand conjCNF (not . dpll) satCNF where
  conjCNF cjs0 sub cjs = S.union (S.map (S.map sub) cjs0) cjs

Unification

To refute \(P(F(x), G(A)) \wedge \neg P(F(B), y)\), the above algorithms would have to luck out and set, say, \( (x, y) = (B, G(A)) \).

Unification finds this assignment intelligently. This observation inspired a more efficient approach to theorem proving.

Harrison’s implementation of unification differs from that of Jones. Accounting for existing substitutions is deferred to variable binding, where we perform the occurs check, as well as a redundancy check. However, perhaps lazy evaluation means the two approaches are more similar than they appear.

istriv env x = \case
  Var y | y == x -> Right True
        | Just v <- M.lookup y env -> istriv env x v
        | otherwise -> Right False
  Fun _ args -> do
    b <- or <$> mapM (istriv env x) args
    if b then Left "cyclic"
         else Right False

unify env = \case
  [] -> Right env
  h:rest -> case h of
    (Fun f fargs, Fun g gargs)
      | f == g, length fargs == length gargs -> unify env $ zip fargs gargs <> rest
      | otherwise -> Left "impossible unification"
    (Var x, t)
      | Just v <- M.lookup x env -> unify env $ (v, t):rest
      | otherwise -> do
        b <- istriv env x t
        unify (if b then env else M.insert x t env) rest
    (t, Var x) -> unify env $ (Var x, t):rest
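
To exercise the unifier on its own, here is a standalone copy over a minimal Term type (the example pairs are ours): it solves the motivating puzzle above by binding x to B and y to G(A), and rejects x against F(x) via the occurs check.

```haskell
{-# LANGUAGE LambdaCase #-}
import qualified Data.Map as M

data Term = Var String | Fun String [Term] deriving (Eq, Show)

-- Is binding x := t trivial (t is x, chasing env)? Left on occurs check.
istriv :: M.Map String Term -> String -> Term -> Either String Bool
istriv env x = \case
  Var y | y == x -> Right True
        | Just v <- M.lookup y env -> istriv env x v
        | otherwise -> Right False
  Fun _ args -> do
    b <- or <$> mapM (istriv env x) args
    if b then Left "cyclic" else Right False

unify :: M.Map String Term -> [(Term, Term)]
      -> Either String (M.Map String Term)
unify env = \case
  [] -> Right env
  h:rest -> case h of
    (Fun f fargs, Fun g gargs)
      | f == g, length fargs == length gargs
        -> unify env $ zip fargs gargs <> rest
      | otherwise -> Left "impossible unification"
    (Var x, t)
      | Just v <- M.lookup x env -> unify env $ (v, t):rest
      | otherwise -> do
        b <- istriv env x t
        unify (if b then env else M.insert x t env) rest
    (t, Var x) -> unify env $ (Var x, t):rest
```

For example, `unify M.empty [(Fun "F" [Var "x"], Fun "F" [Fun "B" []]), (Var "y", Fun "G" [Fun "A" []])]` yields the substitution x ↦ B, y ↦ G(A), while `unify M.empty [(Var "x", Fun "F" [Var "x"])]` fails with "cyclic".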

As well as terms, unification in first-order logic must also handle literals, that is, predicates and their negations.

literally nope f = \case
  (Atom p1 a1, Atom p2 a2) -> f [(Fun p1 a1, Fun p2 a2)]
  (Not p, Not q) -> literally nope f (p, q)
  _ -> nope

unifyLiterals = literally (Left "Can't unify literals") . unify

Tableaux

As with Skolemizing, we recurse over the structure of the formula, gathering literals that must all play nice together, and branching as necessary. If one literal in our collection unifies with the negation of another, then the current branch is refuted. The theorem is proved once all branches are refuted.

When we encounter a universal quantifier, we instantiate a new variable and move the subformula to the back of the list in case we need it again. This creates tension. On the one hand, we want new variables so we can find unifications to refute branches. On the other hand, it may be better to move on and look for a literal that is easier to contradict.

Iterative deepening comes to our rescue. We bound the number of variables we may instantiate to avoid getting lost in the weeds. If the search fails, we bump up the bound and try again.

(We mean "branching" in a yak-shaving sense, that is, while trying to refute A, we find we must also refute B, so we add B to our to-do list. At a higher level, there is another sense of branching where we realize we made the wrong decision and have to undo it and try again; we call this backtracking.)

deepen :: (Show t, Num t) => (t -> Either b c) -> t -> Loggy c
deepen f n = do
  tell (("Searching with depth limit " <> show n):)
  either (const $ deepen f (n + 1)) pure $ f n
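
A simplified sketch of the same idea without the Writer log, with a hypothetical `search` that only succeeds once the bound is large enough: failure at one bound triggers a retry at the next.

```haskell
-- Iterative deepening: retry with a larger bound until the search succeeds.
deepen :: (Int -> Either String a) -> Int -> a
deepen f n = either (const $ deepen f (n + 1)) id (f n)

-- A mock bounded search that needs a depth of at least 3.
search :: Int -> Either String Int
search n = if n >= 3 then Right n else Left "no proof at this depth"
```

Here `deepen search 0` tries bounds 0, 1, 2, then succeeds at 3.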

tabRefute fos = deepen (\n -> go n fos [] Right (mempty, 0)) 0 where
  go n fos lits cont (env,k)
    | n < 0 = Left "no proof at this level"
    | otherwise = case fos of
      [] -> Left "tableau: no proof"
      h:rest -> case h of
        p :/\ q -> go n (p:q:rest) lits cont (env,k)
        p :\/ q -> go n (p:rest) lits (go n (q:rest) lits cont) (env,k)
        Qua Forall x p -> let
          y = Var $ '_':show k
          p' = subst (`lookup` [(x, y)]) p
          in go (n - 1) (p':rest <> [h]) lits cont (env,k+1)
        lit -> asum ((\l -> cont =<< (, k) <$>
            unifyLiterals env (lit, nono l)) <$> lits)
          <|> go n rest (lit:lits) cont (env,k)

tableau fo = runThen output $ case skno fo of
  Bot -> pure (mempty, 0)
  sfo -> tabRefute [sfo]

Some problems can be split up into the disjunction of independent subproblems, which we can solve individually:

splitTableau = map (tabRefute . S.toList) . S.toList . simpDNF . skno

Connection Tableaux

We can view tableaux as a lazy CNF-based algorithm. We convert to CNF as we go, stopping immediately after showing unsatisfiability. In particular, our search is driven by the order in which clauses appear in the input formula.

Perhaps it is smarter to select clauses less arbitrarily. How about requiring the next clause we examine to be somehow connected to the current clause? For example, perhaps we should insist a certain literal in the current clause unifies with the negation of a literal in the next.

With this in mind, we arrive at Prolog-esque unification and backtracking, but with a couple of tweaks so that it works on any CNF formula rather than merely on a bunch of Horn clauses:

  1. Employ iterative deepening instead of a depth-first search.

  2. Look for conflicting subgoals.

Any unsatisfiable CNF formula must contain a clause with only negative literals. We pick one to start the refutation.

selections bs = unfoldr (\(as, bs) -> case bs of
  [] -> Nothing
  b:bt -> Just ((b, as <> bt), (b:as, bt))) ([], bs)
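
A standalone version shows what selections computes: each element paired with the remaining elements, so a caller can pick one literal out of a clause. Note the order of the rest is not preserved.

```haskell
import Data.List (unfoldr)

-- Each element of the list paired with all the other elements.
selections :: [a] -> [(a, [a])]
selections bs = unfoldr (\(as, cs) -> case cs of
  [] -> Nothing
  b:bt -> Just ((b, as <> bt), (b:as, bt))) ([], bs)
```

For example, `selections [1,2,3]` is `[(1,[2,3]),(2,[1,3]),(3,[2,1])]`.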

instantiate :: [FO] -> Int -> ([FO], Int)
instantiate fos k = (subst (`M.lookup` (M.fromList $ zip vs names)) <$> fos, k + length vs)
  where
  vs = foldr union [] $ fv <$> fos
  names = Var . ('_':) . show <$> [k..]

conTab clauses = deepen (\n -> go n clauses [] Right (mempty, 0)) 0 where
  go n cls lits cont (env, k)
    | n < 0 = Left "too deep"
    | otherwise = case lits of
      [] -> asum [branch ls (env, k) | ls <- cls, all (not . isPositive) ls]
      lit:litt -> let nlit = nono lit in asum (contra nlit <$> litt)
        <|> asum [branch ps =<< (, k') <$> unifyLiterals env (nlit, p)
        | cl <- cls, let (cl', k') = instantiate cl k, (p, ps) <- selections cl']
    where
    branch ps = foldr (\l f -> go (n - length ps) cls (l:lits) f) cont ps
    contra p q = cont =<< (, k) <$> unifyLiterals env (p, q)

mesonBasic fo = runThen output $ conTab  $ S.toList <$> S.toList (simpCNF $ deQuantify $ skno fo)

We translate a Prolog sorting program and query to CNF to illustrate the correspondence.

sortExample = intercalate " & "
  [ "(Sort(x0,y0) | !Perm(x0,y0) | !Sorted(y0))"
  , "Sorted(Nil)"
  , "Sorted(C(x1, Nil))"
  , "(Sorted(C(x2, C(y2, z2))) | !(x2 <= y2) | !Sorted(C(y2,z2)))"
  , "Perm(Nil,Nil)"
  , "(Perm(C(x3, y3), C(u3, v3)) | !Delete(u3,C(x3,y3),z3) | !Perm(z3,v3))"
  , "Delete(x4,C(x4,y4),y4)"
  , "(Delete(x5,C(y5,z5),C(y5,w5)) | !Delete(x5,z5,w5))"
  , "Z <= x6"
  , "(S(x7) <= S(y7) | !(x7 <= y7))"
  , "!Sort(C(S(S(S(S(Z)))), C(S(Z), C(Z,C(S(S(Z)), C(S(Z), Nil))))), x8)"
  ]

prologgy fo = conTab $ S.toList <$> S.toList (simpCNF fo)

tsubst' :: (String -> Maybe Term) -> Term -> Term
tsubst' f t = case t of
  Var x -> maybe t (tsubst' f) $ f x
  Fun s as -> Fun s $ tsubst' f <$> as

sortDemo = runThen prSub $ prologgy $ mustFO sortExample where
  prSub (m, _) = output $ tsubst' (`M.lookup` m) $ Var "x8"

We refine mesonBasic by failing whenever the current subgoal is equal to an older subgoal under the substitutions found so far. In addition, as with splitTableau, we split the problem into smaller independent subproblems when possible.

equalUnder :: M.Map String Term -> [(Term, Term)] -> Bool
equalUnder env = \case
  [] -> True
  h:rest -> case h of
    (Fun f fargs, Fun g gargs)
      | f == g, length fargs == length gargs -> equalUnder env $ zip fargs gargs <> rest
      | otherwise -> False
    (Var x, t)
      | Just v <- M.lookup x env -> equalUnder env $ (v, t):rest
      | otherwise -> either (const False) id $ istriv env x t
    (t, Var x) -> equalUnder env $ (Var x, t):rest

noRep n cls lits cont (env, k)
  | n < 0 = Left "too deep"
  | otherwise = case lits of
    [] -> asum [branch ls (env, k) | ls <- cls, all (not . isPositive) ls]
    lit:litt
      | any (curry (literally False $ equalUnder env) lit) litt -> Left "repetition"
      | otherwise -> let nlit = nono lit in asum (contra nlit <$> litt)
        <|> asum [branch ps =<< (, k') <$> unifyLiterals env (nlit, p)
        | cl <- cls, let (cl', k') = instantiate cl k, (p, ps) <- selections cl']
  where
  branch ps = foldr (\l f -> noRep (n - length ps) cls (l:lits) f) cont ps
  contra p q = cont =<< (, k) <$> unifyLiterals env (p, q)

meson fos = mapM_ (runThen output) $ map (mes . listConj) $ S.toList <$> S.toList (simpDNF $ skno fos)
  where
  mes fo = deepen (\n -> noRep n (toCNF fo) [] Right (mempty, 0)) 0
  toCNF = map S.toList . S.toList . simpCNF . deQuantify
  listConj = foldr1 (:/\)

Due to a bug, our code applies the depth limit differently to the book. Recall in our tableau function, the two branches of a disjunction receive the same quota for new variables. I had thought the same was true for the branches of meson, and that is what appears above.

I later learned the quota was meant to be shared among all subgoals. I wrote a version more faithful to the original meson (our demo calls it "MESON by the book"). It turns out to be slow.

▶ Toggle faithful version

Results

Our gilmore and davisPutnam functions perform better than the book suggests they should. In particular, gilmore p20 finishes quickly.

I downloaded Harrison’s source code for a sanity check, and found the original gilmore implementation easily solves p20. It seems the book is mistaken; perhaps the code was buggy at the time.

The original source also contains several test cases:

▶ Toggle test cases

Our version sometimes takes a better path through the Herbrand universe than the original. For example, our davisPutnam goes through 111 ground instances to solve p29 while the book version goes through 180.

Curiously, if we leave the output of our groundtuples unreversed, then gilmore p20 seems difficult.

The davisPutnam2 function is unreasonably effective. Definitional CNF suits DPLL by creating fewer clauses and fewer literals per clause, so rules fire more frequently. The vaunted p38 and even the dreaded steamroller ("216 ground instances tried; 497 items in list") lie within its reach. The latter may be too exhausting for a browser and should be confirmed with GHC. Assuming Nix is installed:

$ nix-shell -p "haskell.packages.ghc881.ghcWithPackages (pkgs: [pkgs.megaparsec])"
$ wget https://crypto.stanford.edu/~blynn/compiler/fol.lhs
$ ghci fol.lhs

then type davisPutnam2 steamroller at the prompt.

Definitional CNF hurts our connection tableaux solvers. It introduces new literals which only appear a few times each. Our code neglects to take advantage of this to quickly find unifiable literals.
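To sketch what taking advantage might look like (this indexing is not in our prover; `Lit`, `buildIndex`, and `clashes` are hypothetical, with argument lists omitted), a first-level index keyed on sign and predicate name turns the search for complementary literals into a single map lookup:

```haskell
import qualified Data.Map as M

-- Literals stripped down to a sign and a predicate name.
data Lit = Pos String | Neg String deriving (Eq, Show)

-- Bucket literals by (sign, predicate) so candidates that could
-- possibly unify are found by lookup instead of a linear scan.
type Index = M.Map (Bool, String) [Lit]

buildIndex :: [Lit] -> Index
buildIndex = foldr ins M.empty
  where
    ins l = M.insertWith (++) (key l) [l]
    key (Pos p) = (True, p)
    key (Neg p) = (False, p)

-- Candidates that could clash with a literal: same predicate, opposite sign.
clashes :: Index -> Lit -> [Lit]
clashes idx (Pos p) = M.findWithDefault [] (False, p) idx
clashes idx (Neg p) = M.findWithDefault [] (True, p) idx
```

A second level could refine each bucket by the outermost function symbols of the arguments; rarely used literals introduced by definitional CNF would then cost almost nothing to rule out.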

Our connection tableaux functions mesonBasic and meson are mysteriously miraculous. Running mesonBasic steamroller succeeds at depth 21, and meson gilmore1 at depth 13, though our usage of the depth limit differs from that in the book.

They were so fast that I was certain there was a bug. After extensive tracing, I’ve concluded laziness is the root cause. Although our meson appears to be a reasonably direct translation of the OCaml version, Haskell’s lazy evaluation means we memoize expensive computations, and sharing the size bound among subgoals erases these gains.
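The following standalone snippet (not prover code; the counter is only instrumentation) shows the sharing at work: a lazily bound name is evaluated at most once, no matter how many times it is used.

```haskell
import Data.IORef
import System.IO.Unsafe (unsafePerformIO)

-- Instrumentation: count how often the "expensive" computation runs.
counter :: IORef Int
counter = unsafePerformIO (newIORef 0)
{-# NOINLINE counter #-}

expensive :: Int -> Int
expensive x = unsafePerformIO $ do
  modifyIORef' counter (+ 1)   -- record one evaluation
  pure (x * x)
{-# NOINLINE expensive #-}

demo :: IO (Int, Int)
demo = do
  let sharedThunk = expensive 7        -- builds a thunk; nothing runs yet
  let total = sharedThunk + sharedThunk -- uses the thunk twice
  n <- total `seq` readIORef counter    -- force, then read the count
  pure (total, n)                       -- the count is 1, not 2
```

Run with runghc or without aggressive optimization; the `unsafePerformIO` counter is purely for observation. In the prover, the thunks captured by the `cont` closure play the role of `sharedThunk`.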

The first time the cont continuation is reached, certain reductions remain on the heap, so the next time we call it, we can avoid repeating expensive computations. It is true that each cont invocation receives its own (env, k), but we can accomplish a lot without looking at them, such as determining that two literals can never unify because the outermost function names differ.

We can go further and memoize more. Here’s an obvious way to filter out candidates that could never unify:

couldMatchTerms = \case
  [] -> True
  h:rest -> case h of
    (Fun f fargs, Fun g gargs)
      | f == g, length fargs == length gargs -> couldMatchTerms $ zip fargs gargs <> rest
      | otherwise -> False
    _ -> True
couldMatch x y = case (x, y) of
  (Atom p1 a1, Atom p2 a2) -> couldMatchTerms [(Fun p1 a1, Fun p2 a2)]
  (Not p, Not q) -> couldMatch p q
  _ -> False

noRep' n cls lits cont (env, k)
  | n < 0 = Left "too deep"
  | otherwise = case lits of
    [] -> asum [branch ls (env, k) | ls <- cls, all (not . isPositive) ls]
    lit:litt
      | any (curry (fromRight False . equalUnder env) lit) $ filter (couldMatch lit) litt -> Left "repetition"
      | otherwise -> let nlit = nono lit in asum (contra nlit <$> filter (couldMatch nlit) litt)
        <|> asum [branch ps =<< (, k') <$> unifyLiterals env (nlit, p)
        | cl <- cls, let (cl', k') = instantiate cl k, (p, ps) <- selections cl', couldMatch nlit p]
  where
  branch ps = foldr (\l f -> noRep' (n - length ps) cls (l:lits) f) cont ps
  contra p q = cont =<< (, k) <$> unifyLiterals env (p, q)

meson' fos = mapM_ (runThen output) $ map (messy . listConj) $ S.toList <$> S.toList (simpDNF $ skno fos)
  where
  messy fo = deepen (\n -> noRep' n (toCNF fo) [] Right (mempty, 0)) 0
  toCNF = map S.toList . S.toList . simpCNF . deQuantify
  listConj = foldr1 (:/\)

This is about 5% faster, despite wastefully traversing the same literals once for couldMatch and again for unifyLiterals. It would be better if unifyLiterals could use what couldMatch has already learned.

Perhaps better still would be to split the substitutions into those that are known when the continuation is made, and those that are not. Then couldMatch could take the first set into account while still being memoizable.

Front-end

We compile to wasm with Asterius.

▶ Toggle front-end

To force the browser to render log updates, we use zero-duration timeouts. The change in control flow means that the web version of meson interleaves the refutations of independent subformulas.

We may be running into an Asterius bug involving callbacks and garbage collection. The callbacks created in the stream function are all one-shot, but if we declare them as "oneshot" then our code crashes on the steamroller example.

Letting them build up on the heap means we can solve steamroller with "Lazy MESON", but only once. The second time we click the button, we run into a strange JavaScript error:

Uncaught (in promise) JSException "RuntimeError: function signature mismatch

Ben Lynn [email protected] 💡