
CSIT 5900

Knowledge Representation and Reasoning

Department of Computer Science and Engineering


Hong Kong University of Science and Technology

(HKUST) Reasoning 1 / 57
Intelligence and Reasoning
Copycats are not intelligent:
Emperor: It’s white.
Minister: It’s white.
Emperor: It’s black.
Minister: It’s black.
Simple pattern-matching-substitution agents are not intelligent either:
John’s program:
char *name = "John";
main() {
int len = 0;
while (*name++ != '\0') ++len;}
Jack's program:
char *myname = "John";
main() {
int mylen = 0;
while (*myname++ != '\0') ++mylen;}
(HKUST) Reasoning 2 / 57
ELIZA - We Now Have ChatGPT
ELIZA (Weizenbaum, 1966), a computer psychotherapist based on pattern
matching that fooled a lot of people (from Machines Who Think, by P.
McCorduck, 1979):

DOCTOR: Good afternoon. Please type in your name.

[The visitor did so, and the doctor continued with some polite
preliminaries]

DOCTOR: What brought you here to see me today?


VISITOR: Oh, nothing much. I’m feeling a bit tired, that’s all.
DOCTOR: Why do you think you’re feeling a bit tired?
VISITOR: Well, I’ve been travelling a lot, and away from home.
DOCTOR: Tell me about your family.

[From here, the conversation became intimate...]


(HKUST) Reasoning 3 / 57
Agents That Reason Logically

The cat or the tiger?

There are two rooms. Each of them is occupied by either a tiger or a cat,
but not both. There is also a sign on the door of each room, and the signs
are either both true or both false. Here are the signs:

Sign on Room I: either a tiger is in this room or a cat is in the other room.
Sign on Room II: a cat is in the other room.

What does Room I contain? How about Room II?

(HKUST) Reasoning 4 / 57
Agents That Reason Dynamically

A typical state in the Wumpus World:

[Figure: a 4 x 4 grid showing a typical Wumpus World state: the agent's
START square at [1,1], several pits with a Breeze in each adjacent square,
a Stench in the squares adjacent to the wumpus, and the Gold.]

(HKUST) Reasoning 5 / 57
Wumpus World

Initial state: the agent is in [1,1], facing east, with one arrow.
Goal: get the gold, return to [1,1], and climb out of the cave.
Percepts: given by an array of five sensors
[Stench, Breeze, Glitter, Bump, Scream]. Notice that the agent does not
know where she is – she was put in a cell, and she calls that cell
[1,1].
Actions: grab the gold; turn 90 degrees clockwise; turn 90 degrees
counterclockwise; move forward to the next cell; shoot the arrow in
the direction she is facing.
Question: How can the agent figure out that [2,2] is safe to go into?

(HKUST) Reasoning 6 / 57
Agents That Reason about Other Agents Logically

Sarah (the girl) and Ted (the boy) came home from playing outside.
The mother (to both children): At least one of your foreheads is dirty.
The boy: I don’t know if mine is dirty.
The girl: My forehead is dirty.
The boy: My forehead is dirty.
How do they know?

(HKUST) Reasoning 7 / 57
Knowledge-Based Agents

Two main components:


Knowledge base
- The collection of known facts about the world.
- Each item, called a sentence, is expressed in some representation
language in a computer-tractable form.
- Should be updated over time.
Reasoning (or inference)
- Reasoning about knowledge in the KB to decide the best action to
achieve some given goal.
- The search algorithms that we have learnt can be used to carry out the
reasoning.

(HKUST) Reasoning 8 / 57
Knowledge Representation and Reasoning
Knowledge representation is about how to represent things that are needed
to solve a problem on a computer:
A KR language has two components: its syntax and semantics.
(Compare it with computer programming language.)
Syntax determines what are legal expressions (sentences) and
semantics tells us the meaning of these legal expressions.
Good knowledge representation languages should be:
- expressive and concise like natural languages
- unambiguous and precise like formal languages.
Logical inference: the process of deriving new sentences from old ones.

Two common kinds of logic:


Propositional logic (also called Boolean logic)
First-order predicate logic (or simply first-order logic)

(HKUST) Reasoning 9 / 57
Propositional Logic

(HKUST) Reasoning 10 / 57
Language

The language (syntax) of a logic describes how to form sentences.


This is like the syntax of a programming language. E.g. the syntax
of C says that ++x is a legal expression, but **x is not.
To define a language, we need to define its available symbols and
formation rules. The former are the basic building blocks (tokens); the
latter describe how to form sentences from these symbols.

(HKUST) Reasoning 11 / 57
Language (Continued)
Symbols:
Logical symbols (those whose meanings are fixed by the logic):
parentheses "(" and ")", sentential connectives ¬, ∧, ∨, ⊃, ≡.
Non-logical symbols (those whose meanings depend on
interpretations): proposition symbols p1 , p2 , ....
Formation rules:
a proposition symbol is a sentence
if α and β are sentences, so are (¬α), (α ∧ β), (α ∨ β), (α ⊃ β), and
(α ≡ β).
no other expressions are sentences
Convention: the outermost parentheses can be dropped, and we shall use
the following precedence order for the connectives, from highest to lowest:
¬, then ∧ and ∨, then ⊃, then ≡. For example,
p ∨ ¬q ≡ q ⊃ r is (p ∨ (¬q)) ≡ (q ⊃ r).

(HKUST) Reasoning 12 / 57
Example

The suspect must be released from custody: R


The evidence obtained is admissible: E
The evidence obtained is inadmissible: ¬E
The evidence obtained is admissible, and the suspect need not be released
from custody: E ∧ (¬R)
Either the evidence is admissible or the suspect must be released from
custody
Quiz: E ∨ R or (E ∨ R) ∧ ¬(E ∧ R)?
The evidence is inadmissible, but the suspect need not be released from
custody
Quiz: How do we translate ”but”?

(HKUST) Reasoning 13 / 57
Semantics - Truth Conditions
To specify semantics of a logic, we need to first assume an interpretation
of the non-logical symbols in the language, and then define the meaning of
logical symbols in terms of this interpretation.
For propositional logic, an interpretation, often called a truth assignment,
is a function from the set of propositional symbols in the language to the
set of truth values {T , F }.
Given such a truth assignment v , we can extend it to the set of sentences
using the following truth table

p q ¬p p ∧ q p ∨ q p ⊃ q p ≡ q
T T F T T T T
T F F F T F F
F T T F T T F
F F T F F T T
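
A concrete sketch of these truth conditions (not from the slides): sentences are encoded as nested Python tuples, an interpretation v is a dict from proposition symbols to True/False, and the name eval_sentence is my own choice.

# Sentences as nested tuples, e.g. ('and', 'p', ('not', 'q')).
# An interpretation (truth assignment) v maps proposition symbols to booleans.
def eval_sentence(s, v):
    if isinstance(s, str):                  # proposition symbol
        return v[s]
    op = s[0]
    if op == 'not':
        return not eval_sentence(s[1], v)
    a, b = eval_sentence(s[1], v), eval_sentence(s[2], v)
    if op == 'and':     return a and b
    if op == 'or':      return a or b
    if op == 'implies': return (not a) or b
    if op == 'iff':     return a == b
    raise ValueError('unknown connective: %r' % (op,))

# Example: p ∨ ¬q ≡ q ⊃ r under v(p) = T, v(q) = F, v(r) = F.
v = {'p': True, 'q': False, 'r': False}
s = ('iff', ('or', 'p', ('not', 'q')), ('implies', 'q', 'r'))
print(eval_sentence(s, v))   # True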

(HKUST) Reasoning 14 / 57
Entailment

We say a truth assignment v satisfies a sentence α iff v (α) = T .


We say that a set of sentences Σ (tautologically) implies α, written Σ |= α,
iff every truth assignment that satisfies every member of Σ also satisfies α.
We say that a sentence α is a tautology (valid), written |= α, iff it is
implied by the empty set of sentences, i.e. ∅ |= α.
Tautologies are all we care about because of the following deduction
theorem:
Theorem {α1 , ..., αn } |= α iff |= α1 ∧ · · · ∧ αn ⊃ α.
Sketch of proof. Only if case: Suppose v is a truth assignment. There
are two cases: (1) v does not satisfy α1 ∧ · · · ∧ αn . In this case, v
trivially satisfies α1 ∧ · · · ∧ αn ⊃ α. (2) v satisfies α1 ∧ · · · ∧ αn . So v also
satisfies every member of {α1 , ..., αn }. By the assumption that
{α1 , ..., αn } |= α, v satisfies α as well.
If case: Similar.

(HKUST) Reasoning 15 / 57
Example Tautologies

De Morgan’s laws:
¬(p ∧ q) ≡ ¬p ∨ ¬q and ¬(p ∨ q) ≡ ¬p ∧ ¬q
Distributive laws:
p ∧ (q ∨ r ) ≡ (p ∧ q) ∨ (p ∧ r ) and p ∨ (q ∧ r ) ≡ (p ∨ q) ∧ (p ∨ r )
Excluded middle: p ∨ ¬p
Contradiction: ¬(p ∧ ¬p)
Contraposition: p ⊃ q ≡ ¬q ⊃ ¬p
Exportation: p ∧ q ⊃ r ≡ p ⊃ (q ⊃ r )
Tautologies that use ¬ and ∨ to define all other connectives:
p ∧ q ≡ ¬(¬p ∨ ¬q) p ⊃ q ≡ ¬p ∨ q
(p ≡ q) ≡ (p ⊃ q) ∧ (q ⊃ p)
Quiz: Is ((p ⊃ q) ⊃ p) ⊃ q a tautology?
How about ((p ⊃ q) ⊃ p) ⊃ p?
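
The quiz can be settled mechanically: enumerate all truth assignments over the sentence's proposition symbols and check that every one satisfies it. A sketch, reusing the tuple encoding from the earlier evaluation sketch (all names illustrative):

from itertools import product

def eval_sentence(s, v):
    if isinstance(s, str):
        return v[s]
    if s[0] == 'not':
        return not eval_sentence(s[1], v)
    a, b = eval_sentence(s[1], v), eval_sentence(s[2], v)
    return {'and': a and b, 'or': a or b,
            'implies': (not a) or b, 'iff': a == b}[s[0]]

def symbols(s):
    return {s} if isinstance(s, str) else set().union(*(symbols(t) for t in s[1:]))

def is_tautology(s):
    syms = sorted(symbols(s))
    return all(eval_sentence(s, dict(zip(syms, vals)))
               for vals in product([True, False], repeat=len(syms)))

imp = lambda a, b: ('implies', a, b)
print(is_tautology(imp(imp(imp('p', 'q'), 'p'), 'q')))   # False (take p = T, q = F)
print(is_tautology(imp(imp(imp('p', 'q'), 'p'), 'p')))   # True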

(HKUST) Reasoning 16 / 57
Applications: Query Answering

Suppose: if it is sunny, then you get mail; if it either rains or sleets, then
you still get mail. It will be either raining or sunny tomorrow; will you
get mail?
Axiomatization:
Propositions: Rain, Sunny , Sleet, Mail.
KB: Rain ∨ Sunny, Sunny ⊃ Mail, (Rain ∨ Sleet) ⊃ Mail.
Query: Does KB |= Mail?
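
With only four propositions the query can be answered by brute force: check that Mail holds in every truth assignment that satisfies the KB. A small sketch (kb_holds and the variable names are illustrative):

from itertools import product

def kb_holds(rain, sunny, sleet, mail):
    return ((rain or sunny)                       # Rain ∨ Sunny
            and ((not sunny) or mail)             # Sunny ⊃ Mail
            and ((not (rain or sleet)) or mail))  # (Rain ∨ Sleet) ⊃ Mail

entailed = all(mail
               for rain, sunny, sleet, mail in product([True, False], repeat=4)
               if kb_holds(rain, sunny, sleet, mail))
print(entailed)   # True: Mail holds in every model of the KB, so KB |= Mail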

(HKUST) Reasoning 17 / 57
Applications: Problem Solving by Finding Models
Consider the graph coloring problem: you have three colors, say black,
white, and gray, and you want to assign a color to every node in a graph
such that no adjacent nodes have the same color.
Axiomatization:
Propositions: n(c) for each node n and each color c; adj(n1 , n2 ) for
every pair of nodes.
Axioms:
- Each node must have exactly one color assigned to it: for each node n,

[n(b) ∨ n(w) ∨ n(g)] ∧ [n(b) ⊃ (¬n(w) ∧ ¬n(g))] ∧
[n(w) ⊃ (¬n(b) ∧ ¬n(g))] ∧ [n(g) ⊃ (¬n(w) ∧ ¬n(b))]

- No adjacent nodes can have the same color:

adj(n1, n2) ⊃ ¬(n1(b) ∧ n2(b)) ∧ ¬(n1(w) ∧ n2(w)) ∧ ¬(n1(g) ∧ n2(g)).

- A database of adj(n1, n2) facts.
Now every model of the above theory is a coloring of the graph.
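
A sketch of how these clauses can be generated for a concrete graph. For brevity the adjacency facts are supplied directly as Python data rather than as adj(n1, n2) propositions, and all names (coloring_clauses, the signed-literal encoding) are illustrative; any SAT solver's model of the output is a legal coloring.

# A clause is a frozenset of signed literals ('+', atom) / ('-', atom).
COLORS = ['b', 'w', 'g']

def pos(a): return ('+', a)
def neg(a): return ('-', a)

def coloring_clauses(nodes, edges):
    clauses = []
    for n in nodes:
        # each node gets at least one color
        clauses.append(frozenset(pos(f'{n}({c})') for c in COLORS))
        # ... and at most one color
        for c1 in COLORS:
            for c2 in COLORS:
                if c1 < c2:
                    clauses.append(frozenset({neg(f'{n}({c1})'), neg(f'{n}({c2})')}))
    for n1, n2 in edges:
        # adjacent nodes cannot share a color
        for c in COLORS:
            clauses.append(frozenset({neg(f'{n1}({c})'), neg(f'{n2}({c})')}))
    return clauses

# A triangle: any model must use all three colors.
for cl in coloring_clauses(['n1', 'n2', 'n3'],
                           [('n1', 'n2'), ('n2', 'n3'), ('n1', 'n3')]):
    print(sorted(cl))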
(HKUST) Reasoning 18 / 57
Applications: Problem Solving by Finding Models

Given the following facts:


1 Lisa is not next to Bob in the ranking
2 Jim is ranked immediately ahead of a biology major
3 Bob is ranked immediately ahead of Jim
4 One of the women (Lisa and Mary) is a biology major
5 One of the women is ranked first
What are possible rankings for the four people?
Axiomatization:
Propositions: n(l, b), n(l, j), n(l, m), ..., bm(l), bm(m), bm(b),
bm(j).

(HKUST) Reasoning 19 / 57
KB:

¬n(l, b) ∧ ¬n(b, l),


(n(j, l) ∨ n(j, b) ∨ n(j, m)) ∧
(n(j, l) ⊃ bm(l)) ∧
(n(j, b) ⊃ bm(b)) ∧
(n(j, m) ⊃ bm(m)),
n(b, j),
bm(l) ∨ bm(m),
(¬n(b, l) ∧ ¬n(m, l) ∧ ¬n(j, l)) ∨ (¬n(b, m) ∧ ¬n(l, m) ∧ ¬n(j, m))

We also need to add: each person has exactly one rank, and no two
people can occupy the same rank.
Any other ways to axiomatize the domain?

(HKUST) Reasoning 20 / 57
Applications: Diagnosis

Example:

tennis-elbow ⊃ sore-elbow
tennis-elbow ⊃ tennis-player
arthritis ∧ untreated ⊃ sore-joint
sore-joint ⊃ sore-elbow ∧ sore-hip

Explain: sore-elbow . Possible explanations: tennis-elbow ,


arthritis ∧ untreated.
Here, KB ⊭ sore-elbow , but KB ∪ {tennis-elbow } |= sore-elbow .
Generally, an abduction of α under KB is a formula β such that
KB ∪ {β} |= α. This type of reasoning is widespread in many applications.
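
A minimal sketch of this abductive pattern by candidate testing: propose a β and check whether KB ∪ {β} |= sore-elbow, here by brute force over truth assignments. The encoding and the names kb_holds and entails are illustrative, not part of the slides.

from itertools import product

ATOMS = ['tennis_elbow', 'tennis_player', 'arthritis', 'untreated',
         'sore_joint', 'sore_elbow', 'sore_hip']

def kb_holds(v):   # v: dict mapping each atom to True/False
    return (((not v['tennis_elbow']) or v['sore_elbow']) and
            ((not v['tennis_elbow']) or v['tennis_player']) and
            ((not (v['arthritis'] and v['untreated'])) or v['sore_joint']) and
            ((not v['sore_joint']) or (v['sore_elbow'] and v['sore_hip'])))

def entails(extra, goal):
    # Does KB plus the conjunction of the atoms in `extra` entail `goal`?
    for vals in product([True, False], repeat=len(ATOMS)):
        v = dict(zip(ATOMS, vals))
        if kb_holds(v) and all(v[a] for a in extra) and not v[goal]:
            return False
    return True

print(entails([], 'sore_elbow'))                          # False: KB alone is not enough
print(entails(['tennis_elbow'], 'sore_elbow'))            # True
print(entails(['arthritis', 'untreated'], 'sore_elbow'))  # True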

(HKUST) Reasoning 21 / 57
Clausal Representation

Represent a formula (boolean function) by a set of clauses.


A clause is a disjunction of literals: l1 ∨ · · · ∨ lk .
A literal is either a proposition symbol (variable), called a positive literal,
or the negation of one, called a negative literal.

(HKUST) Reasoning 22 / 57
CNF and DNF
A set S of clauses is the conjunction of the clauses in S, thus denoting a
sentence in Conjunctive Normal Form (CNF): a conjunction of
disjunctions of literals.
Every sentence α can be converted into a CNF α′ in such a way that
|= α ≡ α′:
1 eliminate ⊃ and ≡ using
α ≡ β ⇒ (α ⊃ β) ∧ (β ⊃ α) and α ⊃ β ⇒ ¬α ∨ β
2 push ¬ inside using
¬(α ∧ β) ⇒ ¬α ∨ ¬β and ¬(α ∨ β) ⇒ ¬α ∧ ¬β
3 distribute ∨ over ∧ using
(α ∧ β) ∨ γ ⇒ (α ∨ γ) ∧ (β ∨ γ)
4 simplify using
α ∨ α ⇒ α and ¬¬α ⇒ α.
5 eliminate disjunctions that contain both a literal and its complement.
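
A sketch of the five steps on sentences encoded as nested tuples (as in the earlier evaluation sketch). Steps 4 and 5 fall out of representing a clause as a set of literals and the result as a set of clauses; all function names are illustrative.

def eliminate_imp(s):                        # step 1
    if isinstance(s, str): return s
    if s[0] == 'not': return ('not', eliminate_imp(s[1]))
    a, b = eliminate_imp(s[1]), eliminate_imp(s[2])
    if s[0] == 'implies': return ('or', ('not', a), b)
    if s[0] == 'iff': return ('and', ('or', ('not', a), b), ('or', ('not', b), a))
    return (s[0], a, b)

def push_not(s):                             # step 2, plus ¬¬ elimination (step 4)
    if isinstance(s, str): return s
    if s[0] == 'not':
        t = s[1]
        if isinstance(t, str): return s
        if t[0] == 'not': return push_not(t[1])
        if t[0] == 'and': return ('or', push_not(('not', t[1])), push_not(('not', t[2])))
        if t[0] == 'or':  return ('and', push_not(('not', t[1])), push_not(('not', t[2])))
    return (s[0], push_not(s[1]), push_not(s[2]))

def literal(s):                              # atom or negated atom -> signed literal
    return ('+', s) if isinstance(s, str) else ('-', s[1])

def to_clauses(s):                           # step 3: distribute ∨ over ∧
    if isinstance(s, str) or s[0] == 'not':
        return [frozenset([literal(s)])]
    if s[0] == 'and':
        return to_clauses(s[1]) + to_clauses(s[2])
    return [c1 | c2 for c1 in to_clauses(s[1]) for c2 in to_clauses(s[2])]  # 'or'

def cnf(s):                                  # step 5: drop clauses with l and ¬l
    clauses = to_clauses(push_not(eliminate_imp(s)))
    return {c for c in clauses
            if not any(('+', a) in c and ('-', a) in c for _, a in c)}

imp = lambda a, b: ('implies', a, b)
print(cnf(imp(imp(imp('p', 'q'), 'p'), 'q')))   # the single clause ¬p ∨ q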
(HKUST) Reasoning 23 / 57
Example

((p ⊃ q) ⊃ p) ⊃ q ⇒ ¬((p ⊃ q) ⊃ p) ∨ q
⇒ ¬(¬(p ⊃ q) ∨ p) ∨ q
⇒ ¬(¬(¬p ∨ q) ∨ p) ∨ q
⇒ ¬((¬¬p ∧ ¬q) ∨ p) ∨ q
⇒ (¬(¬¬p ∧ ¬q) ∧ ¬p) ∨ q
⇒ ((¬¬¬p ∨ ¬¬q) ∧ ¬p) ∨ q
⇒ (¬¬¬p ∨ ¬¬q ∨ q) ∧ (¬p ∨ q)
⇒ (¬p ∨ q) ∧ (¬p ∨ q)
⇒ clausal form representation:
{¬p ∨ q}

(HKUST) Reasoning 24 / 57
Resolution Rule of Inference
Given two clauses, infer a new clause:
from clauses p ∨ C1 and ¬p ∨ C2
infer clause C1 ∨ C2 . This new clause is called the resolvent of input
clauses with respect to p.
Example: from clauses w ∨ p ∨ q and w ∨ s ∨ ¬p infer w ∨ q ∨ s as
resolvent wrt p.
Special case: p and ¬p resolve to [] (the empty clause, false,
contradiction).
A derivation of a clause c from a set S of clauses is a sequence
c1 , c2 , · · · , cn of clauses, where the last clause cn = c, and for each ci ,
either
1 ci ∈ S or
2 ci is a resolvent of two earlier clauses in the derivation
Write: S → c if there is a derivation of c from S.
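
The rule is easy to state in code once a clause is a set of signed literals. A sketch (the ('+', atom)/('-', atom) encoding and the name resolvents are my own choices):

# Clauses as frozensets of signed literals.
def resolvents(c1, c2):
    # All clauses obtainable by resolving c1 and c2 on some proposition.
    out = set()
    for sign, atom in c1:
        comp = ('-' if sign == '+' else '+', atom)
        if comp in c2:
            out.add((c1 - {(sign, atom)}) | (c2 - {comp}))
    return out

c1 = frozenset({('+', 'w'), ('+', 'p'), ('+', 'q')})   # w ∨ p ∨ q
c2 = frozenset({('+', 'w'), ('+', 's'), ('-', 'p')})   # w ∨ s ∨ ¬p
print(resolvents(c1, c2))   # one resolvent: w ∨ q ∨ s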

(HKUST) Reasoning 25 / 57
Properties of Resolution

Resolvent is entailed by input clauses:

{(p ∨ α), (¬p ∨ β)} |= α ∨ β

More generally, if S → c then S |= c.


Special case: If S → [], then S |= FALSE , i.e. S is unsatisfiable, that
is, there is no truth assignment that satisfies every member of S.

(HKUST) Reasoning 26 / 57
Proof By refutation

Resolution is sound and complete for []:


Theorem S is unsatisfiable iff S → []
This completeness is also called refutation complete, and is all we need:
Σ |= α iff Σ ∪ {¬α} is unsatisfiable iff the CNF S of Σ ∪ {¬α} is
unsatisfiable iff we can derive [] from S.
Proving Σ |= α by showing that Σ ∪ {¬α} is unsatisfiable is often called
proof by refutation.

(HKUST) Reasoning 27 / 57
A Procedure

To determine if KB |= α:
put KB and ¬α into CNF to get S.
check if S → []:
1 check if [] is in S. If yes, then return ”unsatisfiable”
2 check if there are two clauses c1 and c2 in S such that they resolve to
produce a c3 not already in S. If no such c3 can be generated, then
return ”satisfiable”
3 add c3 to S and go back to the first step.
Note: need only convert KB to CNF once:
can handle multiple queries with same KB
after addition of a new fact β, can simply add new clauses from β’s
CNF to KB
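
A sketch of this procedure as a saturation loop over a clause set, using the same signed-literal encoding as the earlier sketches; the example clauses are the mail KB together with the negated query. All names are illustrative.

def resolvents(c1, c2):
    out = set()
    for sign, atom in c1:
        comp = ('-' if sign == '+' else '+', atom)
        if comp in c2:
            out.add((c1 - {(sign, atom)}) | (c2 - {comp}))
    return out

def resolution(clauses):
    s = set(clauses)
    while True:
        if frozenset() in s:              # the empty clause [] was derived
            return 'unsatisfiable'
        new = set()
        for c1 in s:
            for c2 in s:
                new |= resolvents(c1, c2)
        if new <= s:                      # no new clause can be generated
            return 'satisfiable'
        s |= new

P = lambda a: ('+', a)
N = lambda a: ('-', a)
clauses = [frozenset({P('Rain'), P('Sun')}),
           frozenset({N('Sun'), P('Mail')}),
           frozenset({N('Rain'), P('Mail')}),
           frozenset({N('Sleet'), P('Mail')}),
           frozenset({N('Mail')})]        # negation of the query Mail
print(resolution(clauses))                # 'unsatisfiable', hence KB |= Mail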

(HKUST) Reasoning 28 / 57
An Example
KB:

FirstGrade, FirstGrade ⊃ Child, Child ∧ Male ⊃ Boy ,


Kindergarden ⊃ Child, Child ∧ Female ⊃ Girl, Female.

Show that KB |= Girl


(Below, [l1 , l2 , ..., lk ] denotes the clause l1 ∨ l2 ∨ · · · ∨ lk .)
Clauses from KB: [FirstGrade], [~FirstGrade, Child], [~Child, ~Male, Boy],
[~Kindergarden, Child], [~Child, ~Female, Girl], [Female].
Negation of the query: [~Girl].
Derivation:
[FirstGrade] and [~FirstGrade, Child] resolve to [Child]
[Child] and [~Child, ~Female, Girl] resolve to [~Female, Girl]
[~Female, Girl] and [Female] resolve to [Girl]
[Girl] and [~Girl] resolve to []

(HKUST) Reasoning 29 / 57
An Example

KB: Rain ∨ Sun, Sun ⊃ Mail, (Rain ∨ Sleet) ⊃ Mail


Show KB |= Mail
Clauses from KB: [Rain, Sun], [~Sun, Mail], [~Rain, Mail], [~Sleet, Mail].
Negation of the goal: [~Mail].
Derivation:
[~Sun, Mail] and [~Mail] resolve to [~Sun]
[~Rain, Mail] and [~Mail] resolve to [~Rain]
[Rain, Sun] and [~Sun] resolve to [Rain]
[Rain] and [~Rain] resolve to []

We can also show that KB ⊭ Rain by enumerating all possible derivations.

(HKUST) Reasoning 30 / 57
Problems with the propositional agents

Problem: Too many propositions, and too many rules!

Solution: onto first-order logic.

(HKUST) Reasoning 31 / 57
First-Order Logic

Propositional logic is not expressive and concise enough – the world is


a set of facts according to it.
First-order logic (FOL) is more expressive than propositional logic.
According to FOL, the world consists of objects. The facts in the
world are simply properties about or relations among these objects.
First-order logic is universal - whatever can be said can be written
down in FOL, any fact can be thought of as referring to objects and
properties or relations:
- “one plus two equals three” – objects: one, two, three, one plus two;
relation: equals.
- “squares neighboring the wumpus are smelly” – objects: wumpus,
squares; property: smelly; relation: neighboring.
- “evil King John ruled England in 1200”.

(HKUST) Reasoning 32 / 57
Syntax - Alphabet
Logical Symbols: fixed meaning and use, like keywords in a programming
language
punctuations and parentheses.
connectives: ¬, ∧, ∨, ⊃, ≡.
quantifiers: ∀, ∃.
equality: =
variables: x, x1 , ..., x′, x′′, ..., y , ..., z, ...
Non-logical symbols: domain-dependent meaning and use, like identifiers
in a programming language
predicate symbols: have arity, i.e. take a fixed number of arguments.
E.g. Dog is a 1-ary (unary) predicate, and Dog(fido) means that Fido is a
dog. Propositional symbols are 0-ary predicates.
function symbols: have arity as well. E.g. plus is a 2-ary (binary)
function, plus(x, y ) means the sum of x and y . Constant symbols like
fido are 0-ary functions.
(HKUST) Reasoning 33 / 57
Syntax - Terms and Sentences
A term is a logical expression that refers to an object:
a constant is a term.
a variable is a term.
if t1 , · · · , tn are terms, and f is an n-ary function symbol, then
f (t1 , ..., tn ) is a term.
Example: fido, fatherOf (fido),fatherOf (fatherOf (fido)).
fatherOf (firstChild(fido, motherOf (dido))).

A sentence is a logical expression that represents a fact:


if t1 , · · · , tn are terms, and P is an n-ary predicate (relation) symbol,
then P(t1 , ..., tn ) is a sentence (atomic sentence).
if t1 and t2 are terms, then t1 = t2 is a sentence (atomic sentence).
if α and β are sentences, then ¬α, (α ∧ β), (α ∨ β), (α ⊃ β),
(α ≡ β) are also sentences.
if α is a sentence, and v is a variable, then (∀v α), (∃v α) are also
sentences.
(HKUST) Reasoning 34 / 57
Example Sentences

Atomic sentences: simple facts about the world. Examples:


Brother (richard, john),
fatherOf (richard) = fatherOf (john),
Student(firstChild(john, mary ), ust).
Quantifiers:
- Universal quantification (∀):
  - General rules (for all objects in the universe)
  - Example: “COMP101 students are UST students.”
    ∀x Enrolled(x, comp101) ⊃ Student(x, hkust)
- Existential quantification (∃):
  - For one or more objects in the universe
  - Example: “not all UST students enroll in COMP101”
    ∃x Student(x, hkust) ∧ ¬Enrolled(x, comp101)

(HKUST) Reasoning 35 / 57
Example Sentences

Quantifiers
- Nested quantifiers:
  - “brothers are also siblings”:

    ∀x, y Brother(x, y) ⊃ Sibling(x, y)

    Here ∀x, y is a shorthand for ∀x∀y.
  - “everybody loves somebody”: ∀x∃y Loves(x, y).
  - “a mother is someone who is a female and has a child”:

    ∀x [Mother(x) ≡ Female(x) ∧ ∃y Child(x, y)]

- Relationships between ∀ and ∃:
  - ¬∃x P ≡ ∀x ¬P    ∃x ¬P ≡ ¬∀x P
  - ∀x P ≡ ¬∃x ¬P    ∃x P ≡ ¬∀x ¬P

(HKUST) Reasoning 36 / 57
Some Examples
All purple mushrooms are poisonous:

∀x Mushroom(x) ∧ Purple(x) ⊃ Poisonous(x)

No purple mushroom is poisonous:

∀x Mushroom(x) ∧ Purple(x) ⊃ ¬Poisonous(x)

All mushrooms are either purple or poisonous:

∀x Mushroom(x) ⊃ Purple(x) ∨ Poisonous(x)

All mushrooms are either purple or poisonous but not both:

∀x Mushroom(x) ⊃
(Purple(x) ∧ ¬Poisonous(x)) ∨
(¬Purple(x) ∧ Poisonous(x))

(HKUST) Reasoning 37 / 57
Examples

All purple mushrooms except one are poisonous:

∃x Purple(x) ∧ Mushroom(x) ∧ ¬Poisonous(x) ∧


(∀y Purple(y ) ∧ Mushroom(y ) ∧ ¬(x = y ) ⊃
Poisonous(y ))

There are exactly two purple mushrooms:

∃x, y Mushroom(x) ∧ Purple(x) ∧


Mushroom(y ) ∧ Purple(y ) ∧ ¬(x = y ) ∧
(∀z Mushroom(z) ∧ Purple(z) ⊃ (z = x) ∨ (z = y ))

(HKUST) Reasoning 38 / 57
Interpretation

An interpretation for FOL settles:


what objects there are
which of them satisfy a predicate P
what mapping is denoted by a function f
Given an interpretation I , a sentence α is either true or false in I .

In-class example: an interpretation for a language with Friends(x, y ),


Hates(x, y ), Likes(x, y ), Person(x), Dog (x), John, Lisa, Fido.

(HKUST) Reasoning 39 / 57
Entailment

An interpretation I is a model of a sentence α (α is satisfied in I ) if α


is true in I .
A set of sentences Σ logically entails a sentence α if for any
interpretation I , whenever I is a model for all sentences in Σ, then it
is a model of α.
α is valid if it is entailed by the empty set, that is, if it is true in all
interpretations.

(HKUST) Reasoning 40 / 57
Refutation (Proof By Contradiction)

A KB is contradictory if there is a sentence α such that both


KB |= α and KB |= ¬α.
A trivial contradictory KB: {Man(socrates), ¬Man(socrates)}.
Finding out if a KB is contradictory is as hard as doing other
reasoning tasks such as entailment.
Refutation (proof by contradiction): to show that KB |= α, add ¬α
to KB and show that the new knowledge base, KB ∪ {¬α} is
contradictory.
We do this all the time: If the driver had fastened his seat belt, he
wouldn’t have been thrown out of the car; since he was thrown out of
the car, he must have not fastened his seat belt.

(HKUST) Reasoning 41 / 57
Resolution (First-Order case)

Example 1 (Modus Ponens):

¬Man(x) ∨ Mortal(x), Man(socrates)


Mortal(socrates)

- The variables in the rule are implicitly universally quantified.


- The first premise really is

∀x ¬Man(x) ∨ Mortal(x)

which is equivalent to ∀x Man(x) ⊃ Mortal(x).


- This rule is the same as:
¬Man(x) ∨ Mortal(x), Man(socrates)
Subst({x/socrates}, Mortal(x))

(HKUST) Reasoning 42 / 57
Example 2:

¬Man(x) ∨ Mortal(x), ¬Mortal(y ) ∨ Die(y )


¬Man(z) ∨ Die(z)

This rule says: if

∀x ¬Man(x) ∨ Mortal(x), “every man is mortal”


∀y ¬Mortal(y ) ∨ Die(y ), “every mortal thing has to die”

then ∀z ¬Man(z) ∨ Die(z) (“every man has to die”). Notice that this
rule can be written as:
¬Man(x) ∨ Mortal(x), ¬Mortal(y ) ∨ Die(y )
Subst({x/z, y /z}, ¬Man(x) ∨ Die(y ))

(HKUST) Reasoning 43 / 57
Resolution (Cont’d)
Resolution (first-order case): if θ is a substitution such that
Subst(θ, r) = Subst(θ, r′), then
p1 ∨ · · · ∨ pi ∨ r ∨ pi+1 ∨ · · · ∨ pm ,   q1 ∨ · · · ∨ qj ∨ ¬r′ ∨ qj+1 ∨ · · · ∨ qn
Subst(θ, p1 ∨ · · · ∨ pm ∨ q1 ∨ · · · ∨ qn )
where p1 , ..., pm , q1 , ..., qn are literals and r and r′ are atoms.
Example 1 (Modus Ponens):
¬Man(x) ∨ Mortal(x), Man(socrates)
Mortal(socrates)
because
Subst({x/socrates}, Man(x)) =
Subst({x/socrates}, Man(socrates))
and
Subst({x/socrates}, Mortal(x)) = Mortal(socrates).
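
The substitution θ in this rule is produced by unification. Below is a standard, simplified unification sketch (no occurs check, naive handling of bindings), not taken from the slides: atoms and terms are tuples such as ('Man', 'x'), and the strings listed in VARS play the role of variables.

VARS = {'x', 'y', 'z'}   # everything else is a constant or a predicate/function name

def substitute(theta, t):
    if isinstance(t, tuple):
        return (t[0],) + tuple(substitute(theta, a) for a in t[1:])
    return theta.get(t, t)

def unify(t1, t2, theta=None):
    # Return a unifier extending theta, or None if there is none.
    if theta is None:
        theta = {}
    t1, t2 = substitute(theta, t1), substitute(theta, t2)
    if t1 == t2:
        return theta
    if isinstance(t1, str) and t1 in VARS:
        return {**theta, t1: t2}
    if isinstance(t2, str) and t2 in VARS:
        return {**theta, t2: t1}
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        for a1, a2 in zip(t1[1:], t2[1:]):
            theta = unify(a1, a2, theta)
            if theta is None:
                return None
        return theta
    return None

theta = unify(('Man', 'x'), ('Man', 'socrates'))
print(theta)                                # {'x': 'socrates'}
print(substitute(theta, ('Mortal', 'x')))   # ('Mortal', 'socrates')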
(HKUST) Reasoning 44 / 57
Example 2:

¬Man(x) ∨ Mortal(x), ¬Mortal(y ) ∨ Die(y )


¬Man(z) ∨ Die(z)

because

Subst({x/z, y /z}, Mortal(x)) =


Subst({x/z, y /z}, Mortal(y ))

and

Subst({x/z, y /z}, ¬Man(x) ∨ Die(y )) = ¬Man(z) ∨ Die(z).

(HKUST) Reasoning 45 / 57
Example Refutation Using Resolution
KB:

∀x GradStudent(x) ⊃ Student(x),
∀x Student(x) ⊃ HardWorker (x),
GradStudent(sue)

Q: HardWorker (sue)?
Clauses from KB: ~GradStudent(x) v Student(x), ~Student(x) v HardWorker(x),
GradStudent(sue). Negation of the query: ~HardWorker(sue).
Derivation:
~Student(x) v HardWorker(x) and ~HardWorker(sue) resolve (x/sue) to ~Student(sue)
~GradStudent(x) v Student(x) and ~Student(sue) resolve (x/sue) to ~GradStudent(sue)
GradStudent(sue) and ~GradStudent(sue) resolve to False

(HKUST) Reasoning 46 / 57
Conversion to Clauses

Converting a first-order sentence to clauses: quite involved in general. We


consider only simple cases like the following example:

∀x [∃y (Dog(y) ∧ Owns(x, y)) ⊃ AnimalLover(x)]   ⇒ (eliminate implication)

∀x [¬∃y (Dog(y) ∧ Owns(x, y)) ∨ AnimalLover(x)]  ⇒ (move ¬ inside)
∀x [∀y ¬(Dog(y) ∧ Owns(x, y)) ∨ AnimalLover(x)]  ⇒ (move ¬ inside)
∀x [∀y (¬Dog(y) ∨ ¬Owns(x, y)) ∨ AnimalLover(x)] ⇒ (move quantifiers left)
∀x∀y [¬Dog(y) ∨ ¬Owns(x, y) ∨ AnimalLover(x)]    ⇒ (drop quantifiers)
¬Dog(y) ∨ ¬Owns(x, y) ∨ AnimalLover(x)

(HKUST) Reasoning 47 / 57
Answer Predicate
Given a KB, to find out an object a that satisfies P(a), we normally
pose the query as ∃x P(x):

KB |= ∃x P(x)?

Using resolution, this means that we want to derive False from


KB ∪ {¬P(x)}.
This strategy can be improved by introducing a new predicate A (the
answer predicate). Instead of the above query, we ask:

KB |= ∃x P(x) ∧ ¬A(x)?

Instead of deriving False, we want to derive from


KB ∪ {¬P(x) ∨ A(x)} a clause that mentions only A, and then
“read” the answer out of it.

(HKUST) Reasoning 48 / 57
The Answer Predicate
Example 1:
KB: {Stu(john), Stu(jane), Happy (john)}
Q: ∃x Stu(x) ∧ Happy (x)
{Stu(john), Stu(jane), Happy (john),
¬Stu(x) ∨ ¬Happy (x) ∨ A(x)}
leads to A(john)
So an answer is: John.
Example 2.
KB: {Stu(john), Stu(jane), Happy (john) ∨ Happy (jane)}
Q: ∃x[Stu(x) ∧ Happy (x)]
{Stu(john), Stu(jane), Happy (john) ∨ Happy (jane),
¬Stu(x) ∨ ¬Happy (x) ∨ A(x)}
leads to A(john) ∨ A(jane)
So an answer is: either John or Jane
(HKUST) Reasoning 49 / 57
Rule-Based Expert Systems

An expert system embodies knowledge about a specialized field such


as medicine, engineering, or business.
It is one of the most successful applications of AI reasoning
techniques.
Some well-known expert systems:
- DENDRAL (1965-83): one of the earliest expert systems, for molecular
structure analysis.
- MYCIN (1972-80): one of the most influential expert systems, for medical
diagnosis.
- PROSPECTOR: a mining expert system.
- XCON (R1): a computer configuration system used at DEC.
- American Express Inc.'s loan processing expert system.

(HKUST) Reasoning 50 / 57
Basic Architecture

[Figure: basic architecture of a rule-based expert system. The core is a
knowledge base (facts, heuristics) coupled with an inference engine; around
them sit a knowledge acquisition subsystem (used by the expert and the
knowledge engineer), an explanation subsystem, and a user interface for the
user. © 1998 Morgan Kaufmann Publishers]
(HKUST) Reasoning 51 / 57
An Example
Rule-based expert systems are often based on Horn clauses (clauses with
exactly one positive literal). Consider the following loan-approval system:

1. COLLAT ∧ PYMT ∧ REP ⊃ OK


2. APP ⊃ COLLAT
3. RATING ⊃ REP
4. INC ⊃ PYMT
5. BAL ∧ REP ⊃ OK

where OK means the loan should be approved, COLLAT the collateral for
the loan is satisfactory, PYMT the applicant is able to make the loan
payments, REP the applicant has a good financial reputation, APP the
appraisal on the collateral is sufficiently larger than the loan amount,
RATING the applicant has a good credit rating, INC the applicant’s
income exceeds his/her expenses, and BAL the applicant has an excellent
balance sheet.
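
A sketch of how an inference engine might apply these Horn rules by forward chaining: keep firing any rule whose body is already established until nothing changes. The encoding is illustrative.

# Each rule is (body, head); a fact is established once it is in `facts`.
RULES = [({'COLLAT', 'PYMT', 'REP'}, 'OK'),   # 1
         ({'APP'}, 'COLLAT'),                 # 2
         ({'RATING'}, 'REP'),                 # 3
         ({'INC'}, 'PYMT'),                   # 4
         ({'BAL', 'REP'}, 'OK')]              # 5

def forward_chain(facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in RULES:
            if body <= facts and head not in facts:
                facts.add(head)
                changed = True
    return facts

# An applicant with a good appraisal, credit rating, and income:
print('OK' in forward_chain({'APP', 'RATING', 'INC'}))   # True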
(HKUST) Reasoning 52 / 57
Rule Learning

Rule-based expert systems are still widely used.


It takes a lot of effort to collect these rules from experts.
An automatic way of acquiring these rules will be very useful.
We describe a generic separate-and-conquer algorithm (GSCA) for
learning propositional rules.

(HKUST) Reasoning 53 / 57
Loan-Approval Example
Suppose we have the following data from a bank record:
Individual APP RATING INC BAL OK
1 1 0 0 1 0
2 0 0 1 0 0
3 1 1 0 1 1
4 0 1 1 1 1
5 0 1 1 0 0
6 1 1 1 0 1
7 1 1 1 1 1
8 1 0 1 0 0
9 1 1 0 0 0
We want to find out rules about when to approve a loan, i.e. learn rules of
the form:
α1 ∧ · · · ∧ αn ⊃ OK
where αi are atoms from the set {APP, RATING , INC , BAL}.
(HKUST) Reasoning 54 / 57
GSCA
Generic Separate-Conquer Algorithm: Given an atom γ that we want to
learn rules about, and Σ, a training set, GSCA works as follows:
1 Initialize Σcur = Σ.
2 Initialize π = empty set of rules.
3 repeat
1 Initialize Γ = true.
2 Initialize τ = Γ ⊃ γ.
3 repeat
1 Select an atom α from the feature set. This is a nondeterministic step, and
a backtracking point.
2 Let Γ be Γ ∧ α.
4 until τ covers only (or mainly) positive instances in Σcur .
5 Let π be π, τ (add the newly learned rule).
6 Let Σcur be Σcur - {the positive instances in Σcur covered by π}.
4 until π covers all (or most) of the positive instance in Σ.

(HKUST) Reasoning 55 / 57
A Heuristic

A common heuristic is to select the feature α that yields the maximum
value of rα = nα+ /nα , where nα is the total number of instances in Σcur
that are covered by the new rule when α is added to Γ, and nα+ is the
number of positive instances in Σcur that are covered by the new rule.
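
A sketch of GSCA with this heuristic, run on the bank-record table from the earlier slide. It is a deterministic, greedy rendering (ties broken by feature order, no backtracking), and all names are illustrative; on this data it first learns BAL ∧ RATING ⊃ OK and then a second rule for the remaining positive instance.

FEATURES = ['APP', 'RATING', 'INC', 'BAL']
DATA = [  # (APP, RATING, INC, BAL, OK) for individuals 1-9
    (1, 0, 0, 1, 0), (0, 0, 1, 0, 0), (1, 1, 0, 1, 1),
    (0, 1, 1, 1, 1), (0, 1, 1, 0, 0), (1, 1, 1, 0, 1),
    (1, 1, 1, 1, 1), (1, 0, 1, 0, 0), (1, 1, 0, 0, 0)]

def covers(body, row):
    return all(row[FEATURES.index(f)] == 1 for f in body)

def gsca(data):
    rules, remaining = [], list(data)
    while any(row[-1] == 1 for row in remaining):          # positives left
        body = []                                          # Γ, as a list of atoms
        while True:
            covered = [r for r in remaining if covers(body, r)]
            if covered and all(r[-1] == 1 for r in covered):
                break                                      # the rule is pure
            def ratio(f):                                  # r_alpha = n_alpha+ / n_alpha
                c = [r for r in remaining if covers(body + [f], r)]
                return sum(r[-1] for r in c) / len(c) if c else 0.0
            body.append(max((f for f in FEATURES if f not in body), key=ratio))
        rules.append(body)
        remaining = [r for r in remaining
                     if not (covers(body, r) and r[-1] == 1)]
    return rules

for body in gsca(DATA):
    print(' ∧ '.join(body), '⊃ OK')   # BAL ∧ RATING ⊃ OK, then RATING ∧ APP ∧ INC ⊃ OK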

(HKUST) Reasoning 56 / 57
GSCA - The Loan-Approval Example
We begin with the rule: true ⊃ OK .
It covers all instances, including all negative ones. So we have to narrow it
by adding a feature to its antecedent. We do that by choosing the feature
that yields the largest value of rα :
rAPP = 3/6 = 0.5, rRATING = 4/6 = 0.667, rINC = 3/6 = 0.5, rBAL = 3/4 = 0.75.
So we choose BAL, and generate the following rule: BAL ⊃ OK .
This rule still covers the negative instance 1, so we still need to narrow it.
This time we have:
rAPP = 2/3, rRATING = 3/3, rINC = 2/2
so we can choose either RATING or INC . Suppose we choose RATING ,
then this yields the rule:
BAL ∧ RATING ⊃ OK
This rule covers only positive instances, so we have learned one rule. To
learn the next one, we eliminate from the sample the positive instances
already covered by it, and continue as above, until no positive instances are left.
(HKUST) Reasoning 57 / 57
