Representation with Logic
THE WUMPUS WORLD
THE WUMPUS WORLD - PEAS
• Performance measure: +1000 for climbing out of the cave with the
gold, –1000 for falling into a pit or being eaten by the wumpus, –1 for
each action taken and –10 for using up the arrow. The game ends
either when the agent dies or when the agent climbs out of the cave.
• Environment: A 4×4 grid of rooms. The agent always starts in the
square labeled [1,1], facing to the right. The locations of the gold and
the wumpus are chosen randomly, with a uniform distribution, from
the squares other than the start square. In addition, each square
other than the start can be a pit, with probability 0.2.
THE WUMPUS WORLD - PEAS
• Actuators: The agent can move Forward, TurnLeft by 90°, or TurnRight by 90°.
• The agent dies a miserable death if it enters a square containing a pit or a live
wumpus. (It is safe, albeit smelly, to enter a square with a dead wumpus.)
• If an agent tries to move forward and bumps into a wall, then the agent does not
move. The action Grab can be used to pick up the gold if it is in the same square
as the agent.
• The action Shoot can be used to fire an arrow in a straight line in the direction the
agent is facing.
• The arrow continues until it either hits (and hence kills) the wumpus or hits a
wall.
• The agent has only one arrow, so only the first Shoot action has any effect.
• Finally, the action Climb can be used to climb out of the cave, but only from
square [1,1].
THE WUMPUS WORLD - PEAS
• Sensors: The agent has five sensors, each of which gives a single bit of
information:
• In the square containing the wumpus and in the directly (not
diagonally) adjacent squares, the agent will perceive a Stench.
• In the squares directly adjacent to a pit, the agent will perceive a
Breeze.
• In the square where the gold is, the agent will perceive a Glitter.
• When an agent walks into a wall, it will perceive a Bump.
• When the wumpus is killed, it emits a woeful Scream that can be
perceived anywhere in the cave.
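Since each sensor delivers a single bit, a percept is naturally modeled as a 5-tuple. A minimal Python sketch (the Percept type and field names are illustrative assumptions, not from the slides):

```python
from typing import NamedTuple

class Percept(NamedTuple):
    """One bit per sensor, in the order the slides list them."""
    stench: bool
    breeze: bool
    glitter: bool
    bump: bool
    scream: bool

# Example: in [1,1] the agent senses nothing; in [2,1] it feels a breeze.
p11 = Percept(stench=False, breeze=False, glitter=False, bump=False, scream=False)
p21 = Percept(stench=False, breeze=True, glitter=False, bump=False, scream=False)
```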
knowledge-based wumpus agent
(The knowledge-based agent program and its first steps are shown as figures on the original slides.)
LOGIC
• SYNTAX
• x+y=4 is a well-formed sentence
• x4y+= is not
• Semantics or meaning of sentences
• The semantics defines the truth of each sentence with respect to each
possible world
• the semantics for arithmetic specifies that the sentence “x+y=4” is true in a
world where x is 2 and y is 2.
• In standard logics, every sentence must be either true or false in each
possible world—there is no “in between.”
PROPOSITIONAL LOGIC
PROPOSITIONAL LOGIC: A VERY SIMPLE LOGIC
• Syntax:
• The syntax of propositional logic defines the allowable sentences
• The atomic sentences consist of a single proposition symbol
• each symbol stands for a proposition that can be true or false
• use symbols that start with an uppercase letter and may contain other letters
or subscripts, for example: P, Q, R, W1,3 and North
• Ex: we use W1,3 to stand for the proposition that the wumpus is in [1,3].
• True is the always-true proposition and False is the always-false proposition
• Complex sentences are constructed from simpler sentences, using
parentheses and logical connectives
PROPOSITIONAL LOGIC
• Syntax : There are five connectives in common use:
1. ¬ (not). A sentence such as ¬W1,3 is called the negation of W1,3. A literal is
either an atomic sentence (a positive literal) or a negated atomic sentence
(a negative literal).
2. ∧ (and). A sentence whose main connective is ∧, such as W1,3 ∧ P3,1, is called
a conjunction; its parts are the conjuncts.(The ∧ looks like an “A” for “And.”)
3. ∨ (or). A sentence using ∨, such as (W1,3 ∧ P3,1) ∨ W2,2, is a disjunction of the
disjuncts (W1,3 ∧ P3,1) and W2,2. (Historically, the ∨ comes from the Latin “vel,”
which means “or.” For most people, it is easier to remember ∨ as an
upside-down ∧.)
PROPOSITIONAL LOGIC
4. ⇒ (implies). A sentence such as (W1,3 ∧ P3,1) ⇒ ¬W2,2 is called an implication (or
conditional). Its premise or antecedent is (W1,3 ∧ P3,1), and its conclusion or
consequent is ¬W2,2. Implications are also known as rules or if–then
statements. The implication symbol is sometimes written in other books as ⊃ or →.
5. ⇔ (if and only if). The sentence W1,3 ⇔ ¬W2,2 is a biconditional. Some other
books write this as ≡.
PROPOSITIONAL LOGIC
Semantics
• The semantics defines the rules for determining the truth of a
sentence with respect to a particular model
• if the sentences in the knowledge base make use of the proposition
symbols P1,2, P2,2, and P3,1 then one possible model is
• m1={P1,2=false,P2,2=false,P3,1=true}
• With three proposition symbols, there are 2³ = 8 possible models
• truth value—true or false
Semantics
• Truth conditions with respect to a model m:
• ¬P is true iff P is false in m.
• P∧Q is true iff both P and Q are true in m.
• P∨Q is true iff either P or Q is true in m.
• P⇒Q is true unless P is true and Q is false in m.
• P⇔Q is true iff P and Q are both true or both false in m.
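These five rules translate directly into a recursive evaluator. A minimal Python sketch (the tuple-based sentence encoding is an assumption made here, not the slides' notation):

```python
# Sentences: a proposition symbol is a string; complex sentences are tuples
# such as ("not", s), ("and", s1, s2), ("or", s1, s2), ("implies", s1, s2),
# ("iff", s1, s2). A model m maps symbols to True/False.

def pl_true(sentence, m):
    """Evaluate a propositional sentence in model m, per the rules above."""
    if isinstance(sentence, str):
        return m[sentence]
    op, *args = sentence
    if op == "not":
        return not pl_true(args[0], m)
    if op == "and":
        return pl_true(args[0], m) and pl_true(args[1], m)
    if op == "or":
        return pl_true(args[0], m) or pl_true(args[1], m)
    if op == "implies":  # true unless premise true and conclusion false
        return (not pl_true(args[0], m)) or pl_true(args[1], m)
    if op == "iff":
        return pl_true(args[0], m) == pl_true(args[1], m)
    raise ValueError(f"unknown connective: {op}")

m1 = {"P12": False, "P22": False, "P31": True}
print(pl_true(("or", "P12", "P31"), m1))       # True
print(pl_true(("implies", "P31", "P22"), m1))  # False
```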
Semantics
(A truth table for the five connectives appears as a figure on the original slides.)
A simple knowledge base
• We use the following symbols for each [x, y] location:
• Px,y is true if there is a pit in [x, y]
• Wx,y is true if there is a wumpus in [x, y], dead or alive
• Bx,y is true if the agent perceives a breeze in [x, y]
• Sx,y is true if the agent perceives a stench in [x, y]
• There is no pit in [1,1]:
• R1: ¬P1,1.
A simple knowledge base
• A square is breezy if and only if there is a pit in a neighboring square
• R2: B1,1 ⇔ (P1,2 ∨ P2,1).
• R3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
• Now we include the breeze percepts for the first two squares
• R4: ¬B1,1.
• R5: B2,1
NOTE
• we use the term model in place of “possible world.”
• possible worlds might be thought of as (potentially) real
environments that the agent might or might not be in.
• models are mathematical abstractions
• Ex:
• With x men and y women sitting at a table playing bridge, the sentence
x+y=4 is true when there are four people in total. Formally, the possible
models are just all possible assignments of real numbers to the variables x
and y. Each such assignment fixes the truth of any sentence of arithmetic
whose variables are x and y.
NOTE
• If a sentence α is true in model m, we say that m satisfies α or
sometimes m is a model of α
• We write M(α) to mean the set of all models of α
Entailment
• a sentence follows logically from another sentence
• α |= β
• sentence α entails the sentence β
• formal definition of entailment is:
• α |= β if and only if, in every model in which α is true, β is also true
• α |= β if and only if M(α) ⊆ M(β)
• Ex:
• the sentence x=0 entails the sentence xy=0
Entailment
• Wumpus-world Example: Consider the situation in Figure 7.3(b):
Entailment
• The agent has detected nothing in [1,1] and a breeze in [2,1]
• The agent is interested in whether the adjacent squares [1,2], [2,2],
and [3,1] contain pits
• Each of the three squares might or might not contain a pit, so there are
2³ = 8 possible models
Entailment
(Figure 7.5, shown on the original slides, depicts all eight possible models.)
• There are in fact just three models in which the KB is true
• These are shown surrounded by a solid line in Figure 7.5
• Now let us consider two possible conclusions:
• α1=“There is no pit in [1,2].”
• α2=“There is no pit in [2,2].”
• We have surrounded the models of α1 and α2 with dotted lines in
Figures 7.5(a) and 7.5(b), respectively
Entailment
• In every model in which KB is true, α1 is also true
• Hence, KB |= α1: there is no pit in [1,2]
• in some models in which KB is true, α2 is false
• Hence, KB ⊭ α2: the agent cannot conclude that there is no pit in [2,2]
• (Nor can it conclude that there is a pit in [2,2].)
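Entailment can be checked mechanically by enumerating models, exactly as the figure does. A small self-contained Python sketch of this check for R1–R5 (symbol names like P11 are just the subscripts flattened):

```python
from itertools import product

symbols = ["P11", "P12", "P21", "P22", "P31", "B11", "B21"]

def kb(m):
    """R1-R5: no pit in [1,1], breeze rules for [1,1] and [2,1], percepts."""
    return (not m["P11"]
            and m["B11"] == (m["P12"] or m["P21"])
            and m["B21"] == (m["P11"] or m["P22"] or m["P31"])
            and not m["B11"]
            and m["B21"])

def entails(alpha):
    """KB |= alpha iff alpha holds in every model where the KB holds."""
    models = (dict(zip(symbols, values))
              for values in product([False, True], repeat=len(symbols)))
    return all(alpha(m) for m in models if kb(m))

print(entails(lambda m: not m["P12"]))  # True:  KB |= alpha1
print(entails(lambda m: not m["P22"]))  # False: KB does not entail alpha2
```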
logical equivalence
• The first concept is logical equivalence: two sentences α and β are
logically equivalent if they are true in the same set of models
• An alternative definition of equivalence
• α≡β if and only if α|=β and β|=α
• The second concept we will need is validity
• A sentence is valid if it is true in all models
• Ex: the sentence P ∨ ¬P is valid
• Valid sentences are also known as tautologies
PROPOSITIONAL THEOREM PROVING
• Standard logical equivalences (α, β, γ stand for arbitrary sentences):
• (α∧β) ≡ (β∧α) commutativity of ∧
• (α∨β) ≡ (β∨α) commutativity of ∨
• ((α∧β)∧γ) ≡ (α∧(β∧γ)) associativity of ∧
• ((α∨β)∨γ) ≡ (α∨(β∨γ)) associativity of ∨
• ¬(¬α) ≡ α double-negation elimination
• (α⇒β) ≡ (¬β⇒¬α) contraposition
• (α⇒β) ≡ (¬α∨β) implication elimination
• (α⇔β) ≡ ((α⇒β)∧(β⇒α)) biconditional elimination
• ¬(α∧β) ≡ (¬α∨¬β) De Morgan
• ¬(α∨β) ≡ (¬α∧¬β) De Morgan
• (α∧(β∨γ)) ≡ ((α∧β)∨(α∧γ)) distributivity of ∧ over ∨
• (α∨(β∧γ)) ≡ ((α∨β)∧(α∨γ)) distributivity of ∨ over ∧
Inference and proofs
(The proof methods are developed step by step as figures on the original slides.)
• All of the logical equivalences can be used as inference rules
• Ex: biconditional elimination gives two rules: from α⇔β infer (α⇒β)∧(β⇒α),
and from (α⇒β)∧(β⇒α) infer α⇔β. Two further standard rules are Modus
Ponens (from α⇒β and α, infer β) and And-Elimination (from α∧β, infer α).
Resolution
• Unit resolution rule:
  l₁ ∨ ··· ∨ lₖ,   m
  ─────────────────────────────────
  l₁ ∨ ··· ∨ lᵢ₋₁ ∨ lᵢ₊₁ ∨ ··· ∨ lₖ
• where each l is a literal and lᵢ and m are complementary literals (one
is the negation of the other)
Resolution
• Full resolution rule:
  l₁ ∨ ··· ∨ lₖ,   m₁ ∨ ··· ∨ mₙ
  ─────────────────────────────────────────────────────────────
  l₁ ∨ ··· ∨ lᵢ₋₁ ∨ lᵢ₊₁ ∨ ··· ∨ lₖ ∨ m₁ ∨ ··· ∨ mⱼ₋₁ ∨ mⱼ₊₁ ∨ ··· ∨ mₙ
• where lᵢ and mⱼ are complementary literals
• Ex: resolving P1,1 ∨ P3,1 with ¬P1,1 ∨ ¬P2,2 yields P3,1 ∨ ¬P2,2
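The rule is easy to implement over clauses represented as sets of literals. A minimal Python sketch (the string encoding of literals, with "~" for negation, is an assumption made here):

```python
def negate(lit):
    """Complement a literal encoded as a string, e.g. 'P11' <-> '~P11'."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses (frozensets of literals)."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

c1 = frozenset({"P11", "P31"})
c2 = frozenset({"~P11", "~P22"})
print(resolve(c1, c2))  # [frozenset({'P31', '~P22'})]
```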
Conjunctive normal form
• A sentence expressed as a conjunction of clauses is said to be in
conjunctive normal form or CNF
• A procedure for converting to CNF
• Ex: converting the sentence B1,1 ⇔(P1,2∨P2,1) into CNF
• Eliminate ⇔, replacing α⇔β with (α⇒β)∧(β⇒α).
• (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒B1,1).
• Eliminate ⇒, replacing α⇒β with ¬α∨β
• (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬( P1,2 ∨ P2,1) ∨ B1,1)
Conjunctive normal form
• CNF requires ¬ to appear only in literals, so we “move ¬ inwards”
• ¬(¬α)≡α (double-negation elimination)
• ¬(α∧β)≡(¬α∨¬β) (De Morgan)
• ¬(α∨β)≡(¬α∧¬β) (De Morgan)
• Ex:
• (¬B1,1 ∨ P1,2 ∨ P2,1)∧((¬P1,2 ∧ ¬P2,1 )∨ B1,1)
• Now we have a sentence containing nested ∧ and ∨ operators applied
to literals. We apply the distributivity law
• (¬B1,1∨P1,2∨P2,1)∧(¬P1,2∨B1,1)∧(¬P2,1∨B1,1)
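The same conversion can be spot-checked with a library. A quick sketch using sympy (assuming sympy is installed; clause ordering in the output may differ):

```python
from sympy import symbols
from sympy.logic.boolalg import Equivalent, to_cnf

B11, P12, P21 = symbols("B11 P12 P21")
sentence = Equivalent(B11, P12 | P21)   # B1,1 <=> (P1,2 v P2,1)
print(to_cnf(sentence))
# Expected, up to ordering: (B11 | ~P12) & (B11 | ~P21) & (P12 | P21 | ~B11)
```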
Definite Clause
• A definite clause is a disjunction of literals of which exactly one is positive
• Ex:
• the clause (¬L1,1 ∨ ¬Breeze ∨ B1,1) is a definite clause
• (¬B1,1 ∨ P1,2 ∨ P2,1) is not
• Horn clause
• a disjunction of literals of which at most one is positive.
Definite Clause
• Knowledge bases containing only definite clauses are interesting for
three reasons
• Every definite clause can be written as an implication
• Ex:
the definite clause (¬L1,1 ∨ ¬Breeze ∨ B1,1) can be written as the implication
(L1,1∧Breeze) ⇒B1,1
• it says that if the agent is in [1,1] and there is a breeze, then [1,1] is breezy
• Inference with Horn clauses can be done through the forward-chaining and
backward- chaining algorithms
• Deciding entailment with Horn clauses can be done in time that is linear in
the size of the knowledge base
Forward and backward chaining
• The forward-chaining algorithm
• It begins from known facts (positive literals) in the knowledge base
• If all the premises of an implication are known, then its conclusion is
added to the set of known facts
• For example, if L1,1 and Breeze are known and (L1,1 ∧ Breeze) ⇒ B1,1
is in the knowledge base, then B1,1 can be added
• the main point to remember is that it runs in linear time
The forward-chaining algorithm
Forward chaining example
(The algorithm's pseudocode and a step-by-step worked example appear as figures on the original slides; a runnable sketch follows below.)
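Here is a minimal Python sketch of forward chaining for propositional Horn clauses, in the spirit of the algorithm described above (the data layout is an assumption made here):

```python
def fc_entails(rules, facts, query):
    """Forward chaining for Horn KBs.

    rules: list of (premises, conclusion) pairs; premises is a set of symbols.
    facts: iterable of known positive literals.
    With an index from symbols to the rules mentioning them, this runs in
    time linear in the KB size; this sketch scans the rules for clarity.
    """
    count = [len(prem) for prem, _ in rules]   # unmet premises per rule
    inferred = set()
    agenda = list(facts)
    while agenda:
        p = agenda.pop()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (prem, concl) in enumerate(rules):
            if p in prem:
                count[i] -= 1
                if count[i] == 0:
                    agenda.append(concl)
    return False

rules = [({"L11", "Breeze"}, "B11")]
print(fc_entails(rules, ["L11", "Breeze"], "B11"))  # True
```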
The backward-chaining algorithm
• The backward-chaining algorithm, as its name suggests, works
backward from the query.
• Backward chaining is a form of goal-directed reasoning
Backward chaining example
(A step-by-step worked example appears as figures on the original slides; a runnable sketch follows below.)
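A matching Python sketch of backward chaining over the same Horn-clause representation (the visited-set cycle guard is an assumption made here; it keeps the sketch from looping on mutually recursive rules):

```python
def bc_entails(rules, facts, goal, visited=frozenset()):
    """Goal-directed proof: work backward from the query.

    A goal holds if it is a known fact, or if some rule concludes it and
    all of that rule's premises can themselves be proved.
    """
    if goal in facts:
        return True
    if goal in visited:          # avoid infinite regress on cyclic rules
        return False
    return any(
        concl == goal and all(
            bc_entails(rules, facts, p, visited | {goal}) for p in prem)
        for prem, concl in rules)

rules = [({"L11", "Breeze"}, "B11")]
print(bc_entails(rules, {"L11", "Breeze"}, "B11"))  # True
```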
Propositional Logic Limitations
• No capability for uncertainty
• Can’t talk about objects using properties (Ex: size, weight, color)
• No shortcuts (Ex: no “for all” quantifier)
• Next: solution to last two limitations
FIRST-ORDER LOGIC
• examples of objects, relations, and functions
• Objects: people, houses, numbers, theories, Ronald McDonald, colors,
baseball games, wars, centuries...
• Relations: these can be unary relations or properties such as red, round,
bogus, prime, multistoried..., or more general n-ary relations such as brother
of, bigger than, inside, part of, has color, occurred after, owns, comes
between,...
• Functions: father of, best friend, third inning of, one more than, beginning
of...
SYNTAX AND SEMANTICS OF FIRST-ORDER LOGIC
• First, models of first-order logic have objects in them
• Ex: a model with five objects (the figure on the original slide shows Richard
the Lionheart, his brother King John, their two left legs, and a crown)
Symbols and interpretations
• Constant symbols - which stand for objects (Richard and John)
• Predicate symbols - which stand for relations
• (Brother, OnHead, Person, King, and Crown)
• Function symbols - which stand for functions (LeftLeg)
• Each predicate and function symbol comes with an arity that fixes the
number of arguments.
• We adopt the convention that these symbols will begin with
uppercase letters
Quantifiers
• Universal quantification (∀)
• Existential quantification (∃)
Universal quantification (∀)
• ∀ is usually pronounced “For all ...”. (Remember that the upside-down
A stands for “all.”)
• ∀x King(x) ⇒ Person(x)
• All kings are persons
• “For all x, if x is a king, then x is a person.” The symbol x is called a variable
• A term with no variables is called a ground term.
• A common mistake is to use conjunction instead of implication:
• ∀x King(x) ∧ Person(x) would assert all of the following:
• Richard the Lionheart is a king ∧ Richard the Lionheart is a person,
• King John is a king ∧ King John is a person,
• Richard’s left leg is a king ∧ Richard’s left leg is a person
Existential quantification (∃)
• we can make a statement about some object in the universe without
naming it, by using an existential quantifier
• ∃x Crown(x) ∧ OnHead(x,John)
• King John has a crown on his head
• ∃x is pronounced “There exists an x such that ...” or “For some x ...”
• The sentence ∃xP says that P is true for at least one object x
Nested quantifiers
• “Brothers are siblings” can be written as
• ∀x ∀y Brother(x, y) ⇒ Sibling(x, y)
• To say that siblinghood is a symmetric relationship:
• ∀x, y Sibling(x, y) ⇔ Sibling(y, x)
• Everybody loves somebody:
• ∀x ∃y Loves(x, y)
• There is someone who is loved by everyone:
• ∃y ∀x Loves(x, y)
Connections between ∀ and ∃
• ∀x ¬Likes(x, Parsnips) is equivalent to ¬∃x Likes(x, Parsnips)
• ∀x Likes(x, IceCream) is equivalent to ¬∃x ¬Likes(x, IceCream)
• ∀x ¬P ≡ ¬∃x P      ¬(P∨Q) ≡ ¬P∧¬Q
• ¬∀x P ≡ ∃x ¬P      ¬(P∧Q) ≡ ¬P∨¬Q
• ∀x P ≡ ¬∃x ¬P      P∧Q ≡ ¬(¬P∨¬Q)
• ∃x P ≡ ¬∀x ¬P      P∨Q ≡ ¬(¬P∧¬Q)
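Because ∀ and ∃ reduce to all() and any() over a finite domain, these dualities can be spot-checked in a few lines of Python (a finite-domain illustration only, not a proof):

```python
domain = range(10)
P = lambda x: x % 3 == 0          # an arbitrary property on the domain

# forall x not P(x)  ==  not exists x P(x)
assert all(not P(x) for x in domain) == (not any(P(x) for x in domain))
# not forall x P(x)  ==  exists x not P(x)
assert (not all(P(x) for x in domain)) == any(not P(x) for x in domain)
# forall x P(x)      ==  not exists x not P(x)
assert all(P(x) for x in domain) == (not any(not P(x) for x in domain))
print("all duality checks passed")
```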
KNOWLEDGE ENGINEERING IN FIRST-ORDER LOGIC
• A knowledge engineer is someone who investigates a particular
domain, learns what concepts are important in that domain, and
creates a formal representation of the objects and relations in the
domain.
The knowledge-engineering process
1. Identify the task.
The knowledge engineer must delineate the range of questions that the
knowledge base will support and the kinds of facts that will be available for
each specific problem instance. For example, does the wumpus knowledge base
need to be able to choose actions or is it required to answer questions only
about the contents of the environment? Will the sensor facts include the
current location? The task will determine what knowledge must be represented
in order to connect problem instances to answers. This step is analogous to the
PEAS process for designing agents.
The knowledge-engineering process
2. Assemble the relevant knowledge.
The knowledge engineer might already be an expert in the domain, or might
need to work with real experts to extract what they know—a process called
knowledge acquisition. At this stage, the knowledge is not represented formally.
The idea is to understand the scope of the knowledge base, as determined by
the task, and to understand how the domain actually works.
The knowledge-engineering process
3. Decide on a vocabulary of predicates, functions, and constants
• That is, translate the important domain-level concepts into logic-level names.
This involves many questions of knowledge-engineering style. Like
programming style, this can have a significant impact on the eventual success
of the project.
4. Encode general knowledge about the domain.
• The knowledge engineer writes down the axioms for all the vocabulary terms.
This pins down (to the extent possible) the meaning of the terms, enabling
the expert to check the content. Often, this step reveals misconceptions or
gaps in the vocabulary that must be fixed by returning to step 3 and iterating
through the process.
The knowledge-engineering process
5. Encode a description of the specific problem instance.
• If the ontology is well thought out, this step will be easy. It will involve writing
simple atomic sentences about instances of concepts that are already part of
the ontology. For a logical agent, problem instances are supplied by the
sensors, whereas a “disembodied” knowledge base is supplied with additional
sentences in the same way that traditional programs are supplied with input
data.
The knowledge-engineering process
6. Pose queries to the inference procedure and get answers.
• This is where the reward is: we can let the inference procedure operate on
the axioms and problem-specific facts to derive the facts we are interested in
knowing. Thus, we avoid the need for writing an application-specific solution
algorithm.
7. Debug the knowledge base.
• Alas, the answers to queries will seldom be correct on the first try. More
precisely, the answers will be correct for the knowledge base as written,
assuming that the inference procedure is sound, but they will not be the ones
that the user is expecting.
INFERENCE IN FIRST-ORDER LOGIC
PROPOSITIONAL VS. FIRST-ORDER INFERENCE
• Suppose our knowledge base contains this sentence:
• ∀x King(x) ∧ Greedy(x) ⇒ Evil(x).
• Then it seems quite permissible to infer any of the following sentences:
• King(John)∧Greedy(John) ⇒Evil(John)
• King(Richard)∧Greedy(Richard) ⇒Evil(Richard)
• King(Father(John))∧Greedy(Father(John)) ⇒Evil(Father(John)).
• …….
PROPOSITIONAL VS. FIRST-ORDER INFERENCE
• The rule of Universal Instantiation (UI for short) says that we can infer
any sentence obtained by substituting a ground term (a term without
variables) for the variable.
• Let SUBST(θ, α) denote the result of applying the substitution θ to the
sentence α. Then the rule is written (for any variable v and ground
term g):
  ∀v α
  ───────────────
  SUBST({v/g}, α)
Universal Instantiation
• For example, the three sentences given earlier are obtained with the
substitutions {x/John}, {x/Richard}, and {x/Father(John)}.
Existential Instantiation
• In the rule for Existential Instantiation, the variable is replaced by a
single new constant symbol. The formal statement is as follows: for
any sentence α,variable v, and constant symbol k that does not
appear else where in the knowledge base,
Existential Instantiation
• For example, from the sentence
∃x Crown(x) ∧ OnHead(x,John)
we can infer the sentence
Crown(C1) ∧ OnHead(C1, John)
as long as C1 does not appear elsewhere in the knowledge
base.
Ex:
suppose we discover that there is a number that is a little bigger than
2.71828 and that satisfies the equation d(xʸ)/dy = xʸ for x.
We can give this number a name, such as e, but it would be a mistake to give
it the name of an existing object, such as π. In logic, the new name is called a
Skolem constant.
Unification
• Lifted inference rules require finding substitutions that make different
logical expressions look identical. This process is called unification and
is a key component of all first-order inference algorithms
• UNIFY(Knows(John, x), Knows(John, Jane)) = {x/Jane}
• UNIFY(Knows(John, x), Knows(y, Bill)) = {x/Bill, y/John}
• UNIFY(Knows(John, x), Knows(y, Mother(y))) = {y/John, x/Mother(John)}
• UNIFY(Knows(John, x), Knows(x, Elizabeth)) = fail
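A compact Python sketch of UNIFY that reproduces the four results above (the convention that lowercase strings are variables and tuples are compound terms is an assumption made here):

```python
def is_var(t):
    """Variables are lowercase strings; constants start uppercase."""
    return isinstance(t, str) and t[0].islower()

def unify(x, y, s):
    """Return a substitution (dict) making x and y identical, or None."""
    if s is None:
        return None
    if x == y:
        return s
    if is_var(x):
        return unify_var(x, y, s)
    if is_var(y):
        return unify_var(y, x, s)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            s = unify(xi, yi, s)
        return s
    return None

def unify_var(v, t, s):
    if v in s:
        return unify(s[v], t, s)
    if is_var(t) and t in s:
        return unify(v, s[t], s)
    return {**s, v: t}   # occur check omitted for brevity

K = lambda a, b: ("Knows", a, b)
print(unify(K("John", "x"), K("John", "Jane"), {}))        # {'x': 'Jane'}
print(unify(K("John", "x"), K("y", "Bill"), {}))           # {'y': 'John', 'x': 'Bill'}
print(unify(K("John", "x"), K("y", ("Mother", "y")), {}))  # y -> John, x -> Mother(y)
print(unify(K("John", "x"), K("x", "Elizabeth"), {}))      # None (fail)
```

Note that the third result is a triangular substitution: x is bound to Mother(y) with y bound to John, which is equivalent to {y/John, x/Mother(John)} once fully applied.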
First-order definite clauses
• They are disjunctions of literals of which exactly one is positive
• The following are first-order definite clauses:
• King(x) ∧ Greedy(x) ⇒ Evil(x).
• King(John).
• Greedy(y).
First-order definite clauses
• Ex:
• The law says that it is a crime for an American to sell weapons to hostile
nations. The country Nono, an enemy of America, has some missiles, and all
of its missiles were sold to it by Colonel West, who is American.
• We will prove that West is a criminal
• First, we will represent these facts as first-order definite clauses.
First-order definite clauses
• “...it is a crime for an American to sell weapons to hostile nations”:
• American(x)∧Weapon(y)∧Sells(x, y, z)∧Hostile(z) ⇒Criminal(x).
• “Nono...has some missiles.”
• The sentence ∃x Owns(Nono, x) ∧ Missile(x) is transformed into two definite clauses by
Existential Instantiation, introducing a new constant M1:
• Owns (Nono,M1)
• Missile(M1)
• “All of its missiles were sold to it by Colonel West”:
• Missile(x)∧Owns(Nono,x) ⇒Sells(West,x,Nono).
• We will also need to know that missiles are weapons:
• Missile(x)⇒Weapon(x)
First-order definite clauses
• An enemy of America counts as “hostile”:
• Enemy(x,America) ⇒Hostile(x).
• “West, who is American...”:
• American(West).
• “The country Nono, an enemy of America...”:
• Enemy(Nono,America)
A simple forward-chaining algorithm
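The algorithm itself appears as a pseudocode figure on the original slide. Below is a naive, self-contained Python sketch (the tuple encoding, lowercase-variable convention, and helper names are assumptions made here) that keeps applying the crime-example rules until no new facts appear:

```python
def is_var(t):
    return isinstance(t, str) and t[0].islower()

def subst(s, t):
    """Apply substitution s to term t."""
    if isinstance(t, tuple):
        return tuple(subst(s, x) for x in t)
    return s.get(t, t)

def match(pattern, fact, s):
    """Match a pattern (may contain variables) against a ground fact."""
    if is_var(pattern):
        if pattern in s:
            return s if s[pattern] == fact else None
        return {**s, pattern: fact}
    if pattern == fact:
        return s
    if isinstance(pattern, tuple) and isinstance(fact, tuple) and len(pattern) == len(fact):
        for p, f in zip(pattern, fact):
            s = match(p, f, s)
            if s is None:
                return None
        return s
    return None

def all_matches(premises, facts, s):
    """Yield every substitution satisfying all premises against the facts."""
    if not premises:
        yield s
        return
    for f in facts:
        s2 = match(premises[0], f, s)
        if s2 is not None:
            yield from all_matches(premises[1:], facts, s2)

def forward_chain(rules, facts):
    facts = set(facts)
    while True:
        new = {subst(s, concl)
               for prems, concl in rules
               for s in all_matches(prems, facts, {})} - facts
        if not new:
            return facts
        facts |= new

rules = [
    ([("American", "x"), ("Weapon", "y"), ("Sells", "x", "y", "z"),
      ("Hostile", "z")], ("Criminal", "x")),
    ([("Missile", "x"), ("Owns", "Nono", "x")], ("Sells", "West", "x", "Nono")),
    ([("Missile", "x")], ("Weapon", "x")),
    ([("Enemy", "x", "America")], ("Hostile", "x")),
]
facts = {("Owns", "Nono", "M1"), ("Missile", "M1"),
         ("American", "West"), ("Enemy", "Nono", "America")}
print(("Criminal", "West") in forward_chain(rules, facts))  # True
```

A real implementation would index facts by predicate instead of scanning them, but the derivation order matches the example: Weapon(M1), Sells(West, M1, Nono), and Hostile(Nono) appear first, then Criminal(West).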
A backward-chaining algorithm
Conjunctive normal form for first-order logic
• Ex:
• “Everyone who loves all animals is loved by someone,” or
• ∀x[∀y Animal(y) ⇒Loves(x, y)] ⇒[∃y Loves(y,x)].
• Eliminate implications:
• ∀x [¬∀y (¬Animal(y) ∨ Loves(x, y))] ∨ [∃y Loves(y, x)]
• Move ¬ inwards
• ¬∀xp becomes ∃x¬p
• ¬∃xp becomes ∀x¬p.
• ∀x[∃y ¬(¬Animal(y)∨Loves(x, y))]∨[∃y Loves(y,x)].
• ∀x[∃y ¬¬Animal(y)∧¬Loves(x, y)]∨[∃y Loves(y,x)].
• ∀x[∃y Animal(y)∧¬Loves(x, y)]∨[∃y Loves(y,x)].
Conjunctive normal form for first-order logic
• Standardize variables: For sentences like (∃xP(x))∨(∃xQ(x)) which use
the same variable name twice, change the name of one of the
variables
• ∀x[∃y Animal(y)∧¬Loves(x, y)]∨[∃z Loves(z,x)]
• Skolemize: Skolemization is the process of removing existential
quantifiers by elimination.
• ∀x [Animal(A) ∧ ¬Loves(x, A)] ∨ Loves(B, x) – wrong idea: the Skolem
constants A and B ignore the dependence on x
• ∀x [Animal(F(x)) ∧ ¬Loves(x, F(x))] ∨ Loves(G(x), x), where F and G are
Skolem functions of the enclosing universally quantified variable x
Conjunctive normal form for first-order logic
• Drop universal quantifiers:
• [Animal(F(x)) ∧ ¬Loves(x, F(x))] ∨ Loves(G(x), x).
• Distribute ∨ over ∧:
• [Animal(F(x)) ∨ Loves(G(x), x)] ∧ [¬Loves(x, F(x)) ∨ Loves(G(x), x)]
Summary
• REPRESENTATION WITH LOGIC
• PROPOSITIONAL LOGIC
• PROPOSITIONAL THEOREM PROVING
• FORWARD AND BACKWARD CHAINING
• FIRST-ORDER LOGIC
• INFERENCE IN FIRST-ORDER LOGIC