Logical reasoning 21.1.13
  1. Unit II: Logical Reasoning
     V. Saranya, AP/CSE
     Sri Vidya College of Engineering and Technology, Virudhunagar
  2. 2. Logical AgentsKnowledge based AgentsThe Wumpus WorldLogic
  3. February 20, 2006, AI: Chapter 7: Logical Agents
     Logical Agents
     • Humans can know "things" and "reason"
       – Representation: How are the things stored?
       – Reasoning: How is the knowledge used?
         • To solve a problem…
         • To generate more knowledge…
     • Knowledge and reasoning are important to artificial agents because they enable successful behaviors that are difficult to achieve otherwise
       – Useful in partially observable environments
     • Can benefit from knowledge in very general forms, combining and recombining information
  4. Knowledge-Based Agents
     • The central component of a knowledge-based agent is a knowledge base (KB)
       – A set of sentences in a formal language
     • Sentences are expressed using a knowledge representation language
     • Two generic functions:
       – TELL: add new sentences (facts) to the KB
         • "Tell it what it needs to know"
       – ASK: query what is known from the KB
         • "Ask what to do next"
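The TELL/ASK interface can be sketched as a toy class. The `KnowledgeBase` name, the tuple-shaped sentences, and the membership-test ASK below are illustrative assumptions only; in a real agent, ASK would invoke logical inference rather than a simple lookup:

```python
class KnowledgeBase:
    """A minimal knowledge base: a set of sentences with TELL and ASK.
    Sentences here are simple (subject, relation, object) facts, and ASK
    only checks membership - a stand-in for genuine inference."""

    def __init__(self):
        self.sentences = set()

    def tell(self, sentence):
        """TELL: add a new sentence (fact) to the KB."""
        self.sentences.add(sentence)

    def ask(self, sentence):
        """ASK: query whether a sentence is known (trivially, by lookup)."""
        return sentence in self.sentences


kb = KnowledgeBase()
kb.tell(("square[1,2]", "adjacent-to", "square[1,1]"))
print(kb.ask(("square[1,2]", "adjacent-to", "square[1,1]")))  # True
print(kb.ask(("pit", "in", "square[1,1]")))                   # False
```

The point of the sketch is the interface, not the storage: any representation works as long as the agent only interacts with the KB through TELL and ASK.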
  5. Knowledge-Based Agents
     • The agent must be able to:
       – Represent states and actions
       – Incorporate new percepts
       – Update internal representations of the world
       – Deduce hidden properties of the world
       – Deduce appropriate actions
     [Figure: an inference engine (domain-independent algorithms) operating on a knowledge base (domain-specific content)]
  6. Knowledge-Based Agents [figure not reproduced]
  7. Knowledge-Based Agents
     • Declarative
       – You can build a knowledge-based agent simply by "TELLing" it what it needs to know
     • Procedural
       – Encode desired behaviors directly as program code
       – Minimizing the role of explicit representation and reasoning can result in a much more efficient system
  8. Wumpus World
     • Performance measure
       – Gold +1000, death -1000
       – Each step -1, using the arrow -10
     • Environment
       – Squares adjacent to the Wumpus are smelly
       – Squares adjacent to a pit are breezy
       – Glitter iff the gold is in the same square
       – Shooting kills the Wumpus if you are facing it
       – Shooting uses up the only arrow
       – Grabbing picks up the gold if in the same square
       – Releasing drops the gold in the same square
     • Actuators
       – Turn left, turn right, forward, grab, release, shoot
     • Sensors
       – Breeze, glitter, and smell
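The environment rules above (smell next to the Wumpus, breeze next to a pit, glitter with the gold) can be sketched as a small percept function. The function names are hypothetical, and the board layout in the usage line follows the standard AIMA figure (Wumpus at [1,3], pits at [3,1], [3,3], [4,4], gold at [2,3]), which this deck's missing figures presumably also used:

```python
def adjacent(a, b):
    """Two squares are adjacent if they differ by 1 in exactly one coordinate."""
    (ax, ay), (bx, by) = a, b
    return abs(ax - bx) + abs(ay - by) == 1

def percepts(square, wumpus, pits, gold):
    """Percepts in a square, per the environment rules: stench next to
    the Wumpus, breeze next to any pit, glitter on the gold's square."""
    return {
        "stench":  adjacent(square, wumpus),
        "breeze":  any(adjacent(square, p) for p in pits),
        "glitter": square == gold,
    }

# Assumed AIMA layout: Wumpus (1,3), pits (3,1),(3,3),(4,4), gold (2,3).
p = percepts((2, 3), (1, 3), [(3, 1), (3, 3), (4, 4)], (2, 3))
print(p)  # {'stench': True, 'breeze': True, 'glitter': True}
```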
  9. Wumpus World
     • Characterization of the Wumpus World
       – Observable? Partially: only local perception
       – Deterministic? Yes: outcomes are specified
       – Episodic? No: sequential at the level of actions
       – Static? Yes: the Wumpus and pits do not move
       – Discrete? Yes
       – Single agent? Yes
  10. Wumpus World
      [Slides 10-17 step through the agent's exploration of the 4x4 grid; figures not reproduced]
  18. Logic
     • Syntax:
       – A knowledge base consists of sentences
       – These sentences are expressed according to the syntax
       – Example: "x + y = 4" is a well-formed sentence
     • Semantics:
       – The meaning of a sentence
       – Defines the truth of each sentence
  19. Logic
     • Entailment means that one thing follows logically from another: a |= b
     • a |= b iff in every model in which a is true, b is also true
     • If a is true, then b must be true
     • The truth of b is "contained" in the truth of a
  20. Example
     • "x + y = 4" is true where x is 2 and y is 2
     • But false where x is 1 and y is 1
     • (Every sentence must be true or false in each possible world.)
  21. Entailment
     • A relation between logical sentences.
     • One sentence follows logically from another.
     • Notation: α |= β
       – Means sentence α entails sentence β
       – Holds iff in every model in which α is true, β is also true
       – (Simply: if α is true, then β must be true)
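This definition suggests a direct model-checking test: enumerate every model over the proposition symbols and verify that β holds in each model where α (the KB) holds. A minimal sketch; the `tt_entails` name and the lambda encoding of sentences are illustrative choices, not notation from the slides:

```python
from itertools import product

def tt_entails(symbols, kb, alpha):
    """Return True iff kb |= alpha, by enumerating every model
    (truth assignment) over the given proposition symbols."""
    for values in product([False, True], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):
            return False  # a model where the KB holds but alpha fails
    return True

# Example: KB = (p and q) entails alpha = p, but not beta = (not p).
kb    = lambda m: m["p"] and m["q"]
alpha = lambda m: m["p"]
beta  = lambda m: not m["p"]
print(tt_entails(["p", "q"], kb, alpha))  # True
print(tt_entails(["p", "q"], kb, beta))   # False
```

Enumeration is exponential in the number of symbols, which is why practical inference algorithms improve on it, but it is the definition of entailment made executable.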
  22. Logic in the Wumpus World
     • The agent detects nothing in [1,1].
     • The agent is interested in whether the squares [1,2], [2,2], and [3,1] contain pits.
     • Each of these squares may or may not contain a pit.
     • So there are 2^3 = 8 possible models.
  23. No pit in [1,2]
  24. No pit in [2,2]
  25. Conclusion
     • α1: there is no pit in [1,2]
     • α2: there is no pit in [2,2]
     • KB is false in any model in which [1,2] contains a pit, because there is no breeze in [1,1].
     • In every model in which KB is true, α1 is also true; hence KB |= α1.
     • In some models in which KB is true, α2 is false; hence KB does not entail α2, and the agent cannot conclude that there is no pit in [2,2].
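This conclusion can be checked by enumerating the 8 models directly. Following the standard AIMA version of this example, the sketch below assumes the agent has also perceived a breeze in [2,1] (that percept is what makes α2 genuinely undecidable); the function names are illustrative:

```python
from itertools import product

# The three unvisited squares whose pit status is unknown.
squares = ["[1,2]", "[2,2]", "[3,1]"]

def kb_holds(model):
    """KB from the percepts 'no breeze in [1,1]' and 'breeze in [2,1]':
    no breeze in [1,1] -> no pit in [1,2] (the only unknown neighbour);
    breeze in [2,1]    -> a pit in [2,2] or [3,1]."""
    pit = dict(zip(squares, model))
    return (not pit["[1,2]"]) and (pit["[2,2]"] or pit["[3,1]"])

def entails(alpha):
    """KB |= alpha iff alpha holds in every model where the KB holds."""
    return all(alpha(dict(zip(squares, m)))
               for m in product([False, True], repeat=3)
               if kb_holds(m))

alpha1 = lambda pit: not pit["[1,2]"]   # "there is no pit in [1,2]"
alpha2 = lambda pit: not pit["[2,2]"]   # "there is no pit in [2,2]"

print(entails(alpha1))  # True  - the agent may conclude [1,2] is safe
print(entails(alpha2))  # False - [2,2] may or may not contain a pit
```

Only 3 of the 8 models satisfy the KB, α1 holds in all 3, and α2 fails in one of them, matching the conclusion on the slide.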
  26. Inference (deriving conclusions)
     • If a sentence a can be derived from KB by an inference algorithm i, we write:
       KB |-i a
     • Read as "a is derived from KB by i" or "i derives a from KB".
  27. Sound
     • An inference algorithm that derives only entailed sentences is called "sound" or "truth-preserving".
     • Soundness is a highly desirable property.
     • Correspondence between world and representation:
       [Figure: sentences in the representation entail new sentences; via semantics, each sentence corresponds to aspects of the real world, which follow from one another]
  28. Illustration
     • Sentences are physical configurations of the agent.
     • Reasoning is the process of constructing new physical configurations from old ones.
     • The connection between the logical reasoning process and the real environment in which the agent exists is called "grounding".
