Artificial intelligence and knowledge representation
    Presentation Transcript

    • ARTIFICIAL INTELLIGENCE AND KNOWLEDGE REPRESENTATION
    • WHAT MAKES THE COMPUTER INTELLIGENT?
      • SPEED OF COMPUTATION
      • FILTERS OUT NOISE AND DISPLAYS ONLY MEANINGFUL RESPONSES OR SOLUTIONS TO A SPECIFIC QUESTION
      • ALGORITHMS SPLIT TASKS INTO SUBTASKS – RECURSION
      • NEURAL NETWORKS
    • WHY ARTIFICIAL INTELLIGENCE?
      • UNLIKE HUMANS, COMPUTERS HAVE TROUBLE UNDERSTANDING SPECIFIC SITUATIONS, AND ADAPTING TO NEW SITUATIONS.
      • ARTIFICIAL INTELLIGENCE IMPROVES MACHINE BEHAVIOR IN TACKLING SUCH COMPLEX TASKS, BASED ON ABSTRACT THOUGHT, HIGH-LEVEL DELIBERATIVE REASONING AND PATTERN RECOGNITION.
      • ARTIFICIAL INTELLIGENCE CAN HELP US UNDERSTAND THIS PROCESS BY RECREATING IT, THEN POTENTIALLY ENABLING US TO ENHANCE IT BEYOND OUR CURRENT CAPABILITIES
    • KNOWLEDGE REPRESENTATION? EXAMPLE: THE MISSIONARIES-AND-CANNIBALS PROBLEM
      • THREE MISSIONARIES AND THREE CANNIBALS COME TO A RIVER AND FIND A BOAT THAT HOLDS TWO. IF THE CANNIBALS EVER OUTNUMBER THE MISSIONARIES ON EITHER BANK, THE MISSIONARIES WILL BE EATEN. HOW SHALL THEY CROSS?
      • HERE COMES THE IMPORTANCE OF KNOWLEDGE: ALTHOUGH THIS PROBLEM CAN BE SOLVED BY INTELLIGENT SEARCH ALGORITHMS, KNOWLEDGE PLAYS THE MOST CRUCIAL PART
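The search approach the slide alludes to can be sketched as a small breadth-first search over problem states. The state encoding (missionaries, cannibals, and boat position on the left bank) and all names below are illustrative, not from the original deck:

```python
from collections import deque

def solve_missionaries_cannibals():
    """Breadth-first search over states (m_left, c_left, boat_side)."""
    def safe(m, c):
        # Missionaries must not be outnumbered on any bank they occupy.
        left_ok = m == 0 or m >= c
        right_ok = (3 - m) == 0 or (3 - m) >= (3 - c)
        return left_ok and right_ok

    start, goal = (3, 3, 1), (0, 0, 0)   # boat: 1 = left bank, 0 = right bank
    moves = [(1, 0), (2, 0), (0, 1), (0, 2), (1, 1)]  # boat carries one or two
    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (m, c, b), path = frontier.popleft()
        if (m, c, b) == goal:
            return path
        sign = -1 if b == 1 else 1       # people leave the bank the boat is on
        for dm, dc in moves:
            nm, nc = m + sign * dm, c + sign * dc
            state = (nm, nc, 1 - b)
            if 0 <= nm <= 3 and 0 <= nc <= 3 and safe(nm, nc) and state not in seen:
                seen.add(state)
                frontier.append((state, path + [state]))
    return None

plan = solve_missionaries_cannibals()
```

BFS returns a shortest plan; the classic puzzle needs 11 crossings (12 states including the start).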
    • NEED FOR FORMAL LANGUAGES
      • CONSIDER AN ENGLISH SENTENCE LIKE: “THE BOY SAW A GIRL WITH A TELESCOPE”
      • NATURAL LANGUAGES EXHIBIT AMBIGUITY
      • AMBIGUITY NOT ONLY MAKES IT DIFFICULT TO UNDERSTAND THE INTENDED MEANING OF PHRASES AND SENTENCES, BUT ALSO MAKES IT VERY DIFFICULT TO MAKE INFERENCES
      • SYMBOLIC LOGIC IS A SYNTACTICALLY UNAMBIGUOUS KNOWLEDGE REPRESENTATION LANGUAGE (ORIGINALLY DEVELOPED IN AN ATTEMPT TO FORMALIZE MATHEMATICAL REASONING)
    • KNOWLEDGE REPRESENTATION TECHNIQUES IN AI: PROPOSITIONAL LOGIC
      • A PROPOSITION IS A DECLARATIVE STATEMENT
      • ~ -> NEGATION
      • -> -> IMPLICATION
      • ↔ -> IMPLIES AND IMPLIED BY (BICONDITIONAL)
      • V -> DISJUNCTION
      • ^ -> CONJUNCTION
      • IN PROPOSITIONAL LOGIC, SENTENCES REPRESENT WHOLE PROPOSITIONS, E.G. “2 IS PRIME.” = P, “I ATE BREAKFAST TODAY.” = Q
    • SYNTAX
      • SYNTAX = WHAT A SENTENCE LOOKS LIKE
      • SENTENCE -> ATOMICSENTENCE | COMPLEXSENTENCE
      • ATOMICSENTENCE -> T(RUE) | F(ALSE) | SYMBOL
      • COMPLEXSENTENCE -> ( SENTENCE ) | NOT SENTENCE | SENTENCE CONNECTIVE SENTENCE
      • CONNECTIVE -> AND | OR | IMPLIES | EQUIV(ALENT)
      • SYMBOL -> P | Q | R | ...
      • PRECEDENCE (HIGHEST TO LOWEST): NEGATION, CONJUNCTION, DISJUNCTION, IMPLICATION, EQUIVALENCE
    • SEMANTICS
      • SEMANTICS = WHAT A SENTENCE MEANS
      • INTERPRETATION: ASSIGNS EACH SYMBOL A TRUTH VALUE, EITHER T(RUE) OR F(ALSE)
      • THE TRUTH VALUE OF T(RUE) IS T(RUE); THE TRUTH VALUE OF F(ALSE) IS F(ALSE)
      • TRUTH TABLES (“COMPOSITIONAL SEMANTICS”): THE MEANING OF A SENTENCE IS A FUNCTION OF THE MEANING OF ITS PARTS
    • TERMINOLOGY
      • A SENTENCE IS VALID IF IT IS TRUE UNDER ALL POSSIBLE ASSIGNMENTS OF TRUE/FALSE TO ITS PROPOSITIONAL VARIABLES (E.G. P V ~P); VALID SENTENCES ARE ALSO REFERRED TO AS TAUTOLOGIES
      • A SENTENCE IS SATISFIABLE IF AND ONLY IF THERE IS SOME ASSIGNMENT OF TRUE/FALSE TO ITS PROPOSITIONAL VARIABLES FOR WHICH THE SENTENCE IS TRUE
      • A SENTENCE IS UNSATISFIABLE IF AND ONLY IF IT IS NOT SATISFIABLE (E.G. P ^ ~P)
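These three definitions suggest a brute-force check: enumerate every interpretation of the propositional variables and inspect the results. A minimal sketch, where the helper name and the encoding of formulas as Python functions are my own:

```python
from itertools import product

def truth_table_check(symbols, formula):
    """Evaluate `formula` under every interpretation of `symbols`
    and classify the sentence as valid / satisfiable / unsatisfiable."""
    results = [formula(*values)
               for values in product([False, True], repeat=len(symbols))]
    return {
        "valid": all(results),        # tautology: true in every interpretation
        "satisfiable": any(results),  # true in at least one interpretation
        "unsatisfiable": not any(results),
    }

# P V ~P is valid; P ^ ~P is unsatisfiable; P -> Q is satisfiable but not valid
law_excluded_middle = truth_table_check(["P"], lambda p: p or not p)
contradiction = truth_table_check(["P"], lambda p: p and not p)
implication = truth_table_check(["P", "Q"], lambda p, q: (not p) or q)
```

Enumeration takes 2^n interpretations for n variables, which is exactly the truth-table semantics from the previous slide made executable.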
    • EXAMPLES
      • EITHER I GO TO THE MOVIES OR I GO SWIMMING (INCLUSIVE VS. EXCLUSIVE OR)
      • 2 IS PRIME IMPLIES THAT 2 IS EVEN (IMPLICATION DOES NOT IMPLY CAUSALITY)
      • 2 IS ODD IMPLIES THAT 3 IS EVEN (FALSE IMPLIES EVERYTHING)
    • SEMANTIC NETWORKS
      • GRAPH STRUCTURES THAT ENCODE TAXONOMIC KNOWLEDGE OF OBJECTS AND THEIR PROPERTIES
        • OBJECTS REPRESENTED AS NODES
        • RELATIONS REPRESENTED AS LABELED EDGES
      • INHERITANCE = FORM OF INFERENCE IN WHICH SUBCLASSES INHERIT PROPERTIES OF SUPERCLASSES
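A semantic network with inheritance can be sketched with plain dictionaries: is-a edges form the taxonomy, and inference walks up the chain. The canary/bird/animal taxonomy below is a stock textbook illustration, not from the deck:

```python
# Labeled is-a edges (node -> superclass) and per-node properties.
is_a = {"canary": "bird", "bird": "animal"}
properties = {
    "animal": {"breathes"},
    "bird": {"flies"},
    "canary": {"sings"},
}

def inherited_properties(node):
    """Collect a node's own properties plus everything inherited up
    the is-a chain -- the inference semantic networks support."""
    props = set()
    while node is not None:
        props |= properties.get(node, set())
        node = is_a.get(node)
    return props
```

For example, `inherited_properties("canary")` combines "sings" (its own) with "flies" and "breathes" (inherited from its superclasses).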
    • FRAMES
      • A LIMITATION OF SEMANTIC NETWORKS IS THAT ADDITIONAL STRUCTURE IS OFTEN NECESSARY TO DISTINGUISH
        • STATEMENTS ABOUT AN OBJECT’S RELATIONSHIPS
        • PROPERTIES OF THE OBJECT
      • A FRAME IS A NODE WITH ADDITIONAL STRUCTURE THAT FACILITATES DIFFERENTIATING RELATIONSHIPS BETWEEN OBJECTS AND PROPERTIES OF OBJECTS; CALLED A “SLOT-AND-FILLER” REPRESENTATION
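The slot-and-filler idea can be illustrated with frames as dictionaries, where a slot missing from a frame is filled from its parent frame. The Bird/Penguin frames and slot names are hypothetical:

```python
# Each frame is a dict of slots; "is_a" links to a parent frame.
frames = {
    "Bird": {"locomotion": "flies", "covering": "feathers"},
    "Penguin": {"is_a": "Bird", "locomotion": "walks"},  # slot override
}

def get_slot(name, slot):
    """Return a slot's filler, falling back to the parent frame
    (default inheritance with local overrides)."""
    while name is not None:
        frame = frames[name]
        if slot in frame:
            return frame[slot]
        name = frame.get("is_a")
    return None
```

A Penguin overrides `locomotion` locally but still inherits `covering` from Bird, which is the distinction frames add over bare semantic-network nodes.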
    • NORMAL FORM IN PREDICATE LOGIC: RULES
      • 1. ELIMINATE -> AND ↔ BY USING EQUIVALENT FORMULAS.
      • 2. REPEATEDLY USE DOUBLE NEGATION ~(~P) = P AND DE MORGAN’S LAWS ~(G ^ F) = ~G V ~F AND ~(G V F) = ~G ^ ~F TO BRING NEGATION IN FRONT OF EACH ATOM.
      • 3. USE ~(∀X)F(X) = (∃X)~F(X) AND ~(∃X)F(X) = (∀X)~F(X), THEN USE ALL THE EQUIVALENT EXPRESSIONS TO BRING THE QUANTIFIERS IN FRONT OF THE EXPRESSION.
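Step 2 (pushing negation inward via double negation and De Morgan's laws) can be sketched for the propositional case. The tuple encoding of formulas is my own, and quantifiers are omitted for brevity:

```python
# Formula AST: an atom is a str; compounds are ("not", f), ("and", f, g), ("or", f, g).
def nnf(f):
    """Rewrite a formula so negation appears only in front of atoms
    (negation normal form), using ~~P = P and De Morgan's laws."""
    if isinstance(f, str):
        return f
    op = f[0]
    if op == "not":
        g = f[1]
        if isinstance(g, str):
            return ("not", g)                 # negation already at an atom
        if g[0] == "not":                     # ~(~P) = P
            return nnf(g[1])
        if g[0] == "and":                     # ~(G ^ F) = ~G V ~F
            return ("or", nnf(("not", g[1])), nnf(("not", g[2])))
        if g[0] == "or":                      # ~(G V F) = ~G ^ ~F
            return ("and", nnf(("not", g[1])), nnf(("not", g[2])))
    return (op, nnf(f[1]), nnf(f[2]))         # recurse into and/or
```

For instance, ~(P ^ ~Q) rewrites to ~P V Q, with every negation now attached directly to an atom.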
    • RESOLUTION IN PREDICATE LOGIC
      • I) R(A)
      • II) R(X) -> M(X,B)
      • FIRST SUBSTITUTE A FOR X IN THE 2ND PREMISE AND CONCLUDE M(A,B).
      • E.G.  
      • MARCUS WAS A MAN. MAN (MARCUS)
      • MARCUS WAS A POMPEIAN. POMPEIAN (MARCUS)
      • CAESAR WAS A RULER. RULER (CAESAR)
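The substitution step described above (match R(A) against R(X), bind X := A, then apply the binding to the rule head) can be sketched as follows; atoms are encoded as (predicate, argument-tuple), which is an illustrative encoding of my own:

```python
def unify(pattern, fact, variables=frozenset({"X"})):
    """Match an atom like R(X) against a ground fact like R(A);
    return the variable binding, or None if they do not match."""
    if pattern[0] != fact[0] or len(pattern[1]) != len(fact[1]):
        return None
    binding = {}
    for p, f in zip(pattern[1], fact[1]):
        if p in variables:
            if binding.setdefault(p, f) != f:  # same variable must bind consistently
                return None
        elif p != f:                           # constants must match exactly
            return None
    return binding

def substitute(atom, binding):
    """Apply a variable binding to an atom."""
    pred, args = atom
    return (pred, tuple(binding.get(a, a) for a in args))

# Premises: R(A), and the rule R(X) -> M(X, B)
fact = ("R", ("A",))
rule_body, rule_head = ("R", ("X",)), ("M", ("X", "B"))
binding = unify(rule_body, fact)
conclusion = substitute(rule_head, binding)
```

Unifying the fact with the rule body yields the binding X := A, and applying it to the head produces the conclusion M(A, B), exactly as in the slide's worked step.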
    • NONMONOTONIC REASONING
      • IN MONOTONIC REASONING, THE COLLECTION OF TRUE FACTS NEVER DECREASES
      • IN PRACTICE, HOWEVER, FACTS CHANGE WITH TIME
      • ACCORDING TO THE HUMAN PROBLEM-SOLVING APPROACH, THE TRUTH STATUS OF COLLECTED FACTS MAY BE REVISED BASED ON CONTRARY EVIDENCE.
      • HENCE A NONMONOTONIC REASONING SYSTEM IS MORE EFFECTIVE IN MANY PRACTICAL PROBLEM-SOLVING SITUATIONS.
    • PRINCIPLES OF NMRS
      • IF X IS NOT KNOWN, THEN CONCLUDE Y
      • IF X CANNOT BE PROVED, THEN CONCLUDE Y
      • E.G. 1: TO BUILD A PROGRAM THAT GENERATES A SOLUTION TO A FAIRLY SIMPLE PROBLEM.
      • E.G. 2: TO FIND A TIME AT WHICH THREE BUSY PEOPLE CAN ALL ATTEND A MEETING
      • DEPENDENCY-DIRECTED BACKTRACKING
    • NECESSITY OF NMR
      • THE PRESENCE OF INCOMPLETE INFORMATION REQUIRES DEFAULT REASONING.
      • A CHANGING WORLD MUST BE DESCRIBED BY A CHANGING DATABASE.
      • GENERATING A COMPLETE SOLUTION TO A PROBLEM MAY REQUIRE TEMPORARY ASSUMPTIONS ABOUT PARTIAL SOLUTIONS.
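Default reasoning under incomplete information is commonly implemented as negation as failure: conclude the default unless contrary evidence is provable. A toy sketch, where the bird/abnormal predicates are a standard textbook illustration rather than anything from the deck:

```python
def flies(facts):
    """Default rule: bird(x) and NOT provable abnormal(x) => flies(x).
    'Not provable' is modeled as simple absence from the fact base."""
    return "bird" in facts and "abnormal" not in facts

before = flies({"bird"})             # default conclusion holds
after = flies({"bird", "abnormal"})  # contrary evidence arrives; conclusion withdrawn
```

Adding the fact "abnormal" retracts the earlier conclusion: the set of derived truths shrinks as knowledge grows, which is exactly what makes the reasoning nonmonotonic.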
    • APPLICATIONS OF AI
      • PATTERN RECOGNITION
      • ROBOTICS
      • NATURAL LANGUAGE PROCESSING
      • ARTIFICIAL LIFE
      • APPLICATIONS OF AI, BY INTELLIGENT ALGORITHMS:
        • MECHANICAL TRANSLATION
        • GAME PLAYING
        • COMPUTER VISION
        • COMPUTER HEARING
        • CREATING ORIGINAL THOUGHTS OR WORKS OF ART
        • ANALOGICAL THINKING AND LEARNING
    • FUNDAMENTAL PROBLEMS OF AI
      • 1. THE ABILITY OF EVEN THE MOST ADVANCED OF CURRENTLY EXISTING COMPUTER SYSTEMS TO ACQUIRE INFORMATION ALL BY ITSELF IS STILL EXTREMELY LIMITED.
      • 2. IT IS NOT OBVIOUS THAT ALL HUMAN KNOWLEDGE IS ENCODABLE IN “INFORMATION STRUCTURES”, HOWEVER COMPLEX. A HUMAN MAY KNOW, FOR EXAMPLE, JUST WHAT KIND OF EMOTIONAL IMPACT TOUCHING ANOTHER PERSON’S HAND WILL HAVE BOTH ON THE OTHER PERSON AND ON HIMSELF.
      • 3. THE HAND-TOUCHING EXAMPLE WILL DO HERE TOO: THERE ARE SOME THINGS PEOPLE COME TO KNOW ONLY AS A CONSEQUENCE OF HAVING BEEN TREATED AS HUMAN BEINGS BY OTHER HUMAN BEINGS.
      • 4. THE KINDS OF KNOWLEDGE THAT APPEAR SUPERFICIALLY TO BE COMMUNICABLE FROM ONE HUMAN BEING TO ANOTHER IN LANGUAGE ALONE ARE IN FACT NOT ALTOGETHER SO COMMUNICABLE.
    • THANK YOU!!!