2. Objective
This lecture will enable you to:
• Represent simple facts in the language of propositional logic
• Interpret a propositional logic statement
• Compute the meaning of a compound proposition
3. Knowledge Representation
• Knowledge is a theoretical or practical understanding of a subject or a domain. Knowledge is also the sum of what is currently known.
• Knowledge is "the sum of what is known: the body of truth, information, and principles acquired by mankind." Or, "Knowledge is what I know, Information is what we know."
• There are many other definitions, such as:
4. Definition of Knowledge
• Knowledge is "information combined with experience, context, interpretation, and reflection. It is a high-value form of information that is ready to apply to decisions and actions." (T. Davenport et al., 1998)
• Knowledge is "human expertise stored in a person's mind, gained through experience, and interaction with the person's environment." (Sunasee and Sewery, 2002)
• Knowledge is "information evaluated and organized by the human mind so that it can be used purposefully, e.g., conclusions or explanations." (Rousa, 2002)
5. Knowledge Representation
• Knowledge consists of information that has been:
– interpreted,
– categorised,
– applied, experienced and revised.
• In general, knowledge is more than just data; it consists of facts, ideas, beliefs, heuristics, associations, rules, abstractions, relationships and customs.
6. Research literature classifies knowledge as:
• Classification-based knowledge – the ability to classify information
• Decision-oriented knowledge – choosing the best option
• Descriptive knowledge – the state of some world (heuristic)
• Procedural knowledge – how to do something
• Reasoning knowledge – what conclusion is valid in what situation
• Assimilative knowledge – what its impact is
7. Knowledge Representation
• Knowledge representation (KR) is the study of how knowledge about the world can be represented and what kinds of reasoning can be done with that knowledge. Knowledge representation is the method used to encode knowledge in Intelligent Systems.
• Since knowledge is used to achieve intelligent behavior, the fundamental goal of knowledge representation is to represent knowledge in a manner that facilitates inferencing (i.e. drawing conclusions) from it. A successful representation of some knowledge must, then, be in a form that is understandable by humans, and must cause the system using the knowledge to behave as if it knows it.
8. • Some issues that arise in knowledge representation from an AI perspective are:
– How do people represent knowledge?
– What is the nature of knowledge and how do we represent it?
– Should a representation scheme deal with a particular domain, or should it be general purpose?
– How expressive is a representation scheme or formal language?
– Should the scheme be declarative or procedural?
10. Properties for Knowledge
Representation Systems
• Representational Adequacy
– the ability to represent the required knowledge;
• Inferential Adequacy
– the ability to manipulate the knowledge represented to produce
new knowledge corresponding to that inferred from the
original;
• Inferential Efficiency
– the ability to direct the inferential mechanisms into the most
productive directions by storing appropriate guides;
• Acquisitional Efficiency
– the ability to acquire new knowledge using automatic methods
wherever possible rather than reliance on human intervention.
12. How can we represent knowledge in a
machine ?
• We need a language to represent domain
knowledge
• There must be a method to use this
knowledge
• Inference Mechanism
• Syntax and Semantics of a language
– Laughs (Suman) == ??
– Likes ( Sunita, Shanta) == ??
13. Logic is a Formal Language
• Propositional Logic
– Serena is intelligent (a proposition)
– Serena is hardworking
– If Serena is intelligent and Serena is hardworking
then Serena scores high marks
14. Elements of Propositional Logic
• Anil is Intelligent
• Anil is hardworking
• Objects and Relations or Functions
Object: Anil; Properties/Relations: Intelligent, hardworking
15. Intelligent (Anil) == Anil is intelligent
Hardworking (Anil) == Anil is hardworking
Propositions
Also Intelligent-Anil can be a proposition
A proposition (statement) can be True or False
16. Towards the Syntax
• Let P stand for intelligent (Anil)
• Let Q stand for Hardworking (Anil)
• What does P Λ Q ( P and Q) mean ?
• What does P V Q ( P or Q) mean ?
• P Λ Q, P V Q are compound propositions
17. Syntactic Elements of Propositional
Logic
• Vocabulary
– A set of propositional symbols (P , Q, R etc.) each
of which can be True or False
– Set of logical operators
Λ (AND), V (OR), ¬ (NOT), → (IMPLIES)
Often parentheses () are used for grouping
– There are two special symbols
TRUE (T) and FALSE(F) – these are logical
constants
18. How to form propositional sentences ?
• Each symbol ( a proposition or a constant) is a
sentence
• If P is a sentence and Q is a sentence
• Then
– (P) is a sentence
– P Λ Q is a sentence
– P V Q is a sentence
– ¬P is a sentence
– P → Q is a sentence
– Nothing else is a sentence
Sentences are also
called well formed
formulae (wff)
19. Example wffs
• P
• True
• P Λ Q
• (P V Q) → R
• (P Λ Q) V R → S
• ¬ (P V Q)
• ¬ (P V Q) → R Λ S
20. Implies
• P → Q
• If P is true then Q is true
• If it rains then the roads are wet
• What about –
• If the roads are wet then it rains ????
21. Equivalence
• P ↔ Q
• Example ?
• If two sides of a triangle are equal then two
base angles of the triangle are equal.
• Can be represented as two sentences
• P → Q and Q → P
22. What does a wff mean -- Semantics
• Interpretation in a world
• When we interpret a sentence in a world we
assign meaning to it and it evaluates to either
True or False
23. P: The child can write
Interpreted in the Nursery world: F; in the Class II world: T
The child can speak
Interpreted in the Nursery world: T; in the Class II world: T
24. P: Suman is intelligent
Q: Serena is Diligent
Interpreted in the Class II world: T, T; in the Class III world: F, F
25. So how do we get the meaning ?
• Remember: Sentences can be compound
propositions
• Interpret each atomic proposition in the same
world
• Assign Truth values to each interpretation
• Compute the truth value of the compound
proposition
26. Example
• P: likes (Suman, Shanta)
• Q: knows (Subash, Shraddha)
• World: Suman and Shanta are friends and Subash
and Shraddha are known to each other
• P = T, Q = T
• P Λ Q = T
• P Λ (¬ Q) = F
27. Validity of a sentence
• If a propositional sentence is true under all
possible interpretations, it is VALID
• Tautology
– P V ¬ P is always true
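A quick way to see what validity means operationally: enumerate every interpretation and check the sentence in each. The following is a minimal Python sketch of this idea (my own illustration, not part of the original slides; the helper name is_valid is made up):

from itertools import product

def is_valid(formula, symbols):
    """A sentence is valid (a tautology) if it is true under every interpretation."""
    for values in product([False, True], repeat=len(symbols)):
        interpretation = dict(zip(symbols, values))
        if not formula(interpretation):
            return False
    return True

# P V ¬P is true under every interpretation, hence valid.
print(is_valid(lambda i: i["P"] or not i["P"], ["P"]))    # True
# P Λ Q is satisfiable but not valid.
print(is_valid(lambda i: i["P"] and i["Q"], ["P", "Q"]))  # False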
28. Quiz
Express the following English Statements in the
language of propositional logic:
1. It rains in July
2. The book is not costly
3. If it rains today and one does not carry
umbrella he will be drenched
29. Quiz
• If P is true and Q is true, then are the
following true or false ?
1. P → Q
2. (¬ P V Q) → Q
3. (¬ P V Q) → P
4. P V ¬ P ↔ T
31. Quiz Answer
It rains in July
Rains (July)
¬ Rains (November) etc.
If it rains today and one does not carry umbrella
he will be drenched
• Rains (Today) Λ ¬ Carry (Umbrella) → Get_Drenched
• Rains (Today) Λ ¬ Carry (Umbrella, Tom) → Get_Drenched (Tom)
32. Objective
• In this lecture you will learn to:
• Infer the truth value of a proposition
• Reason towards new facts, given a set of
propositions
• Prove a proposition given a set of
propositional facts
33. Procedure to derive Truth Value
Truth Table

P  Q  P Λ Q
F  F    F
F  T    F
T  F    F
T  T    T

P  Q  P V Q
F  F    F
F  T    T
T  F    T
T  T    T

P  Q  P → Q
F  F    T
F  T    T
T  F    F
T  T    T
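The same tables can be generated mechanically by enumerating interpretations. A small Python sketch (an illustration added here, not from the slides), which encodes P → Q as ¬P V Q:

from itertools import product

def tf(b):
    return "T" if b else "F"

# Truth tables for P Λ Q, P V Q and P → Q over all four interpretations.
connectives = {
    "P Λ Q": lambda p, q: p and q,
    "P V Q": lambda p, q: p or q,
    "P → Q": lambda p, q: (not p) or q,   # implication rewritten as ¬P V Q
}
for name, op in connectives.items():
    print("P Q " + name)
    for p, q in product([False, True], repeat=2):
        print(tf(p), tf(q), tf(op(p, q)))
    print()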
34. (¬P V Q) → (P Λ Q)

P  Q  ¬P  ¬P V Q  P Λ Q  (¬P V Q) → (P Λ Q)
F  F   T     T      F            F
F  T   T     T      F            F
T  F   F     F      F            T
T  T   F     T      T            T
35. De Morgan’s Theorem
¬(P Λ Q) = ¬P V ¬Q
¬(P V Q) = ¬P Λ ¬Q

P  Q  P V Q  ¬(P V Q)  ¬P  ¬Q  ¬P Λ ¬Q
F  F    F       T       T   T     T
F  T    T       F       T   F     F
T  F    T       F       F   T     F
T  T    T       F       F   F     F
36. Retry the Earlier Quiz
• If P is true and Q is true, then are the
following true or false ?
1. P → Q

P  Q  P → Q
F  F    T
F  T    T
T  F    F
T  T    T

With P = T and Q = T, P → Q is T.
37. Retry the Earlier Quiz
3. (¬ P V Q) → P: let S be the entire statement

P  ¬P  Q  S
T   F  T  T
38. Reasoning
P: It is the month of July
Q: It rains
R: P → Q
If it is the month of July then it rains
It is the month of July
Conclude: It rains
40. Is Modus Ponens Correct Inference
Rule ??
P → Q can be written as ¬ P V Q; conjoined
with P we can write
P Λ (¬ P V Q) = (P Λ ¬ P) V (P Λ Q)
= F V (P Λ Q)
= P Λ Q, which gives Q
41. Modus Ponens
• Thus irrespective of meaning Modus Ponens
allows us to infer the truth of Q
• Modus Ponens is an inference rule that allows
us to deduce the truth of a consequent
depending on the truth of the antecedents
42. Inference rules can be applied mechanically
Other rules
If P and Q then P
If P then P or Q
If Not(Not(P)) then P
Chain Rule:
If P then Q, and if Q then R,
lead to: If P then R
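To emphasise that such rules really can be applied mechanically, here is a small Python sketch (my own illustration; encoding each rule as an (antecedent, consequent) pair is an assumption) that forward-chains with modus ponens until nothing new can be added:

def forward_chain(facts, rules):
    """Repeatedly apply modus ponens: from P and "if P then Q", add Q."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedent, consequent in rules:
            if antecedent in facts and consequent not in facts:
                facts.add(consequent)          # modus ponens
                changed = True
    return facts

# "If July then Rains", "If Rains then RoadsWet", and the fact "July".
rules = [("July", "Rains"), ("Rains", "RoadsWet")]
print(forward_chain({"July"}, rules))   # {'July', 'Rains', 'RoadsWet'} – the chain rule emerges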
43. Satisfiability
• Remember: An interpretation is a mapping to a
world
• A sentence is SATISFIABLE by an interpretation if,
under that interpretation, the sentence evaluates to
TRUE
• If no interpretation makes all the sentences in a set
TRUE, then the set of sentences is UNSATISFIABLE or
INCONSISTENT
44. Entailment
If every interpretation that satisfies a set of sentences S also makes a
candidate sentence TRUE, then we say S logically entails the candidate sentence.
45. • If a sentence s1 has the value True under all
interpretations that make all sentences in a set
of sentences S true, then
S ⊨ s1
s1 logically follows from S
s1 is a logical consequence of S
S logically entails s1
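Entailment can also be checked by brute force over interpretations: S entails s1 exactly when every interpretation that satisfies all of S also satisfies s1. A minimal sketch (added here as an illustration; the function names are made up):

from itertools import product

def entails(kb, sentence, symbols):
    """S |= s1: every model of all sentences in S is also a model of s1."""
    for values in product([False, True], repeat=len(symbols)):
        i = dict(zip(symbols, values))
        if all(s(i) for s in kb) and not sentence(i):
            return False
    return True

# Does {P → Q, P} entail Q?  (Modus ponens viewed as entailment.)
kb = [lambda i: (not i["P"]) or i["Q"], lambda i: i["P"]]
print(entails(kb, lambda i: i["Q"], ["P", "Q"]))   # True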
47. Clause: A Special Form
• Literal – A single proposition or its negation
P, ¬ P
• A clause is a disjunction of literals
P V Q V ¬ R
48. Converting a compound proposition to
the clausal form
• Consider the sentence (wff)
¬ (A → B) V ( C → A)
1. Eliminate the implication signs
¬ (¬ A V B) V (¬ C V A)
2. Eliminate double negation and reduce the scope
of “not” signs (De Morgan’s law)
(A Λ ¬ B) V (¬ C V A)
49. Converting a compound proposition to
the clausal form
3. Convert to conjunctive normal form by using
distributive and associative laws
(A Λ ¬ B) V (¬ C V A)
(A V ¬ C V A) Λ (¬ B V ¬ C V A)
(A V ¬ C) Λ (¬ B V ¬ C V A)
4. Get the set of clauses
Set of clauses
(A V ¬ C)
(¬ B V ¬ C V A)
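If a symbolic-logic library is available, the hand conversion above can be cross-checked. A sketch using sympy (assuming sympy is installed; the printed form may differ syntactically from the hand-derived clauses but should be logically equivalent):

from sympy import symbols, Implies, Not, Or
from sympy.logic.boolalg import to_cnf

A, B, C = symbols("A B C")
wff = Or(Not(Implies(A, B)), Implies(C, A))   # ¬(A → B) V (C → A)
print(to_cnf(wff))   # expected to be equivalent to (A V ¬C) Λ (A V ¬B V ¬C)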
50. Resolution – A technique of Inference
• A sound inference mechanism
Principle:
Suppose x is a literal and s1 and s2 are clauses
(disjunctions of literals)
If we have (x V s1) Λ (¬x V s2)
Then we can infer s1 V s2
Here s1 V s2 is the resolvent,
and x is the literal resolved upon
51. An Example
• If a triangle is equilateral then it is isosceles
• If a triangle is isosceles then two sides AB and
AC are equal
• If AB and AC are equal then angle B and angle
C are equal
• ABC is an equilateral triangle
• Angle B is equal to angle C – Prove
52. Proving by Resolution
• If a triangle is equilateral then it is isosceles
Equilateral (ABC) → Isosceles (ABC)
• If a triangle is isosceles then two sides AB and AC
are equal
Isosceles (ABC) → Equal (AB, AC)
• If AB and AC are equal then angle B and angle C
are equal
Equal (AB, AC) → Equal (B, C)
• ABC is an equilateral triangle
Equilateral (ABC)
53. Proving by Resolution
• Clausal Form
1. Equilateral (ABC) → Isosceles (ABC)
¬ Equilateral (ABC) V Isosceles (ABC)
2. Isosceles (ABC) → Equal (AB, AC)
¬ Isosceles (ABC) V Equal (AB, AC)
3. Equal (AB, AC) → Equal (B, C)
¬ Equal (AB, AC) V Equal (B, C)
4. Equilateral (ABC)
54. Proof by refutation
• To prove
Angle B is equal to Angle C
Equal (B,C)
Let us assume the negation
Not Equal (B,C)
¬ Equal (B,C)
and try to refute it (derive the null clause)
55. Resolution steps
Clauses: ¬ Equilateral (ABC) V Isosceles (ABC); ¬ Isosceles (ABC) V Equal (AB, AC);
¬ Equal (AB, AC) V Equal (B, C); Equilateral (ABC); ¬ Equal (B,C)
1. Resolve ¬ Equal (B,C) with ¬ Equal (AB, AC) V Equal (B, C) to get ¬ Equal (AB, AC)
2. Resolve ¬ Equal (AB, AC) with ¬ Isosceles (ABC) V Equal (AB, AC) to get ¬ Isosceles (ABC)
3. Resolve ¬ Isosceles (ABC) with ¬ Equilateral (ABC) V Isosceles (ABC) to get ¬ Equilateral (ABC)
4. Resolve ¬ Equilateral (ABC) with Equilateral (ABC) to get the Null Clause
56. Procedure for Resolution
• Convert given propositions into clausal form
• Convert the negation of the sentence to be
proved into clausal form
• Combine the clauses into a set
• Iteratively apply resolution to the set and add the
resolvent to the set
• Continue until no further resolvents can be
obtained or a null clause is obtained
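The whole procedure fits in a few lines of Python. The sketch below is my own illustration (the encoding of a clause as a frozenset of string literals, with "~" marking negation, is an assumption, not part of the slides):

from itertools import product

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses (frozensets of literals)."""
    return [(c1 - {lit}) | (c2 - {negate(lit)}) for lit in c1 if negate(lit) in c2]

def refute(clauses):
    """Saturate the set with resolvents; True iff the null clause is derived."""
    clauses = set(clauses)
    while True:
        new = set()
        for a, b in product(clauses, repeat=2):
            for r in resolve(a, b):
                if not r:
                    return True          # null clause obtained
                new.add(frozenset(r))
        if new <= clauses:
            return False                 # nothing new: no refutation
        clauses |= new

# The triangle example: prove Equal(B,C) by refuting its negation.
kb = [frozenset({"~Equilateral", "Isosceles"}),
      frozenset({"~Isosceles", "EqualABAC"}),
      frozenset({"~EqualABAC", "EqualBC"}),
      frozenset({"Equilateral"}),
      frozenset({"~EqualBC"})]           # negated goal
print(refute(kb))                        # True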
57. Quiz
• Consider the following sentences
1. Mammals drink milk
2. Man is mortal
3. Man is a mammal
4. Tom is a man
5. Prove Tom drink(s) milk
6. Prove Tom is mortal
58. Quiz
1. Represent all the sentences in clausal form
2. Prove (5) and (6) using modus ponens
3. Prove (5) and (6) using resolution
60. Quiz Answers
1. Mammals drink milk
2. Man is mortal
3. Man is a mammal
4. Tom is a man
1. Prove Tom drink(s) milk
2. Prove Tom is mortal
62. • Applying modus ponens on R and S we get
mammal(Tom) ……… 5
• Applying modus ponens on 5 and 1 we get
drink(Tom, Milk)
64. Proof by Resolution
P: mammal (Tom) → drink (Tom, Milk)
¬ mammal (Tom) V drink (Tom, Milk)
Q: man (Tom) → mortal (Tom)
¬ man (Tom) V mortal (Tom)
R: man (Tom) → mammal (Tom)
¬ man (Tom) V mammal (Tom)
S: man(Tom)
Goal: drink(Tom, Milk)
To disprove: ¬ drink(Tom, Milk)
65. Resolution steps
Clauses: P: ¬ mammal (Tom) V drink (Tom, Milk); Q: ¬ man (Tom) V mortal (Tom);
R: ¬ man (Tom) V mammal (Tom); S: man (Tom); negated goal: ¬ drink (Tom, Milk)
1. Resolve ¬ drink (Tom, Milk) with P to get ¬ mammal (Tom)
2. Resolve ¬ mammal (Tom) with R to get ¬ man (Tom)
3. Resolve ¬ man (Tom) with S to get the Null Clause
66. Objective
This lecture will enable you to
• Formulate more types of sentences in logic
• Write correct predicate logic formulae
67. Limitation of Propositional Logic
• Consider the following argument
– All dogs are faithful
– Tommy is a dog
– Therefore, Tommy is faithful
• How to represent and infer this in
propositional logic?
68. Limitation of Propositional Logic
p: all dogs are faithful
q: Tommy is a dog
But can we infer “Tommy is faithful” from p Λ q?
No! We cannot infer this in propositional logic.
69. More Scenarios
• Tom is a hardworking student
Hardworking(Tom)
• Tom is an intelligent student
Intelligent(Tom)
• If Tom is hardworking and Tom is intelligent
Then Tom scores high marks
• Hardworking(Tom) Λ Intelligent (Tom) →
Scores_High_Marks(Tom)
70. What about John and Jill ?
If we could write instead
All students who are hardworking and intelligent
score high marks!
For all x such that x is a student and x is intelligent
and x is hardworking then x scores high marks
71. The Problem of Infinite Model
• In general, propositional logic can deal with
only a finite number of propositions.
• If there are only three dogs Tommy, Jimmy,
and Laika, then
T: Tommy is faithful
J: Jimmy is faithful
L: Laika is faithful
All dogs are faithful: T Λ J Λ L
• What if there is an infinite number of dogs?
72. First-Order Logic / Predicate Logic
• First-order logic or predicate logic is a
generalization of propositional logic that
allows us to express and infer arguments over
infinite models, such as
– All men are mortal
– Some birds cannot fly
– At least one planet has life on it
73. Syntax of FOL
• The syntax of first-order logic can be defined
in terms of
–Terms
–Predicate
–Quantifiers
74. Term
• A term denotes some object other than True
or False
Tommy is a dog
Tommy = Term
All men are mortal
Men = Term
75. Terms: Constant & Variable
• A constant of type W is a name that denotes a
particular object in a set W
– Example: 5, Tommy etc.
• A variable of type W is a name that can
denote any element in the set W
– Example: x ∈ N denotes a natural number
d denotes the name of a dog
76. Terms: Functions
• A functional term of arity n takes n objects of
types W1 to Wn as inputs and returns an
object of type W.
f(W1, W2, …, Wn)
Example: plus(3, 4) = 7, where plus(3, 4) is a
functional term and 7 is a constant term
77. Functions: Example
• Let plus be a function that takes two
arguments of type Natural Number and
returns a Natural number
• Valid functional terms:
plus (2, 3) plus(5, plus(7,3))
plus(plus(100,plus(1,6)),plus(3,3))
• Invalid functional terms:
plus(0, -1) plus(1.2, 3.1)
79. Predicates
• Predicates are like functions except that their
return type is true or false.
• Example:
– gt ( x , y ) is true iff x > y
– Here gt is a predicate symbol that takes two
arguments of type natural number
– gt (3, 4 ) is a valid predicate but gt (3, -4) is not
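A computational way to see the difference (a sketch of my own, mirroring the slide's types; the natural-number check is an assumption): a functional term maps objects to an object, while a predicate maps objects to True or False:

def plus(x, y):
    """A functional term over natural numbers: returns an object, not a truth value."""
    if not (isinstance(x, int) and isinstance(y, int) and x >= 0 and y >= 0):
        raise TypeError("plus is only defined for natural numbers")
    return x + y

def gt(x, y):
    """A predicate: returns True or False (gt(x, y) is true iff x > y)."""
    return x > y

print(plus(2, 3), plus(5, plus(7, 3)))   # valid functional terms: 5 15
print(gt(3, 4))                           # a predicate applied to constants: False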
80. Types of Predicates
• A predicate with no variable is a proposition
– Tommy is a dog
• A predicate with one variable is called a
property
– dog(x) is true iff x is a dog.
– mortal(y) is true iff y is mortal.
81. Formulation of Predicates
• Let P( x , y , …) and Q( x , y , …) be two
predicates.
• Then so are
P V Q
P Λ Q
¬ P
P → Q
82. Predicate Examples
• If x is a man then x is mortal
man (x) → mortal (x)
¬ man (x) V mortal (x)
• If n is a natural number, then n is either even
or odd.
natural (n) → even (n) V odd (n)
83. Quantifiers
• There are two basic quantifiers in FOL
– ∀ “For all” – Universal quantifier
– ∃ “There exists” – Existential quantifier
84. Universal quantifier
• ∀x P(x) : P(x) is true for any element that we
choose from the set
• This sentence is true if every value of x
satisfies P(x)
86. Existential quantifier
• ∃x holiday (x)
• This sentence is true if at least one x satisfies
holiday(x)
• Here x ranges over the days of the week {Saturday,
Sunday, Monday, Tuesday, Wednesday, Thursday, Friday}
87. Universal Quantifiers
• All dogs are faithful
– faithful(x) : x is faithful
– dog(x): x is a dog
– ∀x (dog (x) → faithful (x))
• All birds cannot fly
– fly(x): x can fly
– bird(x): x is a bird
– ¬ (∀x (bird (x) → fly (x)))
89. Existential Quantifiers
• At least one planet has life on it
– Planet(x): x is a planet
– haslife(x): x has life on it
– ∃x (planet (x) Λ haslife(x))
90. Existential Quantifiers
• All birds cannot fly There exists a bird that
cannot fly
• Fly (x): x can fly
• Bird (x): x is a bird
• ∃x (Bird (x) Λ ¬ fly (x))
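Over a finite domain, the two quantifiers can be evaluated directly: ∀ behaves like Python's all() and ∃ like any(). A small sketch (the domain and the facts below are made up purely for illustration):

# A tiny finite domain with some invented facts.
animals     = ["Tommy", "Laika", "Tweety", "Pingu"]
is_dog      = {"Tommy": True,  "Laika": True,  "Tweety": False, "Pingu": False}
is_bird     = {"Tommy": False, "Laika": False, "Tweety": True,  "Pingu": True}
is_faithful = {"Tommy": True,  "Laika": True,  "Tweety": False, "Pingu": False}
can_fly     = {"Tommy": False, "Laika": False, "Tweety": True,  "Pingu": False}

# ∀x (dog(x) → faithful(x)): universal quantification as all() over the domain.
print(all((not is_dog[x]) or is_faithful[x] for x in animals))   # True

# ∃x (bird(x) Λ ¬fly(x)): existential quantification as any() over the domain.
print(any(is_bird[x] and not can_fly[x] for x in animals))       # True (Pingu)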
91. Duality of Quantifiers
All men are mortal
No man is immortal
There exist birds that can fly.
It is not the case that all birds cannot fly.
In general: ∀x P(x) ≡ ¬ ∃x ¬P(x) and ∃x P(x) ≡ ¬ ∀x ¬P(x)
92. Sentences
• A predicate is a sentence
• If sen1 and sen2 are sentences and x is a variable,
then
• (sen1), ¬sen1, ∃x sen1, ∀x sen1, sen1 Λ sen2, sen1 V
sen2, sen1 → sen2 are sentences
• Nothing else is a sentence
93. Quiz / Exercises
• Some dogs bark
• All dogs have four legs
• All barking dogs are irritating
• No dogs purr
• Fathers are male parents with children
• Students are people who are enrolled in
courses
94. Examples of Sentences
Birthday ( x , y ) – x celebrates birthday on date y
∀ y ∃ x Birthday ( x , y ) –
For all dates, there exists a person who
celebrates his/her Birthday on that date.
That is - “everyday someone celebrates
his/her birthday”
95. Examples
Brother ( x , y ) – x is y’s brother
Loves ( x , y ) – x loves y
∀ x ∀ y Brother ( x , y) → Loves ( x , y)
Everyone loves (all of) his/her brothers.
Let m(x) represent mother of x then
“everyone loves his/her mother” is
∀ x Loves (x, m(x))
96. Examples
• Any number is the successor of its
predecessor
• succ (x), pred (x)
• Equal ( x , y )
∀ x Equal ( x , succ (pred (x)))
97. Alternative Representation
• The previous example can be represented
succinctly as
• ∀ x (succ (pred (x)) = x)
• This is not allowed in plain predicate logic
(it requires FOL with equality, introduced next)
98. FOL with Equality
• In FOL with equality, we are allowed to use
the equality sign (=) between two functions.
• This is just for representational ease.
• We modify the definition of sentence to
include equality as
term = term is also a sentence
99. Quiz Revisited
• Some dogs bark
• ∃ x (dog(x) Λ bark(x))
• All dogs have four legs
• ∀ x (dog(x) → have_four_legs(x))
• ∀ x (dog(x) → legs(x,4))
• No dogs purr
• ¬ ∃ x (dog(x) Λ purr(x))
100. • Fathers are male parents with children
• ∀ x (father(x) → male(x) Λ has_children(x))
101. Inference Rule
• Universal Elimination
∀ x Likes(x, flower)
Substituting x by Shirin gives
Likes(Shirin , flower)
The substitution should be done by a constant
term
102. Inference Rule
• Existential Elimination (Skolemization)
∃ x Likes(x, flower)
Likes(Person, flower)
As long as Person is a new constant that does not already appear in the knowledge base
• Existential introduction
Likes (shahid, flower)
Can be written as
∃ x Likes(x, flower)
103. Reasoning in FOL
• Consider the following problem:
If a perfect square is divisible by a prime p,
then it is also divisible by square of p. Every
perfect square is divisible by some prime.
36 is a perfect square.
Does there exist a prime q such that the square of
q divides 36?
104. Representation in FOL
• If a perfect square is divisible by a prime p,
then it is also divisible by square of p.
∀ x ,y ( perfect_sq(x) Λ prime(y) Λ divides(x, y)
→ divides (x, square(y)) )
• Every perfect square is divisible by some
prime.
∀ x ∃ y (perfect_sq(x) Λ prime(y) Λ divides(x,
y))
105. Representation in FOL
• 36 is a perfect square.
perfect_sq(36)
• Does there exist a prime q such that the
square of q divides 36 ?
∃ y (prime(y) Λ divides(36, square(y)))
106. The knowledge base
1. ∀ x ,y ( perfect_sq(x) Λ prime(y) Λ divides(x,
y) → divides (x, square(y)) )
2. ∀ x ∃ y (perfect_sq(x) Λ prime(y) Λ divides(x,
y))
3. perfect_sq(36)
107. Inference
• From 2 and Universal Elimination
(4) ∃ y (perfect_sq(36) Λ prime (y) Λ
divides(36, y))
• From 4 and Existential Elimination
(5) perfect_sq(36) Λ prime (P) Λ divides(36, P)
• From (1) and (5)
(6) divides (36, square(P))
108. Inference
• From (5) and (6)
(7) prime (P) Λ divides(36, square(P))
• From (7) and Existential Introduction
∃ y prime (y) Λ divides(36, square(y))
109. Horn Sentences
• Atomic sentence
perfect_sq(36)
• Implication with a conjunction of atomic
sentences on the left and a single atom on the
right
∀ x ,y ( perfect_sq(x) Λ prime(y) Λ divides(x, y)
→ divides (x, square(y)) )
• No existential Quantifier
110. Conversion to Horn Sentences
• Existential Quantifiers can be removed using
Existential Elimination (Skolemization)
– If the existential quantifier is outside any universal
quantifier, a Skolem constant is introduced. E.g. ∃
y prime (y) can be written as prime(P), where P is
a Skolem constant
– Otherwise a Skolem function is introduced. E.g.
∀ x ∃ y ( prime(y) Λ divides(x, y) ) becomes
∀ x ( prime(PD(x)) Λ divides(x, PD(x)) ), where PD(x) is a
Skolem function
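The two Skolemization cases can be mechanised on a toy formula representation. The sketch below is my own illustration (the nested-tuple encoding and the SK1, SK2, … names are assumptions, not the slides' notation); an existential with no enclosing ∀ gets a Skolem constant, otherwise a Skolem function of the enclosing universal variables:

from itertools import count

_fresh = count(1)

def substitute(term, var, value):
    """Replace every occurrence of variable `var` in a nested-tuple formula/term."""
    if term == var:
        return value
    if isinstance(term, tuple):
        return tuple(substitute(t, var, value) for t in term)
    return term

def skolemize(formula, universals=()):
    """Remove ∃ from a prenex formula written as nested tuples such as ('forall', 'x', body)."""
    if formula[0] == "forall":
        _, var, body = formula
        return ("forall", var, skolemize(body, universals + (var,)))
    if formula[0] == "exists":
        _, var, body = formula
        name = "SK%d" % next(_fresh)
        sk = (name,) + universals if universals else name   # function of the ∀ vars, or a constant
        return skolemize(substitute(body, var, sk), universals)
    return formula   # quantifier-free matrix: left unchanged

# ∃y prime(y)  ->  prime(SK1)   (Skolem constant)
print(skolemize(("exists", "y", ("prime", "y"))))
# ∀x ∃y (prime(y) Λ divides(x, y))  ->  ∀x (prime(SK2(x)) Λ divides(x, SK2(x)))
print(skolemize(("forall", "x",
                 ("exists", "y", ("and", ("prime", "y"), ("divides", "x", "y"))))))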
111. Conversion to Horn Sentences
• And- Elimination
• Prime(P) Λ divides( x , P) can be written as two
clauses
Prime(P)
divides ( x , P )
112. Substitution
• It replaces variables with terms (e.g. constants).
• SUBST({x/49, y/7}, Divides( x, y)) = Divides (49,
7)
113. Unification
• It is the process of finding a substitution that
makes two atomic sentences identical.
• UNIFY(Prime(7), Prime(x)) = {x/7}
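A minimal Python version of SUBST and UNIFY for atomic sentences (my own sketch; atoms are tuples such as ("Divides", "x", "y"), lowercase argument strings are variables, everything else is a constant; no nested function terms and no occurs check, which is enough for the examples here):

def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def subst(s, atom):
    """Apply a substitution {variable: term} to an atomic sentence."""
    return tuple(s.get(t, t) for t in atom)

def unify(a1, a2):
    """Find a substitution making two atomic sentences identical, or None."""
    if a1[0] != a2[0] or len(a1) != len(a2):   # different predicate symbol or arity
        return None
    s = {}
    for t1, t2 in zip(a1[1:], a2[1:]):
        t1, t2 = s.get(t1, t1), s.get(t2, t2)  # follow bindings already made
        if t1 == t2:
            continue
        if is_variable(t1):
            s[t1] = t2
        elif is_variable(t2):
            s[t2] = t1
        else:
            return None                         # two distinct constants cannot unify
    return s

print(subst({"x": "49", "y": "7"}, ("Divides", "x", "y")))   # ('Divides', '49', '7')
print(unify(("Prime", "7"), ("Prime", "x")))                  # {'x': '7'}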
116. An Example
• If a triangle is equilateral then it is isosceles
• If a triangle is isosceles then two sides AB and
AC are equal
• If AB and AC are equal then angle B and angle
C are equal
• ABC is an equilateral triangle
• Angle B is equal to angle C – Prove
117. Proving by Resolution
• If a triangle is equilateral then it is isosceles
Equilateral (ABC) → Isosceles (ABC)
• If a triangle is isosceles then two sides AB and AC
are equal
Isosceles (ABC) → Equal (AB, AC)
• If AB and AC are equal then angle B and angle C
are equal
Equal (AB, AC) → Equal (B, C)
• ABC is an equilateral triangle
Equilateral (ABC)
118. Proving by Resolution
• Clausal Form
1. Equilateral (ABC) → Isosceles (ABC)
¬ Equilateral (ABC) V Isosceles (ABC)
2. Isosceles (ABC) → Equal (AB, AC)
¬ Isosceles (ABC) V Equal (AB, AC)
3. Equal (AB, AC) → Equal (B, C)
¬ Equal (AB, AC) V Equal (B, C)
4. Equilateral (ABC)
119. Proof by refutation
• To prove
Angle B is equal to Angle C
Equal (B,C)
Let us assume the negation
Not Equal (B,C)
¬ Equal (B,C)
and try to refute it (derive the null clause)
120. Resolution steps
Clauses: ¬ Equilateral (ABC) V Isosceles (ABC); ¬ Isosceles (ABC) V Equal (AB, AC);
¬ Equal (AB, AC) V Equal (B, C); Equilateral (ABC); ¬ Equal (B,C)
1. Resolve ¬ Equal (B,C) with ¬ Equal (AB, AC) V Equal (B, C) to get ¬ Equal (AB, AC)
2. Resolve ¬ Equal (AB, AC) with ¬ Isosceles (ABC) V Equal (AB, AC) to get ¬ Isosceles (ABC)
3. Resolve ¬ Isosceles (ABC) with ¬ Equilateral (ABC) V Isosceles (ABC) to get ¬ Equilateral (ABC)
4. Resolve ¬ Equilateral (ABC) with Equilateral (ABC) to get the Null Clause
121. Procedure for Resolution
• Convert given propositions into clausal form
• Convert the negation of the sentence to be proved into
clausal form
• Combine the clauses into a set
• Iteratively apply resolution to the set and add the resolvent
to the set
• Continue until no further resolvents can be obtained or a
null clause is obtained
122. A Few Statements
• All people who are graduating are happy.
• All happy people smile.
• Someone is graduating.
• Is someone smiling?
(Conclusion)
123. Solving the problem
• We intend to code the problem in predicate
calculus.
• Use resolution refutation to solve the problem
• Solving = checking whether the conclusion follows
from the given set of sentences.
124. Selecting the Predicates
• Graduating(x): x is graduating
• Happy(x): x is happy
• Smiling(x): x is smiling
125. Encoding sentences in Predicate Logic
• All people who are graduating are happy
– ∀ x [graduating(x) → happy(x)]
• All happy people smile
– ∀ x [happy(x) → smiling(x)]
• Someone is graduating
– ∃ x graduating(x)
• Is someone smiling
– ∃ x smiling(x)
126. Predicates
1. ∀ x [graduating(x) → happy(x)]
2. ∀ x [happy(x) → smiling(x)]
3. ∃ x graduating(x)
4. ¬ ∃ x smiling(x)
(Negating the conclusion)
127. Converting to Clausal Form
• Step 1: Eliminate →
1. ∀ x ¬graduating(x) V happy(x)
2. ∀ x ¬ happy(x) V smiling(x)
3. ∃ x graduating(x)
4. ¬ ∃ x smiling(x)
128. Converting to Canonical / Normal
Form
• Step 2: Reduce the scope of negation
1. ∀ x ¬graduating(x) V happy(x)
2. ∀ x ¬ happy(x) V smiling(x)
3. ∃ x graduating(x)
4. ∀ x ¬ smiling(x)
129. Converting to Canonical / Normal
Form
• Step 3: Standardize variables apart
1. ∀ x ¬graduating(x) V happy(x)
2. ∀ y ¬ happy(y) V smiling(y)
3. ∃ z graduating(z)
4. ∀w ¬ smiling(w)
130. Converting to Canonical / Normal
Form
• Step 4: Move all quantifiers to the left
1. ∀ x ¬graduating(x) V happy(x)
2. ∀ y ¬ happy(y) V smiling(y)
3. ∃ z graduating(z)
4. ∀w ¬ smiling(w)
131. Converting to Canonical / Normal
Form
• Step 5: Eliminate ∃ (Skolemization)
1. ∀ x ¬graduating(x) V happy(x)
2. ∀ y ¬ happy(y) V smiling(y)
3. graduating(name1)
(name1 is the Skolemization constant)
4. ∀w ¬ smiling(w)
132. Converting to Canonical / Normal
Form
• Step 6: Drop all ∀
1. ¬graduating(x) V happy(x)
2. ¬ happy(y) V smiling(y)
3. graduating(name1)
(name1 is the Skolemization constant)
4. ¬ smiling(w)
134. Converting to Canonical Form
• Step 7: Convert to conjunct of disjunction
form
• Step 8: Make each conjunct a separate clause.
• Step 9: Standardize variables apart again.
• These steps do not change the set of clauses
any further (in the present problem)
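As a check on the whole pipeline, the four clauses just obtained can be fed to a tiny resolution prover. The sketch below is my own illustration (literals are (sign, predicate, argument) triples, variables are written with a leading "?", and the one-variable unifier is just enough for this problem):

from itertools import product

def unify_arg(a1, a2):
    """Unify two single arguments; return a substitution dict or None."""
    if a1 == a2:
        return {}
    if a1.startswith("?"):
        return {a1: a2}
    if a2.startswith("?"):
        return {a2: a1}
    return None

def resolve(c1, c2):
    """All resolvents of two clauses of (sign, predicate, argument) literals."""
    out = []
    for l1 in c1:
        for l2 in c2:
            if l1[1] == l2[1] and l1[0] != l2[0]:          # complementary literals
                theta = unify_arg(l1[2], l2[2])
                if theta is not None:
                    rest = [l for l in c1 if l != l1] + [l for l in c2 if l != l2]
                    out.append(frozenset((s, p, theta.get(a, a)) for s, p, a in rest))
    return out

def refute(clauses):
    clauses = set(clauses)
    while True:
        new = set()
        for a, b in product(clauses, repeat=2):
            for r in resolve(a, b):
                if not r:
                    return True                             # null clause derived
                new.add(r)
        if new <= clauses:
            return False
        clauses |= new

clauses = [frozenset({("-", "graduating", "?x"), ("+", "happy", "?x")}),   # 1
           frozenset({("-", "happy", "?y"), ("+", "smiling", "?y")}),      # 2
           frozenset({("+", "graduating", "name1")}),                      # 3
           frozenset({("-", "smiling", "?w")})]                            # 4 (negated conclusion)
print(refute(clauses))   # True – so someone is indeed smiling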
136. Quiz
• Solve the problem with resolution
If a perfect square is divisible by a prime p,
then it is also divisible by square of p. Every
perfect square is divisible by some prime.
36 is a perfect square.
Does there exist a prime q such that the square of
q divides 36?
138. Find the Package Example (Nilsson)
• We know that
• All packages in room 27 are smaller than those
in room 28
• Package A is either in room 27 or in room 28
• Package B is in room 27
• Package B is not smaller than Package A
• Where is Package A ?