The document discusses first-order logic. It begins by providing an example of expressing statements about Socrates being human and humans being mortal in first-order logic using predicates, variables, quantifiers and constants. It then defines the key components of first-order logic syntax, including predicates, functions, variables, terms, and well-formed formulas. It also discusses semantics, including vocabularies, structures, interpretations, and how structures can satisfy formulas. Finally, it provides an example of applying first-order logic to represent an artificial intelligence problem involving an agent navigating a grid to find gold while avoiding dangers.
Annotating Rhetorical and Argumentative Structures in Mathematical Knowledge (Christoph Lange)
This document summarizes the work Christoph Lange did at DERI from April to October 2008. It discusses Lange's background in mathematical knowledge management and his project using semantic web technologies like ontologies and annotation. At DERI, Lange learned about engineering ontologies for scientific documents and user interfaces for annotating and browsing knowledge. He expanded his ontologies to model the rhetorical and document structures of mathematical texts.
Analogy is one of the most studied representatives of a family of non-classical forms of reasoning that work across different domains and are usually taken to play a crucial role in creative thought and problem-solving. In the first part of the talk, I will briefly introduce general principles of computational analogy models (relying on a generalization-based approach to analogy-making). We will then take a closer look at Heuristic-Driven Theory Projection (HDTP) as an example of a theoretical framework and implemented system: HDTP computes analogical relations and inferences for domains represented in many-sorted first-order logic languages, applying a restricted form of higher-order anti-unification to find shared structural elements common to both domains. The presentation of the framework will be followed by a few reflections on the "cognitive plausibility" of the approach, motivated by theoretical complexity and tractability considerations.
In the second part of the talk I will discuss an application of HDTP to modeling essential parts of concept blending processes, a current "hot topic" in Cognitive Science. Here, I will sketch an analogy-inspired formal account of concept blending, developed in the European FP7-funded COINVENT (Concept Invention Theory) project, combining HDTP with mechanisms from Case-Based Reasoning.
Machine Learning Preliminaries and Math Refresher (butest)
The document is an introduction to machine learning preliminaries and mathematics. It covers general remarks about learning as a process of model building, and an overview of key concepts from probability theory and statistics needed for machine learning, such as random variables, distributions, and expectations. It also introduces linear spaces and vector spaces as mathematical structures that are important foundations for machine learning algorithms. In short, it covers the probability, statistics, and linear algebra prerequisites for machine learning.
This document discusses the computation of presuppositions and entailments from natural language text. It begins by defining presuppositions and entailments, and explaining how they can be computed using tree transformations on semantic representations. The paper then provides examples of elementary presuppositions and entailments. It describes a system that computes presuppositions and entailments while parsing sentences using an augmented transition network. The system applies tree transformations specified in the lexicon to the semantic representation to derive inferences. The paper concludes that presuppositions and entailments exhibit computational properties not shown by the general class of inferences, such as being tied to the semantic and syntactic structure of language.
This document introduces predicate logic, including predicate symbols and signatures, logical connectives like negation and conjunction, quantifiers like universal and existential, the syntax and semantics of predicate logic formulas, and some useful equivalences. Predicate logic allows representing relations between objects using predicates of varying arities and expressing properties of collections using quantifiers. Formulas in predicate logic can be built from predicate and function symbols, terms, quantifiers, and logical connectives.
The document discusses models and formalisms in logic. It introduces formal systems as consisting of an axiomatic theory, symbols for constructing formulas, grammar rules, axioms, inference rules, and properties like consistency. Propositional and predicate logic are examined, including their model theories, axiomatic theories, properties like completeness and soundness, and resolution principles. Normal forms and the resolution rule are defined as ways to deduce theorems in a formal system.
The document discusses formal systems and logic. It begins by introducing formal systems, which consist of an axiomatic theory, symbols for constructing formulas, grammar rules, axioms, inference rules, and theorems deduced from axioms and rules. Propositional and predicate logic are then covered, including their languages, semantics using interpretation functions, axiomatic theories, and properties like soundness and completeness. Resolution principles like normal forms and the resolution rule are also summarized. The document provides examples and explanations throughout.
This document discusses modal logics and formalisms. It defines modal logics as logics that add new logical constants like necessity (□) and possibility (◇) to classical logic. It describes how modal logics can be classified based on whether they are extended logics that add new well-formed formulas or deviant logics that interpret the usual logical constants differently. The document then focuses on modal logics, defining their language and providing details on their model theory using possible world semantics. It discusses truth in possible worlds and models. It also describes several axiomatic modal systems and the relationships between them, and examines the classes of models validated by different axioms.
This document discusses modal logics and formalisms. It begins by defining classical and non-classical logics, with modal logics listed as an example of an extended logic. It then covers modal logics in more detail, defining their language and model theory using possible world semantics. Models are defined as structures consisting of possible worlds related by an accessibility relation. Truth is evaluated at possible worlds based on this relation. The document also discusses axiomatic modal logics like KT and relations between main modal systems. Finally, it notes that axioms like D, T, B, 4 and 5 are not valid in the class of all standard models.
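The possible-world semantics sketched above can be illustrated in a few lines of code. The following is a minimal sketch, not taken from the summarized document: a handful of worlds, an accessibility relation, and a valuation. All names and the example model are my own.

```python
# Illustrative Kripke model: truth of "box" (necessity) and "diamond"
# (possibility) is evaluated at a world via the accessibility relation.

worlds = {"w1", "w2", "w3"}
access = {"w1": {"w2", "w3"}, "w2": {"w2"}, "w3": set()}  # accessibility relation
valuation = {"p": {"w2", "w3"}}                           # worlds where atom p holds

def box(p, world):
    """Box-p holds at `world` iff p holds at every accessible world."""
    return all(w in valuation[p] for w in access[world])

def diamond(p, world):
    """Diamond-p holds at `world` iff p holds at some accessible world."""
    return any(w in valuation[p] for w in access[world])

print(box("p", "w1"))      # True: both accessible worlds w2 and w3 make p true
print(box("p", "w3"))      # True vacuously: w3 accesses no world at all
print(diamond("p", "w3"))  # False: no accessible world makes p true
```

The vacuous truth of box-p at the dead-end world w3 is exactly why axiom D (box-p implies diamond-p) is not valid in the class of all standard models, as the summary notes.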
A Constructive Mathematics approach for NL formal grammars (Federico Gobbo)
This document discusses formalizing natural language grammars through adpositional grammars (AdGrams). AdGrams are based on cognitive linguistics concepts of trajector and landmark. The document proposes that natural language structure can be expressed through a triple involving a governor, dependent, and their relation. It provides examples analyzing phrases and sentences as either dependency-based or government-based structures based on whether the dependent or governor is the trajector. The goal of AdGrams is to formalize natural language grammars in a way that is informed by cognitive linguistics concepts and can be computationally analyzed.
This document provides an introduction to the syntax of first-order logic. It begins by discussing the main objects of study in mathematical logic, such as set theory and number theory. It then defines the components of a first-order language, including logical symbols (variables, connectives, quantifiers), non-logical symbols (constants, functions, relations) and terms. Terms correspond to algebraic expressions and are formed from variables, constants and functions. Examples of languages for set theory, group theory and the theory of rings are provided.
This document provides an introduction to grammars and languages from a computer science perspective. It defines language as an infinite set of sentences, where each sentence is a finite sequence of symbols from a finite alphabet. Grammars are described as a means to generate and describe the structure of the sentences in a language. The document outlines different views of language from communication, linguistics, and computer science to establish the terminology and scope used.
An Approach to Automated Learning of Conceptual Graphs from Text (Fulvio Rotella)
Many document collections are private and accessible only by selected people. Especially in business realities, such collections need to be managed, and the use of an external taxonomic or ontological resource would be very useful. Unfortunately, very often domain-specific resources are not available, and the development of techniques that do not rely on external resources becomes essential.
Automated learning of conceptual graphs from restricted collections needs to be robust with respect to missing or partial knowledge, which does not allow extraction of a full conceptual graph and provides only sparse fragments of it. This work proposes a way to deal with these problems by applying relational clustering and generalization methods. While clustering collects similar concepts, generalization provides additional nodes that can bridge separate pieces of the graph while expressing it at a higher level of abstraction. In this process, considering relational information allows a broader perspective in the similarity assessment for clustering and ensures more flexible and understandable descriptions of the generalized concepts. The final conceptual graph can be used to better analyze and understand the collection, and to perform some kinds of reasoning on it.
Supermathematics and Artificial General Intelligence (Jordan Bennett)
I outline how Supermathematics may apply to Artificial General Intelligence.
I describe standard Super-Hamiltonian usage with respect to D-Wave's "Quantum Boltzmann Machine".
A talk I gave at Yonsei University, Seoul, on July 21st, 2015.
The aim was to show my background contribution to the CORCON (Correctness by Construction) research project.
I have to thank Prof. Byunghan Kim and Dr Gyesik Lee for their kind hospitality.
This document discusses the concept of an ecumenical logic system that allows both classical and intuitionistic reasoning to coexist. It summarizes Dag Prawitz's approach to defining such a system, which uses different symbols for logical constants that have different meanings classically versus intuitionistically. However, the document raises the question of why Prawitz's system only includes one symbol for negation rather than separate classical and intuitionistic negation symbols. Possible answers discussed include the interderivability of the two notions of negation and the view that negation asserts a contradiction from assuming the negated proposition. The document does not conclude there is a definitive answer and suggests this as an interesting open problem area.
This document outlines the key points that will be covered in a presentation on dialectica categories as models of linear logic. It discusses four main theorems: 1) The original dialectica category DC is a model of intuitionistic linear logic without the exponential bang (!). 2) Adding a co-free monoidal comonad makes DC a model of full intuitionistic linear logic. 3) The simplified dialectica category GC is a model of classical linear logic without bang (!) and question mark (?). 4) Adding a complicated monoidal comonad makes GC a model of full classical linear logic. The document provides brief motivations and outlines of the constructions and proofs involved in these main results. It also discusses some applications and areas for further development.
The document discusses the differences between onomasiological dictionaries and ontologies. It defines onomasiological dictionaries as grouping words by semantic fields with an entry representing a concept. Ontologies are defined as representing the entities that make up the world, with concepts and relations between concepts. Both can be represented formally, though ontologies often use more formal logic-based languages while dictionaries are based on semantic fields. The document provides examples of each and how they can be visualized differently.
Lecture 2: From Semantics To Semantic-Oriented Applications (Marina Santini)
From the "Natural Language Processing" LinkedIn group:
John Kontos, Professor of Artificial Intelligence
"I wonder whether translating into formal logic is nothing more than transliteration, which simply isolates the part of the text that can be reasoned upon using the simple inference mechanism of formal logic. The real problem, I think, lies with the part of text that CANNOT be translated on the one hand, and the part that changes its meaning due to civilization advances on the other. My own proposal is to leave NL text alone and try building inference mechanisms for the UNTRANSLATED text depending on the task requirements.
All the best
John"
1. The document discusses the need for a positive account of informal proof in mathematics, as most mathematical proofs are informal. It argues against the view that informal proofs are recipes for formal derivations.
2. The document proposes that logic should be understood more broadly as the general study of inferential actions, as informal proofs often involve actions on mathematical objects beyond propositions. Examples of such actions include diagram manipulation in Euclidean geometry.
3. The document reviews work that may support this broader view of logic in informal proofs, such as studies of reasoning with diagrams in knot theory and using Cayley graphs to prove group theory results.
This document presents a new generic approach to pattern matching of amino acid sequences based on two algorithms: matching policy and pattern policy. Matching policy uses the ASCII values of patterns and text to skip comparisons and reduce complexity when the values are unequal. Pattern policy selects patterns in descending order of amino acid occurrences in the text, determined by heap sorting. Experimental results on phosphorylated peptide datasets show the proposed approach achieves a higher success ratio than conventional methods, especially when the two policies are used together. Complexity analysis shows it has low time complexity of O(N×LS×log(LS)). The approach provides an efficient and robust way to perform pattern matching of amino acid sequences.
This document provides an introduction to classical modal logic. It discusses:
1) Classical modal logic studies formulas that are necessarily true and how necessity is preserved in reasoning. Formulas of the form φ ∨ ¬φ are always valid.
2) Modal logics take a more fine-grained view than just true/false, allowing distinctions between what is necessarily true and possibly true.
3) The document uses a puzzle about three wise men with colored hats to illustrate a logic of knowledge. It defines axioms and rules for reasoning about what individuals know, including generalization, K, T, and 4.
Ai 2up-05-fol
Artificial Intelligence
5. First-Order Logic
Lars Schmidt-Thieme
Information Systems and Machine Learning Lab (ISMLL)
Institute of Economics and Information Systems
& Institute of Computer Science
University of Hildesheim
http://www.ismll.uni-hildesheim.de
Lars Schmidt-Thieme, Information Systems and Machine Learning Lab (ISMLL), University of Hildesheim, Germany,
Course on Artificial Intelligence, summer term 2008 1/19
Artificial Intelligence
1. Introduction
2. Syntax
3. Semantics
4. Example
5. Conclusion
Artificial Intelligence / 1. Introduction
What is first-order logic?
Think about expressing these phrases in propositional logic:
A := “Socrates is human.”
B := “All humans are mortal.”
C := “Thus, Socrates is mortal.”
How can we see that A, B, C are related?
First-order logic is richer than propositional logic:
H(a)
∀x (H(x) → M(x))
M(a)
where a stands for “Socrates”, H for “is human”, and M for “is mortal”.
Artificial Intelligence / 1. Introduction
What is first-order logic?
H(a)
∀x (H(x) → M(x))
M(a)
So what do we have here?
– x is a variable. Variables denote arbitrary elements (objects) of
an underlying set.
– a is a constant. Constants denote specific elements of an
underlying set.
– H and M are unary relations.
– ∀ is the universal quantifier. It is read “for all”.
– We can also use the connectives we already know from
propositional logic.
In first-order logic, there are also relations of other arities, as
well as n-ary functions. In addition to the universal quantifier,
there is the existential quantifier ∃, read “there exists”.
Artificial Intelligence / 2. Syntax
Syntax: Symbols
– Let {f, g, h, . . . , f1, f2, . . . } be the set of function symbols.
Every function symbol has a given arity. Sometimes we write
fⁿ to denote that f has arity n.
– Let {a, b, c, . . . , a1, a2, . . . } be the set of constant symbols.
Constant symbols can be seen as 0-ary function symbols.
– Let {P, R, S, . . . , P1, P2, . . . } be the set of relation symbols. Every
relation symbol (predicate) has a given arity. Sometimes we
write Pⁿ to denote that P has arity n.
– Let {x, y, z, x1, x2, . . . } be the set of variable symbols.
Artificial Intelligence / 2. Syntax
Syntax: Terms
A term is a logical expression that refers to an object.
(T1) Every variable or constant symbol is a term.
(T2) If f is an n-ary function symbol and t1, . . . , tn are terms, then
f(t1, . . . , tn) is also a term.
Examples:
– a is a term, b as well.
– f(a) is a term if f is unary.
– f³(a, x) is not a term.
– P(x) and P(x) ∨ Q(x) are not terms.
– f¹(f(f(a))) is a term.
More meaningful names for the symbols:
– aristotle, socrates, kallias
– succ(root)
– Likes(zeno, hockey), Likes(steffen, soccer) ∧ Likes(steffen, hockey)
– succ(succ(succ(0)))
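The inductive definition (T1)/(T2) translates directly into a recursive check. A minimal Python sketch, where the tuple encoding and the concrete symbol tables are illustrative assumptions, not part of the slides:

```python
# Terms encoded as nested tuples: a variable or constant symbol is a
# string, a compound term f(t1, ..., tn) is ("f", t1, ..., tn).
FUNCS = {"succ": 1, "f": 1}        # function symbol -> arity (assumed)
CONSTS = {"a", "b", "0", "root"}   # constant symbols (assumed)
VARS = {"x", "y", "z"}             # variable symbols (assumed)

def is_term(t):
    # (T1) every variable or constant symbol is a term
    if isinstance(t, str):
        return t in CONSTS or t in VARS
    # (T2) f(t1, ..., tn) is a term iff f is n-ary and each ti is a term
    f, *args = t
    return FUNCS.get(f) == len(args) and all(is_term(s) for s in args)

print(is_term(("succ", ("succ", ("succ", "0")))))  # True
print(is_term(("f", "a", "x")))                    # False: f is unary here
```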
Artificial Intelligence / 2. Syntax
Syntax: Formulas
An atomic formula has the form t1 = t2 or R(t1, . . . , tn), where R is
an n-ary relation symbol and t1, . . . , tn are terms.
(F0) Every atomic formula is a formula.
(F1) If φ is a formula then so is (¬φ).
(F2) If φ and ψ are formulas then so is (φ ∧ ψ).
(F3) If φ is a formula, then so is (∃xφ) for any variable x.
We define ∨, →, and ↔ the same way as in propositional logic.
For any formula φ, (∀xφ) and (¬∃x¬φ) are interchangeable.
Unnecessary brackets can be left out as in propositional logic.
Precedence: ¬, ∃, ∀, ∧, ∨, →, ↔
Examples:
– P(x) and P(x) ∨ Q(x) are formulas if P and Q are unary.
– succ(succ(succ(0))) = 3 is a formula.
– ∀yP(x, y) is a formula, but x(P(z)∃) is not.
Artificial Intelligence / 2. Syntax
Syntax: Subformulas
Let φ be a formula of first-order logic. We inductively define what it
means for θ to be a subformula of φ as follows:
– If φ is atomic, then θ is a subformula of φ if and only if θ = φ.
– If φ has the form ¬ψ, then θ is a subformula of φ if and only if
θ = φ or θ is a subformula of ψ.
– If φ has the form ψ1 ∧ ψ2, then θ is a subformula of φ if and only
if θ = φ or θ is a subformula of ψ1, or θ is a subformula of ψ2.
– If φ has the form ∃xψ, then θ is a subformula of φ if and only if
θ = φ or θ is a subformula of ψ.
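The four cases above enumerate subformulas recursively; a short Python sketch, assuming the same illustrative tuple encoding (atomic formulas as ("R", args...), plus "not", "and", "exists" nodes):

```python
# Formulas as nested tuples: atomic ("P", "x"),
# ("not", phi), ("and", phi, psi), ("exists", "x", phi).
def subformulas(phi):
    op = phi[0]
    if op == "not":
        return [phi] + subformulas(phi[1])
    if op == "and":
        return [phi] + subformulas(phi[1]) + subformulas(phi[2])
    if op == "exists":
        return [phi] + subformulas(phi[2])
    return [phi]  # atomic: theta is a subformula iff theta = phi

# Subformulas of ∃x (P(x) ∧ ¬Q(x)):
phi = ("exists", "x", ("and", ("P", "x"), ("not", ("Q", "x"))))
print(len(subformulas(phi)))  # 5
```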
Artificial Intelligence / 2. Syntax
Syntax: Free variables
The free variables of a formula are those variables occurring in it
that are not quantified.
Example: In ∀yR(x, y), x is free, but y is bound by ∀y.
For any first-order formula φ, let free(φ) denote the set of free
variables of φ. We define free(φ) inductively as follows:
– If φ is atomic, then free(φ) is the set of all variables occurring in
φ,
– if φ = ¬ψ, then free(φ) = free(ψ),
– if φ = ψ ∧ θ, then free(φ) = free(ψ) ∪ free(θ), and
– if φ = ∃xψ, then free(φ) = free(ψ) − {x}.
How would you define the set of bound variables of φ, bnd(φ)?
A sentence of first-order logic is a formula having no free
variables.
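The inductive definition of free(φ) is again a direct recursion. A sketch under the same assumed tuple encoding; for brevity, atomic formulas here apply a relation symbol directly to variable or constant symbols (compound terms omitted), and VARS is an assumed variable table:

```python
VARS = {"x", "y", "z"}  # assumed variable symbols

def free(phi):
    op = phi[0]
    if op == "not":                       # free(¬ψ) = free(ψ)
        return free(phi[1])
    if op == "and":                       # free(ψ ∧ θ) = free(ψ) ∪ free(θ)
        return free(phi[1]) | free(phi[2])
    if op == "exists":                    # free(∃xψ) = free(ψ) − {x}
        return free(phi[2]) - {phi[1]}
    # atomic: all variables occurring in the arguments
    return {t for t in phi[1:] if t in VARS}

print(free(("exists", "y", ("R", "x", "y"))))  # {'x'}: y is bound
```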
Artificial Intelligence / 3. Semantics
Semantics: Vocabularies, structures and interpretations
A vocabulary is a set of function, relation, and constant symbols.
Let V be a vocabulary. A V-structure M = (U, I) consists of a
nonempty underlying set U (the universe) along with an
interpretation I of V. An interpretation I of V assigns:
– an element of U to each constant symbol in V,
– a function from Uⁿ to U to each n-ary function symbol in V, and
– a subset of Uⁿ to each n-ary relation symbol in V.
Examples:
– V = {f¹, R², c}, Z = (Z, IZ)
The universe is the set of integers Z.
IZ could interpret f(x) as x², R(x, y) as x < y, and c as 3.
– V = {f¹, R², c}, N = (N, IN)
The universe is the set of natural numbers N.
IN could interpret f(x) as x + 1, R(x, y) as x < y, and c as 0.
Artificial Intelligence / 3. Semantics
Semantics: V-formulas and V-sentences
Let V be a vocabulary. A V-formula is a formula in which every
function, relation, and constant symbol is in V. A V-sentence is a
V-formula that is a sentence.
If M is a V-structure, then each V-sentence φ is either true or false
in M. If φ is true in M, then we say M models φ and write M |= φ.
Example: Var = {+, ·, 0, 1} is the vocabulary of arithmetic. Then
R = (R, IR) is a Var-structure if IR is an interpretation of Var.
R |= ∀x∃y(1 + x · x = y)
What about ∀y∃x(1 + x · x = y)?
Artificial Intelligence / 3. Semantics
Semantics: The value of terms
We define the value VM(t) ∈ U of a term t inductively as
– VM(t) = IM(t), if t is a constant symbol, and
– VM(t) = IM(f)(VM(t1), . . . , VM(tn)), if t = fⁿ(t1, . . . , tn).
Example: V = {f¹, R², c}, N = (N, IN), interpretation IN as before.
What is the value of the term t = f(f(c))?
VN(f(f(c))) = IN(f)(VN(f(c)))
= IN(f)(IN(f)(VN(c)))
= IN(f)(IN(f)(IN(c)))
= IN(f)(IN(f)(0))
= IN(f)(1)
= 2
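The recursive definition of VM(t) can be run directly. A sketch of the structure N = (N, IN) from this slide, with the interpretation held in a plain dict (an illustrative encoding, not the slides' notation):

```python
# I_N interprets the unary function symbol f as successor and c as 0.
I = {"c": 0, "f": lambda n: n + 1}

def value(t):
    # V_M(t) = I_M(t) for a constant symbol
    if isinstance(t, str):
        return I[t]
    # V_M(f(t1, ..., tn)) = I_M(f)(V_M(t1), ..., V_M(tn))
    f, *args = t
    return I[f](*(value(s) for s in args))

print(value(("f", ("f", "c"))))  # 2, matching the computation above
```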
Artificial Intelligence / 3. Semantics
Semantics: Vocabulary/structure expansions and reducts
An expansion of a vocabulary V is a vocabulary containing V as a
subset.
A structure M is an expansion of the V-structure M if M has the
same universe and interprets the symbols of V in the same way
as M .
If M is an expansion of M , then we say that M is a reduct of M .
Examples:
The {+, −, ·, <, 0, 1}-structure M = (R, I ) is an expansion of the
Var -structure M = (R, I) if both I and I interpret the symbols
+, ·, 0, and 1 in the usual way.
A {+, −, ·, <, 0, 1}-structure M = (Q, I ) cannot be an expansion
of M .
Any structure is an expansion of itself.
Artificial Intelligence / 3. Semantics
Semantics: The value of formulas
We define M |= φ by induction:
– M |= t1 = t2 if and only if VM(t1) = VM(t2),
– M |= Rⁿ(t1, . . . , tn) iff. (VM(t1), . . . , VM(tn)) ∈ IM(Rⁿ),
– M |= ¬φ iff. M does not model φ,
– M |= φ1 ∧ φ2 iff. both M |= φ1 and M |= φ2, and
– M |= ∃xφ(x) iff. MC |= φ(c) for some constant c ∈ V(M), where
V(M) = V ∪ {cm | m ∈ UM } and
MC = (UM, IC) is the expansion of M = (UM, I) to a
V(M)-structure in which IC interprets each cm as the element m.
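Over a finite structure, this induction yields a decision procedure for M |= φ. A Python sketch under assumed encodings: rather than expanding the vocabulary with a constant cm per element, it carries an environment mapping variables to elements, which is equivalent for this purpose:

```python
# A small finite structure: universe U = {0,...,4}, one binary
# relation R interpreted as < (both choices are illustrative).
U = range(5)
REL = {"R": {(m, n) for m in U for n in U if m < n}}

def models(phi, env=None):
    env = env or {}
    op = phi[0]
    if op == "not":
        return not models(phi[1], env)
    if op == "and":
        return models(phi[1], env) and models(phi[2], env)
    if op == "exists":
        # M |= ∃x phi iff phi holds when x names some element of U
        return any(models(phi[2], {**env, phi[1]: m}) for m in U)
    # atomic R(t1,...,tn): arguments are variables looked up in env
    return tuple(env[t] for t in phi[1:]) in REL[op]

# ∀x∃y R(x,y), written as ¬∃x¬∃y R(x,y): false, since 4 has no
# R-successor in U.
phi = ("not", ("exists", "x", ("not", ("exists", "y", ("R", "x", "y")))))
print(models(phi))  # False
```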
Artificial Intelligence / 4. Example
Back to the Silly Example
Toy example by Gregory Yob (1975), adapted by our textbook.
– 4 × 4 grid, tiles numbered (1,1) to (4,4),
– the agent starts in (1,1),
– the beast Wumpus sits at a random tile, unknown to the agent,
– a pile of gold sits at another random tile, unknown to the agent,
– some pits are located at random tiles, unknown to the agent.
– if the agent enters the tile of the Wumpus, he will be eaten,
– if the agent enters a pit, he will be trapped.
[Figure: the 4 × 4 Wumpus world grid. Tiles adjacent to a pit show a breeze and tiles adjacent to the Wumpus show a stench; the gold and the START tile (1,1) are marked.]
Artificial Intelligence / 4. Example
Encoding in propositional logic
64 variables:
Px,y tile x, y contains a pit (x, y = 1, . . . , 4).
Wx,y tile x, y contains the Wumpus (x, y = 1, . . . , 4).
Bx,y tile x, y contains a breeze (x, y = 1, . . . , 4).
Sx,y tile x, y contains stench (x, y = 1, . . . , 4).
the start is safe: (2 formulas)
¬P1,1, ¬W1,1
how breeze arises: (16 formulas)
Bx,y ↔ Px−1,y ∨ Px+1,y ∨ Px,y−1 ∨ Px,y+1, x, y = 1, . . . , 4
how stench arises: (16 formulas)
Sx,y ↔ Wx−1,y ∨ Wx+1,y ∨ Wx,y−1 ∨ Wx,y+1, x, y = 1, . . . , 4
there is exactly one Wumpus: (121 formulas)
W1,1 ∨ W1,2 ∨ . . . ∨ W4,4
¬Wx,y ∨ ¬Wx′,y′, x, y, x′, y′ = 1, . . . , 4, (x, y) ≠ (x′, y′)
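The formula counts can be checked mechanically: one 16-way disjunction plus one clause per unordered pair of distinct tiles gives 1 + C(16, 2) = 121. A sketch (the string rendering of formulas is illustrative):

```python
from itertools import product

tiles = list(product(range(1, 5), repeat=2))  # tiles (1,1) ... (4,4)

# "at least one Wumpus": a single 16-way disjunction
at_least_one = " v ".join(f"W{x},{y}" for x, y in tiles)

# "at most one Wumpus": one clause per unordered pair of distinct tiles
at_most_one = [f"-W{x},{y} v -W{u},{w}"
               for (x, y), (u, w) in product(tiles, tiles)
               if (x, y) < (u, w)]

print(1 + len(at_most_one))  # 121 formulas, as stated on the slide
```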
Artificial Intelligence / 4. Example
Encoding in first-order logic (1/2)
Vocabulary:
– constants 1, 2, 3, 4
– binary relation symbols P, W, B, S
Meaning of the predicates:
P (x, y) tile x, y contains a pit.
W (x, y) tile x, y contains the Wumpus.
B(x, y) tile x, y contains a breeze.
S(x, y) tile x, y contains stench.
Artificial Intelligence / 4. Example
Encoding in first-order logic (2/2)
the start is safe:
¬P(1, 1) ∧ ¬W(1, 1)
how breeze arises:
∀x∀y (B(x, y) ↔ P(x − 1, y) ∨ P(x + 1, y) ∨ P(x, y − 1) ∨ P(x, y + 1))
there is exactly one Wumpus:
∀x∀y (W(x, y) → ∀x′∀y′ (W(x′, y′) → (x = x′ ∧ y = y′)))
Further possibilities: Encode actions as functions, encode time
steps.
Artificial Intelligence / 5. Conclusion
Summary
We introduced first-order logic, a representation language far
more powerful than propositional logic.
– Knowledge representation should be declarative,
compositional, expressive, context-independent, and
unambiguous.
– Constant symbols name objects, relation symbols
(predicates) name properties and relations, and function
symbols name functions. Complex terms apply function
symbols to terms to name an object.
– Given a V-structure, the truth of a formula is determined.
– An atomic formula consists of a relation symbol applied to one
or more terms; it is true iff. the relation named by the predicate
holds between the objects named by the terms. Complex
formulas use connectives just like propositional logic.
Quantifiers allow the expression of general rules.
Artificial Intelligence / 5. Conclusion
Outlook
Next lesson: Inference in first-order predicate logic.
Which other kinds of logic exist?
– Temporal logic: Gφ → Xφ
– Description logic: C ⊑ D
– Modal logic: □p → ♦p
– Higher-order predicate logic: ∀P∀x∀y (P(x, y) ∧ P(y, x) → S(P))
– Typed/intuitionistic/default/relevance logics
– ...
(not covered in this course)
Artificial Intelligence / 5. Conclusion
Literature
– Shawn Hedman: A First Course in Logic
– Heinz-Dieter Ebbinghaus, Jörg Flum, Wolfgang Thomas:
Einführung in die mathematische Logik
– Uwe Schöning: Logik für Informatiker
– Stuart Russell, Peter Norvig: Artificial Intelligence. A Modern
Approach