This document discusses reasoning and the semantic web. It begins with an overview of constraint logic programming and unification. It then discusses how semantic web objects can be represented as labeled graphs. It explains how graphs can be modeled as constraints. The document also covers OWL and description logic based reasoning. It proposes using constraint-based techniques for semantic web reasoning. It concludes with a recap of the key topics covered.
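The labeled-graph representation mentioned above can be sketched as a set of triples with wildcard matching; all names below are illustrative, not taken from the document.

```python
# Minimal sketch: an RDF-style labeled graph as a set of
# (subject, predicate, object) triples, queried by pattern
# matching where None acts as a wildcard/variable.

graph = {
    ("alice", "knows", "bob"),
    ("bob", "knows", "carol"),
    ("alice", "type", "Person"),
}

def match(pattern, graph):
    """Return all triples consistent with the pattern (None = any)."""
    s, p, o = pattern
    return [t for t in graph
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

print(sorted(match(("alice", None, None), graph)))
```

Treating each triple pattern as a constraint on admissible graphs is one simple way to connect the graph view with the constraint view.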
This document introduces predicate logic, including predicate symbols and signatures, logical connectives like negation and conjunction, quantifiers like universal and existential, the syntax and semantics of predicate logic formulas, and some useful equivalences. Predicate logic allows representing relations between objects using predicates of varying arities and expressing properties of collections using quantifiers. Formulas in predicate logic can be built from predicate and function symbols, terms, quantifiers, and logical connectives.
The document discusses syntax, semantics, and intended meaning in symbol systems. It defines:
- Syntax as symbols and rules for composition into structures
- Semantics as the relationship between syntax and intended meaning
- Intended meaning as truths about the world that the symbol system represents
It then summarizes key concepts in logic programming including terms, substitutions, instances, queries, rules, and the declarative and operational semantics of logic programs.
This document discusses Prolog programming. It covers data structures in Prolog like lists and terms, programming techniques like guess-and-verify queries and open lists, and control in Prolog through goal ordering and rule selection. Lists can represent data structures and terms correspond to tree structures. Variables serve as placeholders and open lists allow modification of data. Control is characterized by selecting the leftmost goal and first applicable rule.
The document discusses regular languages and regular expressions. It begins by defining regular expressions over an alphabet and the basic operations of concatenation, union, and Kleene star. It then defines regular languages as the languages corresponding to regular expressions. Some examples of regular languages and expressions over the binary alphabet are given. It proves that the class of languages accepted by finite automata is equivalent to the class of regular languages. It also discusses properties like closure under operations and the pumping lemma for regular languages.
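As a concrete illustration of concatenation, union, and star over the binary alphabet, the following sketch (using Python's `re` module, an assumption of this example rather than the document's notation) tests membership in the regular language of strings containing 01.

```python
import re

# The regular language over {0,1} of strings containing "01" as a
# substring, written using only concatenation, union (|) and star (*).
pattern = re.compile(r"(0|1)*01(0|1)*")

def in_language(s):
    """True iff the whole string s belongs to the language."""
    return pattern.fullmatch(s) is not None

print(in_language("1101"))  # True: contains "01"
print(in_language("10"))    # False: no "01" substring
```

Note that `fullmatch` is used rather than `match`, so the expression must describe the entire string, matching the formal definition of language membership.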
This document provides an overview of logic programming and the logic programming language Prolog. It discusses declarative programming and how Prolog uses declarative rules, facts, and predicates. It explains how Prolog performs logical operations like unification and resolution to evaluate queries against its knowledge base. It provides examples of using Prolog to represent graphs, lists, arithmetic, and more.
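The unification step described above can be sketched in a few lines; this is a minimal illustration (no occurs check) under an assumed term representation, not the algorithm from the document.

```python
# Minimal sketch of Prolog-style unification (no occurs check).
# Terms are strings (uppercase = variable, lowercase = constant)
# or tuples (functor, arg, ...). The representation is an
# assumption of this sketch.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Follow variable bindings until a non-bound term is reached."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    """Return a substitution unifying a and b, or None on failure."""
    subst = dict(subst or {})
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        subst[a] = b
        return subst
    if is_var(b):
        subst[b] = a
        return subst
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

print(unify(("edge", "X", "b"), ("edge", "a", "Y")))  # {'X': 'a', 'Y': 'b'}
```

Resolution repeatedly applies this operation to match a query goal against the head of a rule or fact in the knowledge base.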
This document discusses separating shuffle regular expressions (SSRE) for describing languages of data words. SSRE extend regular expressions with a separating shuffle operation. The document defines SSRE and proves several results about their expressiveness and decidability properties in comparison to register automata, data automata, and first-order logic on data words. Key results include: 1) every data-automaton-definable language is definable by an SSRE with homomorphisms; 2) the emptiness problem for SSRE with homomorphisms is undecidable; and 3) there are languages definable by SSRE that cannot be defined by register automata. The document raises several open questions and conjectures about SSRE and their relationships to other formalisms.
This document provides an overview of first-order logic including:
- First-order logic is a formal system used in mathematics, philosophy, linguistics and computer science to represent knowledge.
- It models the world in terms of objects, properties, relations and functions.
- The syntax of first-order logic includes constant symbols, function symbols, predicate symbols, variables, and connectives like not, and, or as well as quantifiers like universal and existential.
- Examples show how first-order logic can represent statements about individuals and their relationships using predicates, terms, atomic and complex sentences with quantifiers.
This document provides an introduction to logic and set theory. It begins by defining key logic concepts such as propositions, truth values, and logical operators. It then explains how logical operators can combine propositions using truth tables. The document also discusses tautologies and contradictions. It introduces quantification and propositional functions. Finally, it provides examples of sets and set operations before transitioning to a discussion of set theory.
Date: March 9, 2016
Course: UiS DAT911 - Foundations of Computer Science (fall 2016)
Please cite, link to or credit this presentation when using it or part of it in your work.
This document discusses properties that a good domain description for reasoning about actions should have beyond mere consistency. It introduces the concept of modularity for action theories, where the different types of laws (static, effect, executability, inexecutability) are arranged in separate components with limited interaction. Violations of the proposed postulates about modularity can lead to unexpected conclusions from logically consistent theories. The document outlines algorithms to check whether an action theory satisfies the postulates of modularity.
This document provides an overview of propositional logic and logical operators. It defines basic concepts like propositions, logical connectives, and truth tables. Compound propositions are formed by combining one or more propositions using logical operators like conjunction, disjunction, negation, implication, equivalence, exclusive or, and others. Computer representations of logic using bits are also discussed, where true and false map to 1 and 0, and bitwise logic operators correspond directly to the logical connectives. Precedence rules for logical operators are defined.
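The bit mapping described above can be demonstrated directly; `implies` is an illustrative helper, using the identity that p → q is equivalent to ¬p ∨ q.

```python
# Bitwise demonstration of the true/false <-> 1/0 mapping: the
# logical connectives correspond directly to bit operations.

def implies(p, q):
    """Material implication on bits: (not p) or q."""
    return (1 - p) | q

# Print a truth table for conjunction, disjunction and implication.
print("p q | p∧q p∨q p→q")
for p in (0, 1):
    for q in (0, 1):
        print(f"{p} {q} |  {p & q}   {p | q}   {implies(p, q)}")
```

The table produced matches the standard truth tables: implication is false only in the row where p is 1 and q is 0.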
This document summarizes Yunhao He's thesis on weak predictable representation properties and quadratic BSDEs. The thesis has three main objectives: 1) prove that for a continuous strong Markov local martingale M, the Galtchouk-Kunita-Watanabe decomposition of E[F(M_T)] has no orthogonal component for sufficiently smooth functions F; 2) establish existence of solutions to quadratic BSDEs driven by a continuous local martingale; 3) show that if M has independent increments, the solution to a quadratic BSDE with terminal value F(M_T) has no orthogonal component under any filtration where solutions exist. The document defines key terms and notations used in the thesis and outlines its structure.
This document provides an introduction to logic, including propositional logic and predicate calculus. It defines key concepts such as logical values, propositions, operators, truth tables, logical expressions, worlds, models, inference rules, quantification, and definitions. Propositional logic manipulates true and false values using operators like AND and OR. Predicate calculus extends this to allow predicates, constants, functions, and quantification over variables. Inference involves applying rules to derive new statements, but the search space grows too large to feasibly perform by hand.
The document describes a method for learning incoherent dictionaries using iterative projections and rotations (IPR). It begins with background on dictionary learning models and algorithms, as well as previous work on learning incoherent dictionaries. The IPR algorithm constructs Grassmannian frames, which have minimal mutual coherence, using iterative projections of the dictionary's Gram matrix onto constraint sets, followed by a rotation step. Numerical experiments show that dictionaries learned with IPR have lower incoherence and perform well for sparse approximation compared to existing methods.
Metalogic: The non-algorithmic side of the mind (Haskell Lambda)
The document discusses metalogic, which is the study of metatheories of logic. It defines metalogic and contrasts it with logic. It then discusses classical and quantum metalanguages, Tarski's Convention T, and Gödel's incompleteness theorems. The key points are:
1) Metalogic studies the properties of logical systems themselves, not arguments within a system like logic does.
2) A quantum metalanguage assigns degrees of certainty to assertions rather than treating them classically.
3) Tarski's Convention T relates sentences to their truth values, and this is generalized to Convention PT for quantum logic.
4) Gödel's incompleteness theorems show that any sufficiently strong consistent formal system contains true statements it cannot prove, a central result studied in metalogic.
Last time we talked about propositional logic, a logic on simple statements.
This time we will talk about first order logic, a logic on quantified statements.
First order logic is much more expressive than propositional logic.
The topics on first order logic are:
1-Quantifiers
2-Negation
3-Multiple quantifiers
4-Arguments of quantified statements
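A useful identity connecting topics 1 and 2 above is the quantifier form of De Morgan's laws, which governs how negation moves past quantifiers:

```latex
\neg\,\forall x\, P(x) \;\equiv\; \exists x\, \neg P(x)
\qquad
\neg\,\exists x\, P(x) \;\equiv\; \forall x\, \neg P(x)
```

Applying these repeatedly is also how negation is pushed inward through multiple quantifiers (topic 3).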
Algorithms and Complexity: Cryptography Theory (Alex Prut)
This document discusses algorithms and complexity in cryptography. It begins by defining function problems in computational complexity theory as computational problems where the expected output is more complex than a simple yes or no answer. It then discusses one-way functions, which are easy to compute but believed to be hard to invert. The document provides examples of one-way functions based on integer multiplication, discrete logarithms, and the RSA cryptosystem. It argues that the existence of one-way functions separates the complexity classes P and UP, and that cryptography relies on the assumption that stronger versions of one-way functions exist.
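The asymmetry behind one-way functions can be illustrated with integer multiplication at toy scale; `multiply` and `factor` are hypothetical helpers for this sketch only, and real cryptographic sizes make the inverse direction infeasible rather than merely slow.

```python
# Sketch of the one-way-function idea using integer multiplication:
# multiplying two primes is fast, but recovering them from the
# product requires search (here, naive trial division).

def multiply(p, q):
    """Easy direction: one multiplication."""
    return p * q

def factor(n):
    """Hard direction: brute-force trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None  # n is prime

n = multiply(101, 103)
print(n)          # 10403
print(factor(n))  # (101, 103)
```

Trial division takes time roughly proportional to the square root of n, which grows exponentially in the bit length of n, while multiplication stays polynomial; that gap is the intuition the formal definition captures.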
This document discusses modal logics and formalisms. It defines modal logics as logics that add new logical constants like necessity (□) and possibility (◇) to classical logic. It describes how modal logics can be classified based on whether they are extended logics that add new well-formed formulas or deviant logics that interpret the usual logical constants differently. The document then focuses on modal logics, defining their language and providing details on their model theory using possible world semantics. It discusses truth in possible worlds and models. It also describes several axiomatic modal systems and the relationships between them, and examines the classes of models validated by different axioms.
Automata theory studies abstract computing devices and the types of tasks they are capable of. Alan Turing pioneered this field in the 1930s by studying Turing machines. The theory examines questions of computability and complexity. It establishes a hierarchy of formal language classes from regular to recursively enumerable. Proofs in automata theory demonstrate properties of languages and machines through techniques like deduction, induction, contradiction, and counterexamples. Key concepts include alphabets, strings, languages, and the membership problem of determining if a string belongs to a language.
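The membership problem mentioned above can be illustrated with a small DFA; the states and the chosen language (binary strings with an even number of 1s) are assumptions of this sketch, not taken from the document.

```python
# A small DFA deciding membership in the language of binary strings
# with an even number of 1s. The transition table maps
# (state, symbol) -> next state.

dfa = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

def accepts(s, start="even", accepting=("even",)):
    """Run the DFA on s and report whether it halts in an accepting state."""
    state = start
    for ch in s:
        state = dfa[(state, ch)]
    return state in accepting

print(accepts("1001"))  # True: two 1s
print(accepts("1011"))  # False: three 1s
```

Deciding membership is a single left-to-right pass, which is exactly what places regular languages at the bottom of the hierarchy of formal language classes.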
The document discusses relational database design and functional dependencies. It defines functional dependencies and provides examples. It describes different types of functional dependencies like trivial, non-trivial, and how functional dependencies relate to keys. It explains Armstrong's axioms for reasoning about functional dependencies and various properties and algorithms related to functional dependencies like closure of sets of attributes and functional dependencies.
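The attribute-closure algorithm mentioned above can be sketched as a fixed-point loop; the schema and dependencies below are illustrative, not from the document.

```python
# Attribute closure: repeatedly apply every functional dependency
# X -> Y whose left side is already contained in the closure.
# FDs are given as (frozenset, frozenset) pairs.

def closure(attrs, fds):
    """Return the closure of the attribute set attrs under fds."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

# Example schema R(A, B, C, D) with A -> B and B -> C.
fds = [(frozenset("A"), frozenset("B")),
       (frozenset("B"), frozenset("C"))]
print(sorted(closure({"A"}, fds)))  # ['A', 'B', 'C']
```

An attribute set is a superkey exactly when its closure contains every attribute of the relation, which is how this algorithm connects functional dependencies to keys.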
This document discusses modal logics and formalisms. It begins by defining classical and non-classical logics, with modal logics listed as an example of an extended logic. It then covers modal logics in more detail, defining their language and model theory using possible world semantics. Models are defined as structures consisting of possible worlds related by an accessibility relation. Truth is evaluated at possible worlds based on this relation. The document also discusses axiomatic modal logics like KT and relations between main modal systems. Finally, it notes that axioms like D, T, B, 4 and 5 are not valid in the class of all standard models.
Here is a proof of this statement using resolution refutation:
1. ∀x∀y((F(x) ∧ F(y) ∧ L(x,y)) → S(x,y)) (Premise: any fish larger than another can swim faster than it)
2. ∃x∀y(F(y) → L(x,y)) (Premise: There exists a largest fish)
3. ∃x∀y(F(y) → S(x,y)) (Goal: There exists a fastest fish)
4. F(a) ∧ F(b) ∧ L(a,b) → S(a,b) (instantiation of premise 1 with x = a, y = b)
This computer-science quiz covers basic hardware and software concepts. It explores topics such as the operating system, the CPU, RAM and ROM memory, hard disks, buses, ports, and operating-system versions. The quiz consists of 35 questions designed to assess the student's understanding of these computing fundamentals.
In handheld Augmented Reality (AR), the magic-lens paradigm is typically implemented by rendering the video stream captured by the back-facing camera onto the device's screen. Unfortunately, such implementations show the real world from the device's perspective rather than the user's perspective. This dual perspective results in misaligned and incorrectly scaled imagery, a predominant cause of the dual-view problem, with the potential to distort the user's spatial perception. This paper presents a user study that analyzes users' expectations, spatial perception, and ability to deal with the dual-view problem by comparing device-perspective and fixed point-of-view (POV) user-perspective rendering. The results confirm the existence of the dual-view perceptual issue and show that the majority of participants expect user-perspective rendering irrespective of their previous AR experience. Participants also demonstrated significantly better spatial perception with, and a preference for, the user-perspective view.
Full Paper: http://dl.acm.org/citation.cfm?id=2522848.2522885
The document presents 30 questions on the history of computers, from the First Generation, based on vacuum tubes, to the Fifth Generation and the development of artificial intelligence. It covers topics such as computing pioneers like Charles Babbage and John von Neumann, the characteristics of computers such as the ENIAC, UNIVAC, and IBM 360, and the emergence of new generations driven by transistors, integrated circuits, and microprocessors. Finally, it asks questions about key concepts of each generation and influential figures.
This document announces the opening of registration for a simplified selection process for mid-level distance-learning technical courses, offered by the Instituto Federal de Educação, Ciência e Tecnologia do Maranhão to public employees of the state's basic education system. The document details the registration requirements, the available places, the quota system, the registration process, and the required documentation.
This guide provides guidance on the process of certifying completion of secondary education (Ensino Médio) based on the results of the Exame Nacional do Ensino Médio (ENEM), describing who may request certification, the responsibilities of the certifying institutions, and the certification documents.
An ancient manuscript made in 1555 by a person of Arab nationality (Piri Reis), which was used as a guide for maritime navigation in the era of the Ottoman Empire in Turkey (Sultan Mahmud IV).
1) The municipal government of Colômbia-SP summons candidates to appear at the class/lesson assignment session bearing the required documents.
2) The document describes the education requirements for the temporary teaching positions, including template declarations regarding the holding of multiple posts.
3) Candidates must declare in writing whether or not they hold other public posts.
Semantic Web technologies are a set of languages standardized by the World Wide Web Consortium (W3C) and designed to create a web of data that can be processed by machines. One of the core languages of the Semantic Web is the Web Ontology Language (OWL), a family of knowledge representation languages for authoring ontologies or knowledge bases. The newest version of OWL is based on Description Logics (DL), a family of logics that are decidable fragments of first-order logic. leanCoR is a new description logic reasoner designed for experimenting with connection-method algorithms and optimization techniques for DL. leanCoR is an extension of leanCoP, a compact automated theorem prover for classical first-order logic.
First-order logic (FOL) is a formal system used in mathematics, philosophy, linguistics, and computer science to represent knowledge about domains involving objects and relations. FOL extends propositional logic with quantifiers and predicates to describe properties of and relations between objects. Well-formed formulas in FOL involve constants, variables, functions, predicates, quantifiers, and logical connectives. The meaning and truth of FOL statements is determined with respect to a structure called a model that specifies a domain of objects and interpretations of symbols. FOL can be used to represent knowledge about many different domains and perform logical inference.
Herbrand-satisfiability of a Quantified Set-theoretical Fragment (Cantone, Lo...Cristiano Longo
The document discusses the quantified fragment of set theory called ∀π0. ∀π0 allows for restricted quantification over sets and ordered pairs. A decision procedure for the satisfiability of ∀π0 formulas works by non-deterministically guessing a skeletal representation and checking if its realization is a model of the formula. The document considers encoding the conditions on skeletal representations as first-order formulas to view ∀π0 as a first-order logic and leverage tools developed for first-order logic fragments.
Master Thesis on the Mathematial Analysis of Neural NetworksAlina Leidinger
Master Thesis submitted on June 15, 2019 at TUM's chair of Applied Numerical Analysis (M15) at the Mathematics Department.The project was supervised by Prof. Dr. Massimo Fornasier. The thesis took a detailed look at the existing mathematical analysis of neural networks focusing on 3 key aspects: Modern and classical results in approximation theory, robustness and Scattering Networks introduced by Mallat, as well as unique identification of neural network weights. See also the one page summary available on Slideshare.
This document provides a lab manual for an Artificial Intelligence laboratory course. It includes an index listing 9 experiments covering topics like Prolog programming, solving problems using Prolog like the 8-Queen problem, and optional content beyond the syllabus. The first experiment provides an overview of the Prolog language, covering basic terms, facts and rules, lists, recursion, and backtracking in Prolog.
Big O notation describes how efficiently an algorithm or function grows as the input size increases. It focuses on the worst-case scenario and ignores constant factors. Common time complexities include O(1) for constant time, O(n) for linear time, and O(n^2) for quadratic time. To determine an algorithm's complexity, its operations are analyzed, such as the number of statements, loops, and function calls.
Presentazione di Pierpaolo Basile, durante il suo talk dal titolo "Geometria e Semantica del Linguaggio.
L'incontro si è tenuto il giorno 17 Dicembre 2014 all'interno del progetto SSC (Scientific Storming Café).
L'abstract del talk è "Rappresentare concetti in uno spazio geometrico è una tecnica ampiamente utilizzata nell'informatica per modellare la semantica del linguaggio naturale. Ad esempio i motori di ricerca che interroghiamo ogni giorno utilizzano la geometria per rappresentare parole e documenti. Obiettivo del talk è introdurre i concetti di base dei modelli di semantica distribuzionale e presentare alcuni operatori geometrici per la composizione dei termini per rappresentare concetti più complessi come frasi o interi documenti"
The document discusses database schema refinement through normalization. It introduces the concepts of functional dependencies and normal forms including 1NF, 2NF, 3NF and BCNF. Decomposition is presented as a technique to resolve issues like redundancy, update anomalies and insertion/deletion anomalies that arise due to violations of normal forms. Reasoning about functional dependencies and computing their closure is also covered.
This document summarizes information about ER diagrams, schema refinement, and database normalization. It provides examples of ER diagrams and how they can be converted to tables. It discusses different normal forms including Boyce-Codd normal form (BCNF) and third normal form (3NF), and provides algorithms for decomposing a schema into BCNF and 3NF. The goal of normalization is to reduce data redundancy and avoid data anomalies.
The document discusses low-rank matrix optimization problems and heuristics for solving rank minimization problems. It covers the following key points in 3 sentences:
The document outlines motivation for extracting low-dimensional structures from high-dimensional data using rank minimization. It then discusses several heuristics for approximating the non-convex rank minimization problem, including replacing the rank with the nuclear norm, using the log-det heuristic as a smooth surrogate, matrix factorization methods, and iteratively solving a sequence of rank-constrained convex problems. Applications mentioned include the Netflix Prize and video intrusion detection.
Building WordSpaces via Random Indexing from simple to complex spacesPierpaolo Basile
This presentation describes two approaches to compositional semantics in distributional semantic spaces.
Both approaches conceive the semantics of complex structures, such as phrases or sentences, as being other than
the sum of its terms.
Syntax is the plus used as a glue to compose words.
The former kind of approach encodes information about syntactic dependencies directly into distributional spaces, the latter exploits compositional operators reflecting the syntactic role of words.
Prolog Programming is a document about the Prolog programming language. It discusses key features of Prolog such as logical variables, unification, backtracking, defining procedures with clauses, and using Prolog as a relational database. It provides examples of Prolog code including simple programs involving terms, facts, and rules. It also covers syntax of Prolog terms and compound terms, and how Prolog interprets clauses both declaratively and procedurally.
1. Hash tables are good for random access of elements but not sequential access. When records need to be accessed sequentially, hashing can be problematic because elements are stored in random locations instead of consecutively.
2. To find the successor of a node in a binary search tree, we take the right child. This operation has a runtime complexity of O(1).
3. When comparing operations like insertion, deletion, and searching between different data structures, arrays generally have the best performance for insertion and searching, while linked lists have better performance for deletion and allow for easy insertion/deletion anywhere. Binary search trees fall between these two.
Logic programming deals with relations rather than functions. It separates logic from control by having the programmer declare facts and relations that are true, while the system determines how to use those facts to solve problems. Horn clauses are used to specify relations, with the consequent stating what is true if the conjunction of antecedents are true. Queries in Prolog can ask if a specific tuple belongs to a relation or if there exists a value for a variable such that a clause is true.
MetiTarski: An Automatic Prover for Real-Valued Special FunctionsLawrence Paulson
This document describes MetiTarski, an automatic prover for statements involving special functions like sin, cos, ln, and exp. It combines the Metis resolution theorem prover with the QEPCAD decision procedure for real closed fields. MetiTarski works by replacing functions with rational function upper or lower bounds, reducing problems to decidable first-order logic over real numbers. It implements techniques like algebraic literal deletion, normalization, and dividing out products to guide the proof search. The system has proved several problems from applications like hybrid systems and has some limitations but shows promise in combining deduction with decision procedures.
GDSC SSN - solution Challenge : Fundamentals of Decision MakingGDSCSSN
This session aims to provide participants with a comprehensive understanding of decision-making fundamentals in AI/ML, covering key concepts like reinforcement learning, different representations, and an exploration of current state-of-the-art methodologies.
The document discusses the constraint satisfaction problem (CSP) and the dichotomy conjecture regarding the complexity of CSP instances. It provides definitions and examples of CSPs. It explains the role of polymorphisms in determining the complexity, identifying semilattice, majority and affine polymorphisms as "good". It outlines the dichotomy conjecture that CSPs are either solvable in polynomial time or NP-complete depending on the presence of certain types of local structure defined by polymorphisms. The document also discusses algorithms and results for various constraint languages.
This document discusses the existence and uniqueness of renormalized solutions to a nonlinear multivalued elliptic problem with homogeneous Neumann boundary conditions and L1 data. Specifically, it considers the problem β(u) - div a(x, Du) ∋ f in Ω, with a(x, Du).η = 0 on ∂Ω, where f is an L1 function. It provides definitions of renormalized solutions and entropy solutions. The main result is the existence and uniqueness of renormalized solutions to this problem, which is proved using a priori estimates and a compactness argument with doubling of variables.
How to Get CNIC Information System with Paksim Ga.pptxdanishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Best 20 SEO Techniques To Improve Website Visibility In SERPPixlogix Infotech
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer’s life and facilitate a rapid transition from concept to production-ready applications.He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
TrustArc Webinar - 2024 Global Privacy SurveyTrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
OpenID AuthZEN Interop Read Out - AuthorizationDavid Brossard
During Identiverse 2024 and EIC 2024, members of the OpenID AuthZEN WG got together and demoed their authorization endpoints conforming to the AuthZEN API
Skybuffer SAM4U tool for SAP license adoptionTatiana Kojar
Manage and optimize your license adoption and consumption with SAM4U, an SAP free customer software asset management tool.
SAM4U, an SAP complimentary software asset management tool for customers, delivers a detailed and well-structured overview of license inventory and usage with a user-friendly interface. We offer a hosted, cost-effective, and performance-optimized SAM4U setup in the Skybuffer Cloud environment. You retain ownership of the system and data, while we manage the ABAP 7.58 infrastructure, ensuring fixed Total Cost of Ownership (TCO) and exceptional services through the SAP Fiori interface.
Taking AI to the Next Level in Manufacturing.pdfssuserfac0301
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
6. Ideas and approaches to help build your organization's AI strategy.
Project Management Semester Long Project - Acuityjpupo2018
Acuity is an innovative learning app designed to transform the way you engage with knowledge. Powered by AI technology, Acuity takes complex topics and distills them into concise, interactive summaries that are easy to read & understand. Whether you're exploring the depths of quantum mechanics or seeking insight into historical events, Acuity provides the key information you need without the burden of lengthy texts.
Salesforce Integration for Bonterra Impact Management (fka Social Solutions A...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on integration of Salesforce with Bonterra Impact Management.
Interested in deploying an integration with Salesforce for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceIndexBug
Imagine a world where machines not only perform tasks but also learn, adapt, and make decisions. This is the promise of Artificial Intelligence (AI), a technology that's not just enhancing our lives but revolutionizing entire industries.
AI 101: An Introduction to the Basics and Impact of Artificial Intelligence
1. Reasoning and the Semantic Web
Hassan Aït-Kaci
ANR Chair of Excellence
Université Claude Bernard Lyon 1
Constraint Event-Driven Automated Reasoning Project
C E D A R
2. Reasoning and the Semantic Web
Outline
◮ Constraint Logic Programming
◮ What is unification?
◮ Semantic Web objects
◮ Graphs as constraints
◮ OWL and DL-based reasoning
◮ Constraint-based Semantic Web reasoning
◮ Recapitulation
4. Constraint Logic Programming
In Prolog, seen as a CLP language, a clause such as:

append([],L,L).
append([H|T],L,[H|R]) :- append(T,L,R).

is construed as:

append(X1,X2,X3) :- true
    | X1 = [], X2 = L, X3 = L.
append(X1,X2,X3) :- append(X4,X5,X6)
    | X1 = [H|T], X2 = L, X3 = [H|R],
      X4 = T, X5 = L, X6 = R.
5. Constraint Logic Programming Scheme
The CLP scheme requires a set R of relational symbols (or,
predicate symbols) and a constraint language L.
The constraint language L needs very little—not even syntax!:
◮ a set V of variables (denoted as capitalized X, Y, ...);
◮ a set Φ of formulae (denoted φ, φ′, ...) called constraints;
◮ a function VAR : Φ → 2^V, giving for every constraint φ the set VAR(φ) of variables constrained by φ;
◮ a family of interpretations A over some domain D^A;
◮ a set VAL(A) of valuations—total functions α : V → D^A.
6. Constraint Logic Programming Language
Given a set of relational symbols R (r, r1, . . .), a constraint
language L is extended into a language R(L) of constrained
relational clauses with:
◮ the set R(Φ) of formulae defined to include:
  – all formulae φ in Φ, i.e., all L-constraints;
  – all relational atoms r(X1, ..., Xn), where X1, ..., Xn ∈ V are mutually distinct;
  and closed under & (conjunction) and → (implication);
◮ extending an interpretation A of L by adding relations r^A ⊆ D^A × ... × D^A for each r ∈ R.
7. Constraint Logic Programming Clause
We define a CLP constrained definite clause in R(L) as:
r(X) ← r1(X1) & . . . & rm(Xm) [] φ,
where (0 ≤ m) and:
◮ r(X), r1(X1), . . . , rm(Xm) are relational atoms in R(L); and,
◮ φ is a constraint formula in L.
A constrained resolvent is a formula ̺ [] φ, where ̺ is a (possibly empty) conjunction of relational atoms r(X1, ..., Xn)—its relational part—and φ is a (possibly empty) conjunction of L-constraints—its constraint part.
8. Constraint Logic Programming Resolution
Constrained resolution is a reduction rule on resolvents that
gives a sound and complete interpreter for programs consisting of a set C of constrained definite R(L)-clauses.
The reduction of a constrained resolvent of the form:
B1 & . . . & r(X1, . . . , Xn) & . . . Bk [] φ
by the (renamed) program clause:
r(X1, . . . , Xn) ← A1 & . . . & Am [] φ′
is the new constrained resolvent of the form:
B1 & . . . & A1 & . . . & Am & . . . Bk [] φ & φ′.
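As an illustration, this reduction step can be sketched in Python. The encoding is mine, not the deck's: a resolvent is a pair (atom list, constraint list), atoms are tuples, constraints are opaque strings, and clause renaming is assumed to have already aligned the head's variables with the selected atom's.

```python
# Sketch of one constrained-resolution step: replace the selected relational
# atom by the clause's body atoms and conjoin the clause's constraint part.

def reduce_resolvent(resolvent, clause, index=0):
    """resolvent = (atoms, constraints); clause = (head, body_atoms, phi)."""
    atoms, constraints = resolvent
    head, body, phi = clause
    selected = atoms[index]
    # After renaming, the head and the selected atom share argument variables.
    assert selected[0] == head[0], "clause must define the selected relation"
    new_atoms = atoms[:index] + body + atoms[index + 1:]  # splice relational part
    return (new_atoms, constraints + phi)                 # conjoin constraint parts

# Reducing the resolvent `append(X1,X2,X3) [] (empty)` by the renamed second
# append clause from the earlier slide:
clause = (("append", "X1", "X2", "X3"),
          [("append", "X4", "X5", "X6")],
          ["X1 = [H|T]", "X2 = L", "X3 = [H|R]", "X4 = T", "X5 = L", "X6 = R"])
resolvent = ([("append", "X1", "X2", "X3")], [])
print(reduce_resolvent(resolvent, clause))
```

Checking satisfiability of the resulting constraint conjunction (its normalization) is deliberately left out; the sketch only shows how the relational and constraint parts are spliced.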
9. Why Constraints?
Some important points:
◮ But... wait a minute: “Constraints are logical formulae—so
why not use only logic?”
Indeed, constraints are logical formulae—and that is good!
But such formulae as factors in a conjunction commute
with other factors, thus freeing operational scheduling of
resolvents.
◮ A constraint is a formula solvable by a specific solving
algorithm rather than general-purpose logic-programming
machinery.
◮ Better: constraint solving remembers proven facts (proof
memoizing).
These are the key points exploited in CLP!
10. Constraint Solving—Constraint Normalization
Constraint solving is conveniently specified using constraint normalization rules, which are semantics-preserving, syntax-driven rewrite (meta-)rules.
Plotkin's SOS notation:

(n) Rule Name
      Prior Form
    ───────────────  if Condition
      Posterior Form

A normalization rule is said to be correct iff the prior form's denotation is equal to the posterior form's whenever the side condition holds.
11. Constraint Normalization—Declarative Coroutining
Normalizing a constraint yields a normal form: a constraint
formula that can’t be transformed by any normalization rule.
Such may be either the inconsistent constraint ⊥, or:
◮ a solved form—a normal form that can be immediately
deemed consistent; or,
◮ a residuated form—a normal form but not a solved form.
A residuated constraint is a suspended computation; shared
variables are inter-process communication channels: binding in one normalization process may trigger resumption of
another residuated normalization process.
Constraint residuation enables automatic coroutining!
13. What is unification?—First-order terms
The set TΣ,V of first-order terms is defined given:
◮ V, a countable set of variables;
◮ Σn, sets of constructors of arity n (n ≥ 0);
◮ Σ = ∪n≥0 Σn, the constructor signature.
Then, a first-order term (FOT) is either:
◮ a variable in V; or,
◮ an element of Σ0; or,
◮ an expression of the form f(t1, ..., tn), where n > 0, f ∈ Σn, and ti is a FOT for each i, 1 ≤ i ≤ n.
Examples of FOTs (variables are capitalized as in Prolog):
X        a        f(g(X, a), Y, h(X))
14. What is unification?—Substitutions & instances
A variable substitution is a map σ : V → TΣ,V such that the set {X ∈ V | σ(X) ≠ X} is finite.
Given a substitution σ and a FOT t, the σ-instance of t is the FOT:

tσ = σ(X)              if t = X ∈ V;
     a                 if t = a ∈ Σ0;
     f(t1σ, ..., tnσ)  if t = f(t1, ..., tn).

Unification is the process of solving an equation of the form:
t ≐ t′
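The three-case definition of tσ can be sketched in Python (an illustrative encoding of my own, not the deck's: uppercase strings stand for variables, lowercase strings for constants in Σ0, and tuples for compound terms):

```python
# FOTs encoded as Python values, following the Prolog convention that
# capitalized names are variables.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def apply_subst(sigma, t):
    """Return the sigma-instance of t, mirroring the three-case definition:
    sigma(X) for a variable, a for a constant, f(t1.sigma,...,tn.sigma) else."""
    if is_var(t):
        return sigma.get(t, t)      # variables outside sigma's domain map to themselves
    if isinstance(t, tuple):        # compound term: apply recursively to arguments
        return (t[0],) + tuple(apply_subst(sigma, arg) for arg in t[1:])
    return t                        # constant in Sigma_0

# The example FOT from the previous slide under {X -> b, Y -> a}:
t = ('f', ('g', 'X', 'a'), 'Y', ('h', 'X'))
print(apply_subst({'X': 'b', 'Y': 'a'}, t))   # ('f', ('g', 'b', 'a'), 'a', ('h', 'b'))
```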
15. What is unification?—FOT equation solving
A solution, if one exists, is any substitution σ such that:
tσ = t′σ
If solutions exist, there is always a minimal solution (the most general unifier): mgu(t, t′), where "σ1 is more general than σ2" iff ∃σ s.t. σ2 = σ1σ.
Equation and solution example:
f(g(X, b), X, g(h(X), Y)) ≐ f(g(U, U), b, g(V, a))
X = b, Y = a, U = b, V = h(b)
16. What is unification?—Algorithms
FOT unification algorithms have been (re-)invented:
◮ J. Herbrand (PhD thesis—page 148, 1930)
◮ J.A. Robinson (JACM 1965)
◮ A. Martelli & U. Montanari (ACM TOPLAS 1982)
But, rather than a monolithic algorithm, FOT unification is
simply expressible as a set of syntax-driven commutative
and terminating constraint normalization rules!
17. What is unification?—Constraint normalization rules
(1) Substitute
      φ & X ≐ t
    ──────────────────  if X occurs in φ
      φ[X/t] & X ≐ t

(2) Decompose
      φ & f(s1, ..., sn) ≐ f(t1, ..., tn)
    ─────────────────────────────────────  if f ∈ Σn (n ≥ 0)
      φ & s1 ≐ t1 & ... & sn ≐ tn

(3) Fail
      φ & f(s1, ..., sn) ≐ g(t1, ..., tm)
    ─────────────────────────────────────  if f ∈ Σn, g ∈ Σm, and (f ≠ g or m ≠ n)
      ⊥
18. What is unification?—Constraint normalization rules
(4) Flip
      φ & t ≐ X
    ─────────────  if X ∈ V and t ∉ V
      φ & X ≐ t

(5) Erase
      φ & t ≐ t
    ─────────────  if t ∈ Σ0 ∪ V
      φ

(6) Cycle
      φ & X ≐ t
    ─────────────  if X ∈ V, t ∉ V, and X occurs in t
      ⊥
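Taken together, rules (1) through (6) yield a unification procedure. Below is a minimal Python sketch in the same spirit (my own encoding again, not the deck's: uppercase strings are variables, tuples are compound terms, and a worklist of equations plays the role of the conjunction φ), exercised on the example equation from the earlier slide:

```python
# FOT unification driven by the six normalization rules above, in the style of
# Martelli & Montanari: a worklist of equations is rewritten until it is in
# solved form (a binding per variable) or fails.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def occurs(x, t):
    """Occurs check used by the Cycle rule."""
    return t == x or (isinstance(t, tuple) and any(occurs(x, a) for a in t[1:]))

def subst(sigma, t):
    """Apply sigma to t, following binding chains."""
    if is_var(t):
        return subst(sigma, sigma[t]) if t in sigma else t
    if isinstance(t, tuple):
        return (t[0],) + tuple(subst(sigma, a) for a in t[1:])
    return t

def unify(s, t):
    """Return a most general unifier as a dict, or None on failure."""
    eqs, sigma = [(s, t)], {}
    while eqs:
        lhs, rhs = eqs.pop()
        lhs, rhs = subst(sigma, lhs), subst(sigma, rhs)   # (1) Substitute
        if lhs == rhs:                                    # (5) Erase
            continue
        if not is_var(lhs) and is_var(rhs):               # (4) Flip
            lhs, rhs = rhs, lhs
        if is_var(lhs):
            if occurs(lhs, rhs):                          # (6) Cycle
                return None
            sigma[lhs] = rhs                              # record X = t
        elif (isinstance(lhs, tuple) and isinstance(rhs, tuple)
              and lhs[0] == rhs[0] and len(lhs) == len(rhs)):
            eqs.extend(zip(lhs[1:], rhs[1:]))             # (2) Decompose
        else:
            return None                                   # (3) Fail (clash)
    return {x: subst(sigma, v) for x, v in sigma.items()}

# The example equation from the earlier slide:
lhs = ('f', ('g', 'X', 'b'), 'X', ('g', ('h', 'X'), 'Y'))
rhs = ('f', ('g', 'U', 'U'), 'b', ('g', 'V', 'a'))
print(unify(lhs, rhs))   # binds X = b, Y = a, U = b, V = h(b)
```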
20. Semantic Web objects—Objects are labelled graphs!
[Figure: a labelled graph with two marriedPerson nodes, JohnDoe35 and JaneDoe78, linked by mutual spouse arcs; features include name ⇒ fullName (first, last), age (42 and 40), isVoter (true and false), and a shared address arc to a streetAddress node DoeResidence with number 123, street "Main Street", city "Sometown", country "USA".]
21. Semantic Web objects—Objects are labelled graphs!
JohnDoe35 : marriedPerson ( name    => fullName ( first => "John"
                                                , last  => "Doe" )
                          , age     => 42
                          , address => DoeResidence
                          , spouse  => JaneDoe78
                          , isVoter => true
                          )
22. Semantic Web objects—Objects are labelled graphs!
JaneDoe78 : marriedPerson ( name    => fullName ( first => "Jane"
                                                , last  => "Doe" )
                          , age     => 40
                          , address => DoeResidence
                          , spouse  => JohnDoe35
                          , isVoter => false
                          )

DoeResidence : streetAddress ( number  => 123
                             , street  => "Main Street"
                             , city    => "Sometown"
                             , country => "USA"
                             )
23. Semantic Web types—Types are labelled graphs!
[Figure: the corresponding type graph—two marriedPerson nodes, M1 and M2, linked by mutual spouse arcs; features include name ⇒ fullName (first, last : string), age : int, isVoter : boolean, and a shared address arc to a streetAddress node R with number : int and street, city, country : string.]
24. Semantic Web types—Types are labelled graphs!
M1 : marriedPerson ( name    => fullName ( first => string
                                         , last  => string )
                   , age     => int
                   , address => R
                   , spouse  => M2
                   , isVoter => boolean
                   )
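To illustrate how such an object graph relates to its type graph, here is a small Python sketch (my own encoding, not part of the deck; it checks feature-wise membership in base sorts and omits the subsort ordering):

```python
# An object and a type are both (sort, {feature: subterm}) pairs; object leaves
# are raw values, type leaves are base-sort names checked by predicates.

BASE = {"string": lambda v: isinstance(v, str),
        "int": lambda v: isinstance(v, int),
        "boolean": lambda v: isinstance(v, bool)}

def conforms(obj, typ):
    """Does the object graph fit the type graph, feature by feature?"""
    if isinstance(typ, str):        # base sort at a leaf
        return BASE[typ](obj)
    tsort, tfeatures = typ
    # (a full checker would also compare the object's sort with tsort
    #  in the sort ordering; that is elided here)
    osort, ofeatures = obj
    return all(f in ofeatures and conforms(ofeatures[f], sub)
               for f, sub in tfeatures.items())

# A fragment of JohnDoe35 checked against a fragment of the M1 type:
john = ("marriedPerson", {"name": ("fullName", {"first": "John", "last": "Doe"}),
                          "age": 42, "isVoter": True})
m1 = ("marriedPerson", {"name": ("fullName", {"first": "string", "last": "string"}),
                        "age": "int", "isVoter": "boolean"})
print(conforms(john, m1))   # True
```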
25. Semantic Web formalisms—Types are labelled graphs!
M2 : marriedPerson ( name    => fullName ( first => string
                                         , last  => string )
                   , age     => int
                   , address => R
                   , spouse  => M1
                   , isVoter => boolean
                   )

R : streetAddress ( number  => int
                  , street  => string
                  , city    => string
                  , country => string
                  )
28. Graphs as constraints—Motivation
◮ What: a formalism for representing objects that is:
intuitive (objects as labelled graphs), expressive (“real-life” data
models), formal (logical semantics), operational (executable), &
efficient (constraint-solving)
◮ Why? viz., ubiquitous use of labelled graphs to structure
information naturally as in:
– object-orientation, knowledge representation,
– databases, semi-structured data,
– natural language processing, graphical interfaces,
– concurrency and communication,
– XML, RDF, the “Semantic Web,” etc., ...
29. Graphs as constraints—History
Viewing graphs as constraints stems from the work of:
◮ Hassan Aït-Kaci (since 1983)
◮ Gert Smolka (since 1986)
◮ Andreas Podelski (since 1989)
◮ Franz Baader, Rolf Backofen, Jochen Dörre, Martin Emele, Bernhard Nebel, Joachim Niehren, Ralf Treinen, Manfred Schmidt-Schauß, Rémi Zajac, ...
30. Graphs as constraints—Inheritance as graph endomorphism
[Figure: labelled feature graphs for the sorts person (features name ⇒ first, last : string; id) and married person (adding a spouse feature); inheritance is depicted as a graph endomorphism between the two graphs.]
32. Graphs as constraints—OSF term syntax
Let V be a countable set of variables, and S a lattice of sorts.
An OSF term is an expression of the form:
X : s(ℓ1 ⇒ t1, . . . , ℓn ⇒ tn)
where:
◮ X ∈ V is the root variable
◮ s ∈ S is the root sort
◮ n ≥ 0 (if n = 0, we write X : s)
◮ {ℓ1, . . . , ℓn } ⊆ F are features
◮ t1, . . . , tn are OSF terms
33. Graphs as constraints—OSF term syntax example
X : person(name ⇒ N : ⊤(first ⇒ F : string),
           name ⇒ M : id(last ⇒ S : string),
           spouse ⇒ P : person(name ⇒ I : id(last ⇒ S : ⊤),
                               spouse ⇒ X : ⊤)).

Lighter notation (showing only shared variables):

X : person(name ⇒ ⊤(first ⇒ string),
           name ⇒ id(last ⇒ S : string),
           spouse ⇒ person(name ⇒ id(last ⇒ S),
                           spouse ⇒ X)).
34. Graphs as constraints—OSF clause syntax
An OSF constraint is one of:
◮ X : s
◮ X.ℓ ≐ X′
◮ X ≐ X′
where X (X′) is a variable (i.e., a node), s is a sort (i.e., a node's type), and ℓ is a feature (i.e., an arc).
An OSF clause is a conjunction of OSF constraints—i.e., a set of OSF constraints:
φ1 & ... & φn
35. Graphs as constraints—From OSF terms to OSF clauses
An OSF term t = X : s(ℓ1 ⇒ t1, ..., ℓn ⇒ tn) is dissolved into an OSF clause ϕ(t) as follows:

ϕ(t) =DEF X : s & X.ℓ1 ≐ X1 & ... & X.ℓn ≐ Xn
                & ϕ(t1) & ... & ϕ(tn)

where X1, ..., Xn are the root variables of t1, ..., tn.
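Dissolution is easy to mechanize. A Python sketch follows (my own encoding, not the deck's: an OSF term is a (root, sort, features) triple; duplicate features, as in the deck's example, are not modeled since a dict is used):

```python
# Flatten a nested OSF term into the conjunction of its atomic constraints:
# a sort constraint X : s per node and a feature constraint X.l = X' per arc.

def dissolve(term):
    """Return the OSF clause phi(term) as a list of atomic constraints."""
    root, sort, features = term
    constraints = [("sort", root, sort)]                      # X : s
    for label, sub in features.items():
        constraints.append(("feature", root, label, sub[0]))  # X.l = X'
        constraints += dissolve(sub)                          # & phi(t_i)
    return constraints

# X : person(name => N : id(last => S : string), spouse => P : person)
t = ("X", "person",
     {"name": ("N", "id", {"last": ("S", "string", {})}),
      "spouse": ("P", "person", {})})
for c in dissolve(t):
    print(c)
```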
36. Graphs as constraints—Example of OSF term dissolution
t = X : person(name ⇒ N : ⊤(first ⇒ F : string),
               name ⇒ M : id(last ⇒ S : string),
               spouse ⇒ P : person(name ⇒ I : id(last ⇒ S : ⊤),
                                   spouse ⇒ X : ⊤))

ϕ(t) = X : person
     & X.name ≐ N      & N : ⊤
     & X.name ≐ M      & M : id
     & X.spouse ≐ P    & P : person
     & N.first ≐ F     & F : string
     & M.last ≐ S      & S : string
     & P.name ≐ I      & I : id
     & I.last ≐ S      & S : ⊤
     & P.spouse ≐ X    & X : ⊤
37. Graphs as constraints—Basic OSF term normalization
(1) Sort Intersection
      φ & X : s & X : s′
    ──────────────────────
      φ & X : s ∧ s′

(2) Inconsistent Sort
      φ & X : ⊥
    ──────────────────────
      X : ⊥

(3) Variable Elimination
      φ & X ≐ X′
    ──────────────────────  if X ≠ X′ and X ∈ Var(φ)
      φ[X′/X] & X ≐ X′

(4) Feature Functionality
      φ & X.ℓ ≐ X′ & X.ℓ ≐ X′′
    ───────────────────────────
      φ & X.ℓ ≐ X′ & X′ ≐ X′′
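The four rules can be sketched operationally in Python. Everything here is an illustrative assumption rather than the deck's implementation: the sort lattice is a hard-coded GLB table over the student/employee/intern sorts of the following slides, and Variable Elimination is realized with a small union-find map.

```python
# Basic OSF normalization over dissolved constraints of the forms
# ('sort', X, s), ('feature', X, l, Y), and ('eq', X, Y).

GLB = {frozenset(["employee", "student"]): "intern"}   # toy lattice table

def glb(s1, s2):
    """Greatest lower bound in the toy sort lattice; '_|_' is bottom."""
    if s1 == s2:
        return s1
    return GLB.get(frozenset([s1, s2]), "_|_")

def find(uf, x):
    """Representative of x's equivalence class."""
    while uf.get(x, x) != x:
        x = uf[x]
    return x

def normalize(constraints):
    """Return (sorts, features) in solved form, or None if inconsistent."""
    uf, sorts, features = {}, {}, {}
    pending = list(constraints)
    while pending:
        c = pending.pop()
        if c[0] == "sort":                               # (1) Sort Intersection
            x, s = find(uf, c[1]), c[2]
            s = glb(sorts.get(x, s), s)
            if s == "_|_":                               # (2) Inconsistent Sort
                return None
            sorts[x] = s
        elif c[0] == "feature":                          # (4) Feature Functionality
            x, l, y = find(uf, c[1]), c[2], find(uf, c[3])
            if (x, l) in features and find(uf, features[(x, l)]) != y:
                pending.append(("eq", features[(x, l)], y))
            elif (x, l) not in features:
                features[(x, l)] = y
        elif c[0] == "eq":                               # (3) Variable Elimination
            x, y = find(uf, c[1]), find(uf, c[2])
            if x != y:
                uf[x] = y                                # merge the two classes
                if x in sorts:                           # carry x's sort over to y
                    pending.append(("sort", y, sorts.pop(x)))
                for (z, l) in list(features):            # re-root x's features at y
                    if z == x:
                        pending.append(("feature", y, l, features.pop((z, l))))
    return sorts, {(x, l): find(uf, y) for (x, l), y in features.items()}

# X.advisor = Y & X.advisor = Z & Y : student & Z : employee
# forces Y = Z, whose sort becomes glb(student, employee) = intern:
print(normalize([("feature", "X", "advisor", "Y"),
                 ("feature", "X", "advisor", "Z"),
                 ("sort", "Y", "student"),
                 ("sort", "Z", "employee")]))
```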
38. Graphs as constraints—OSF unification as OSF constraint normalization
[Figure: a sort lattice over person, employee, student, faculty, staff, and intern, with individuals bob, piotr, pablo, simon, elena, art, judy, don, john, and sheila at the bottom.]
39. Graphs as constraints—OSF unification as OSF constraint normalization
X : student(roommate => person(rep => E : employee),
            advisor => don(secretary => E))
&
Y : employee(advisor => don(assistant => A),
             roommate => S : student(rep => S),
             helper => simon(spouse => A))
&
X = Y
40. Graphs as constraints—OSF unification as OSF constraint normalization
X : intern(roommate => S : intern(rep => S),
           advisor => don(assistant => A,
                          secretary => S),
           helper => simon(spouse => A))
&
X = Y
&
E = S
41. Graphs as constraints—Extended OSF terms
Basic OSF terms may be extended to express:
◮ Non-lattice sort signatures
◮ Disjunction
◮ Negation
◮ Partial features
◮ Extensional sorts (i.e., denoting elements)
◮ Relational features (a.k.a. "roles")
◮ Aggregates (à la monoid comprehensions)
◮ Regular-expression feature paths
◮ Sort definitions (a.k.a. "OSF theories"—"ontologies")
42. Order-sorted featured graph constraints—(Summary)
We have overviewed a formalism of objects where:
◮ "real-life" objects are viewed as logical constraints
◮ objects may be approximated as set-denoting constructs
◮ object normalization rules provide an efficient operational semantics
◮ consistency extends unification (and thus matching)
◮ this enables rule-based computation (whether rewrite or logical rules) over general graph-based objects
◮ this yields a powerful means for effectively using ontologies
44. Semantic Web formalisms—OWL speaks
What language(s) do OWLs speak?—a confusing, growing crowd of strange-sounding languages and logics:
• OWL, OWL Lite, OWL DL, OWL Full
• DL, DLR, ...
• AL, ALC, ALCN, ALCNR, ...
• SHIF, SHIN, CIQ, SHIQ, SHOQ(D), SHOIQ, SRIQ, SROIQ, ...
Depending on whether the system allows:
• concepts, roles (inversion, composition, inclusion, ...)
• individuals, datatypes, cardinality constraints
• various combinations thereof
45. Semantic Web formalisms—DL dialects
For better or worse, the W3C has married its efforts to DL-based reasoning systems:
◮ All the proposed DL knowledge base formalisms in the OWL family use tableaux-based methods for reasoning
◮ Tableaux methods work by building models explicitly via formula-expansion rules
◮ This limits DL reasoning to finite (i.e., decidable) models
◮ Worse, tableaux methods only work for small ontologies: they fail to scale up to large ontologies
46. Semantic Web formalisms—DL dialects
Tableaux-style DL reasoning (ALCNR):

Conjunctive Concept:
  S −→ S ∪ {x : C1, x : C2}
  if x : (C1 ⊓ C2) ∈ S and {x : C1, x : C2} ⊄ S

Disjunctive Concept:
  S −→ S ∪ {x : Ci}
  if x : (C1 ⊔ C2) ∈ S and x : Ci ∉ S (i = 1, 2)

Existential Role:
  S −→ S ∪ {x Ri y}_{i=1..m} ∪ {y : C}
  if x : (∃R.C) ∈ S s.t. R =DEF R1 ⊓ ... ⊓ Rm
  and there is no z s.t. z : C ∈ S and z ∈ R^S[x]
  and y is new

Universal Role:
  S −→ S ∪ {y : C}
  if x : (∀R.C) ∈ S and y ∈ R^S[x] and y : C ∉ S

Min Cardinality:
  S −→ S ∪ {x Ri yj}_{i=1..m, j=1..n} ∪ {yi ≠ yj}_{1≤i<j≤n}
  if x : (≥ n.R) ∈ S s.t. R =DEF R1 ⊓ ... ⊓ Rm
  and x does not already have n distinct successors in R^S[x]
  and each yi is new (1 ≤ i ≤ n)

Max Cardinality:
  S −→ S[y/z]
  if x : (≤ n.R) ∈ S and |R^S[x]| > n
  and y, z ∈ R^S[x] with y ≠ z ∉ S
47. Understanding OWL speak—OSF vs. DL
Understanding OWL amounts to reasoning with knowledge
expressed as OWL sentences. Its DL semantics relies on
explicitly building models using induction.
ergo:
Inductive techniques are eager and (thus) wasteful
Reasoning with knowledge expressed as constrained (OSF)
graphs relies on implicitly pruning inconsistent elements using coinduction.
ergo:
Coinductive techniques are lazy and (thus) thrifty
49. LIF E—Rules + constraints for Semantic Web reasoning
LIFE—Logic, Inheritance, Functions, and Equations
CLP(χ)—Constraint Logic Programming parameterized over a constraint system χ
LIFE is a CLP system over OSF constraints and functions over them (rewrite rules); namely:
LIFE = CLP(OSF + FP)
50. LIFE—Rules + constraints for Semantic Web reasoning
[Figure: a multiple-inheritance hierarchy with adultPerson at the top,
employee and marriedPerson as its subsorts, and richEmployee and
marriedEmployee below employee, marriedEmployee also below marriedPerson]
49
51. The same hierarchy in Java
// note: shown schematically; Java interfaces cannot declare instance
// fields, so these members would really be accessor methods
interface adultPerson {
name id;
date dob;
int age;
String ssn;
}
interface employee extends adultPerson {
title position;
String institution;
employee supervisor;
int salary;
}
interface marriedPerson extends adultPerson {
marriedPerson spouse;
}
interface marriedEmployee extends employee, marriedPerson {
}
interface richEmployee extends employee {
}
50
52. The same hierarchy in LIFE
employee <: adultPerson.
marriedPerson <: adultPerson.
richEmployee <: employee.
marriedEmployee <: employee.
marriedEmployee <: marriedPerson.

:: adultPerson ( id ⇒ name,
                 dob ⇒ date,
                 age ⇒ int,
                 ssn ⇒ string ).

:: employee ( position ⇒ title,
              institution ⇒ string,
              supervisor ⇒ employee,
              salary ⇒ int ).

:: marriedPerson ( spouse ⇒ marriedPerson ).
51
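The subsort declarations above are what drive OSF unification: unifying two sorts means taking their greatest lower bound (glb) in the hierarchy. A small Python sketch of this, where the dictionary encoding and function names are this write-up's assumptions rather than LIFE's actual machinery:

```python
# Hypothetical encoding of the LIFE declarations above:
# each sort maps to its declared supersorts (s <: t).
SUBSORT = {
    "employee": {"adultPerson"},
    "marriedPerson": {"adultPerson"},
    "richEmployee": {"employee"},
    "marriedEmployee": {"employee", "marriedPerson"},
}

def supersorts(s):
    """All sorts above s: reflexive-transitive closure of <:."""
    ups = {s}
    for t in SUBSORT.get(s, ()):
        ups |= supersorts(t)
    return ups

def glb(s, t):
    """Most general sort below both s and t (assumed unique here)."""
    below = [u for u in set(SUBSORT) | {"adultPerson"}
             if s in supersorts(u) and t in supersorts(u)]
    # keep the candidate(s) that no other candidate sits strictly above
    greatest = [u for u in below
                if not any(v != u and v in supersorts(u) for v in below)]
    return greatest[0] if greatest else None

g = glb("employee", "marriedPerson")   # → "marriedEmployee"
```

Unifying an employee with a marriedPerson thus lands exactly on marriedEmployee, the sort the hierarchy declares below both.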
53. A relationally and functionally constrained LIFE sort hierarchy

:: P : adultPerson ( id ⇒ name,
                     dob ⇒ date,
                     age ⇒ A : int,
                     ssn ⇒ string )
   | A = ageInYears(P ), A ≥ 18.

:: employee ( position ⇒ T : title,
              institution ⇒ string,
              supervisor ⇒ E : employee,
              salary ⇒ S : int )
   | higherRank(E.position, T ) , E.salary ≥ S.
52
54. A relationally and functionally constrained LIFE sort hierarchy

:: M : marriedPerson ( spouse ⇒ P : marriedPerson )
   | P.spouse = M.

:: R : richEmployee ( institution ⇒ I,
                      salary ⇒ S )
   | stockValue(I) = V , hasShares(R, I, N ) , S + N ∗ V ≥ 200000.
53
55. Proof “memoizing”
OSF constraints as syntactic variants of logical formulae:
Sorts are unary predicates:    X : s ⇐⇒ [[s]]([[X]])
Features are unary functions:  X.f ≐ Y ⇐⇒ [[f]]([[X]]) = [[Y]]
Coreferences are equations:    X ≐ Y ⇐⇒ [[X]] = [[Y]]
So . . .
Why not use (good old) logic proofs instead?
54
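The three correspondences above can be rendered mechanically. Here is a hypothetical Python sketch that prints the first-order reading of an OSF constraint conjunction; the tuple encoding is an assumption of this sketch, not LIFE syntax:

```python
# Render an OSF constraint conjunction as its first-order reading:
# sorts as unary predicates, features as unary functions,
# coreferences as equations.
def translate(constraints):
    out = []
    for c in constraints:
        if c[0] == "sort":        # X : s     ==>  s(X)
            out.append(f"{c[2]}({c[1]})")
        elif c[0] == "feat":      # X.f = Y   ==>  f(X) = Y
            out.append(f"{c[2]}({c[1]}) = {c[3]}")
        elif c[0] == "eq":        # X = Y     ==>  X = Y
            out.append(f"{c[1]} = {c[2]}")
    return " & ".join(out)

formula = translate([("sort", "X", "employee"),
                     ("feat", "X", "spouse", "Y"),
                     ("sort", "Y", "marriedPerson")])
# → "employee(X) & spouse(X) = Y & marriedPerson(Y)"
```

The translation is a syntactic bijection, which is precisely why the question "why not use (good old) logic proofs instead?" arises; the next slide answers it.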
56. Proof “memoizing”
But:
model equivalence ≠ proof equivalence!
◮ OSF-unification proves sort constraints by reducing them
monotonically w.r.t. the sort ordering
◮ ergo, once X : s has been proven, the proof of s(X) is
recorded as the sort “s” itself!
◮ if, further down a proof, X : s must be proven again, it is
already remembered as X’s binding
◮ Indeed, OSF constraint proof rules ensure that:
no type constraint is ever proved twice
55
57. Proof “memoizing”
OSF type constraints are incrementally “memoized” as they
are verified:
sorts act as (instantaneous!) proof caches!
whereas in plain logic programming (e.g., Prolog), having
proven s(X) is not “remembered” in any way
Example: consider the OSF constraint conjunction:
• X : adultPerson(age ⇒ 25),
• X : employee,
• X : marriedPerson(spouse ⇒ Y ).
Notation: type#(condition) means “constraint condition
attached to sort type”
56
59. Proof “memoizing”
1. proving: X : adultPerson(age ⇒ 25) . . .
2. proving: adultPerson#(X.age ≥ 18) . . .
3. proving: X : employee . . .
4. proving: employee#(higherRank(E.position, T )) . . .
5. proving: employee#(E.salary ≥ S) . . .
6. proving: X : marriedPerson(spouse ⇒ Y ) . . .
7. proving: X : marriedEmployee(spouse ⇒ Y ) . . .
8. proving: marriedEmployee#(Y.spouse = X) . . .
Therefore, all other inherited conditions coming from a
sort greater than marriedEmployee (such as employee or
adultPerson) can be safely ignored!
58
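The effect of the trace above can be sketched in Python: a variable's binding carries its current sort, so a sort constraint already entailed by that binding costs only a subsort test, never a second proof. The class, counter, and the simplified refinement (plain assignment instead of a true glb) are this write-up's illustration, not LIFE's implementation:

```python
# Sort -> all its strict supersorts, per the hierarchy in the slides.
SUPER = {
    "marriedEmployee": {"employee", "marriedPerson", "adultPerson"},
    "richEmployee": {"employee", "adultPerson"},
    "employee": {"adultPerson"},
    "marriedPerson": {"adultPerson"},
}

class Var:
    def __init__(self):
        self.sort = "T"     # top sort: no information yet
        self.proofs = 0     # non-trivial proof steps actually performed

    def prove(self, s):
        """Prove X : s, memoizing the result in the binding."""
        if self.sort == s or s in SUPER.get(self.sort, ()):
            return True     # entailed by the memoized sort: free
        self.proofs += 1    # one (and only one) real proof step
        self.sort = s       # the refined sort IS the proof cache
        return True

X = Var()
X.prove("employee")         # real proof; binding becomes employee
X.prove("marriedEmployee")  # real proof; binding refined
X.prove("employee")         # free: remembered via the sort ordering
X.prove("adultPerson")      # free: inherited, never re-proved
# X.proofs → 2
```

Four sort constraints, two real proofs: the inherited conditions from employee and adultPerson are discharged by the binding alone, mirroring the slide's claim that no type constraint is ever proved twice.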
60. Proof “memoizing”
This “memoizing” property of OSF constraint-solving enables:
using rules over ontologies
as well as, conversely ,
enhancing ontologies with rules
Indeed, with OSF:
◮ concept ontologies may be used as constraints by
rules for inference and computation
◮ rule-based conditions in concept definitions may be
used to magnify expressivity of ontologies thanks to the
proof-memoizing property of ordered sorts
59
61. Reasoning and the Semantic Web
Outline
◮ Constraint Logic Programming
◮ What is unification?
◮ Semantic Web objects
◮ Graphs as constraints
◮ OWL and DL-based reasoning
◮ Constraint-based Semantic Web
reasoning
◮ Recapitulation
60
62. Recapitulation—what you must remember from this talk. . .
◮ Objects are graphs
◮ Graphs are constraints
◮ Constraints are good: they provide both formal theory
and efficient processing
◮ Formal Logic is not all there is
◮ even so: model theory ≠ proof theory
◮ indeed, due to its youth, much of W3C technology is often
naïve in conception and design
Ergo. . . it is condemned to reinventing [square!] wheels
as long as it does not realize that such issues have been
studied in depth for the past 50 years in theoretical CS!
61
63. Recapitulation—what you must remember from this talk. . . (ctd)
Pending issues re. “ontological programming”
◮ Syntax:
– What’s essential?
– What’s superfluous?
Confusing notation : XML-based cluttered verbosity
ok, not for human consumption—but still!
◮ Semantics:
– What’s a model good for?
– What’s (efficiently) provable?
– decidable ≠ efficient
– undecidable ≠ inefficient
◮ Applications, maintenance, evolution, etc.
◮ Many, many publications... but no (real) field testing as
yet!
62
64. Recapitulation—what you must remember from this talk. . . (ctd)
Proposal: take heed of the following facts:
◮ Linked data represents all information as interconnected
sorted labelled RDF graphs—it has become a universal
de facto knowledge model standard
◮ Differences between DL and OSF can come in handy:
– DL is expansive—therefore, expensive—and can only
describe finitely computable sets; whereas,
– OSF is contractive—therefore, efficient—and can also
describe recursively-enumerable sets
◮ CLP-based graph unification reasoning = practical KR:
– structural: objects, classes, inheritance
– non-structural: path equations, relational constraints,
type definitions
63
65. Innovation takes courage. . . (from Martin Wildberger’s “Smarter Planet” Keynote, CASCON 2009)
If I’d asked my customers what they wanted,
they’d have said a faster horse!—Henry Ford
64
66. Thank You For Your Attention !
For more information:
hak@acm.org
http://cs.brown.edu/people/pvh/CPL/Papers/v1/hak.pdf
http://cedar.liris.cnrs.fr
C E D A R
65