Analogy is one of the most studied members of a family of non-classical forms of reasoning that operate across different domains, and it is usually taken to play a crucial role in creative thought and problem-solving. In the first part of the talk, I will briefly introduce the general principles of computational analogy models, relying on a generalization-based approach to analogy-making. We will then take a closer look at Heuristic-Driven Theory Projection (HDTP) as an example of a theoretical framework and implemented system: HDTP computes analogical relations and inferences for domains represented in many-sorted first-order logic, applying a restricted form of higher-order anti-unification to find structural elements shared by both domains. The presentation of the framework will be followed by a few reflections on the "cognitive plausibility" of the approach, motivated by theoretical complexity and tractability considerations.
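The anti-unification step at the heart of this approach can be illustrated in miniature. HDTP uses a restricted form of higher-order anti-unification over many-sorted first-order theories; the sketch below implements only the plain first-order core (least general generalization of two terms), with an invented term encoding (tuples for compound terms, '?'-prefixed strings for variables), not HDTP's actual data structures or API.

```python
# First-order anti-unification sketch: compute a least general
# generalization (lgg) of two terms. Compound terms are tuples
# (functor, arg1, ..., argn); constants are plain strings; generated
# variables start with '?'. Repeated mismatching pairs reuse the same
# variable, which is what makes the result *least* general.

def lgg(s, t, table=None):
    if table is None:
        table = {}          # maps a mismatching pair (s, t) to its variable
    if s == t:
        return s            # identical subterms stay as they are
    if (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s) == len(t)):
        # same functor and arity: descend into the arguments
        return (s[0],) + tuple(lgg(a, b, table) for a, b in zip(s[1:], t[1:]))
    key = (s, t)
    if key not in table:
        table[key] = f"?X{len(table)}"   # fresh variable for a new mismatch
    return table[key]

print(lgg(("dist", "sun", "planet"), ("dist", "nucleus", "electron")))
# → ('dist', '?X0', '?X1')
```

Applied to the classic solar-system/atom analogy, dist(sun, planet) and dist(nucleus, electron) generalize to dist(?X0, ?X1), making the shared relational structure explicit.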
In the second part of the talk I will discuss an application of HDTP to modeling essential parts of concept blending processes, a current "hot topic" in Cognitive Science. Here, I will sketch an analogy-inspired formal account of concept blending, developed in the European FP7-funded Concept Invention Theory (COINVENT) project, which combines HDTP with mechanisms from Case-Based Reasoning.
This document discusses logics of context and modal type theories. It begins by providing some background and caveats. It then presents a motivating example about reasoning about claims within a report. The document discusses tasks involving contextual structure and reasoning across contexts. It advocates for using proof theory and natural deduction systems when designing logics of context. It presents some approaches to modeling contexts and modality, including McCarthy's original ideas. It discusses properties that are important for logics of context, such as normalization. It provides overviews of some existing logics of context and compares their properties and limitations.
Lecture 2: From Semantics To Semantic-Oriented Applications, by Marina Santini
From the "Natural Language Processing" LinkedIn group:
John Kontos, Professor of Artificial Intelligence
"I wonder whether translating into formal logic is nothing more than transliteration which simply isolates the part of the text that can be reasoned upon using the simple inference mechanism of formal logic. The real problem, I think, lies with the part of the text that CANNOT be translated on the one hand, and the part that changes its meaning due to civilization advances on the other. My own proposal is to leave NL text alone and try building inference mechanisms for the UNTRANSLATED text depending on the task requirements.
All the best
John"
Extending the knowledge level of cognitive architectures with Conceptual Spac..., by Antonio Lieto
Extending the knowledge level of cognitive architectures with Conceptual Spaces (+ a case study with Dual-PECCS: a hybrid knowledge representation system for common sense reasoning). Talk given at Stockholm, September 2016.
The document discusses categorical semantics for explicit substitutions. It begins by motivating the need for categorical semantics of syntactic calculi to provide mathematical models and ensure correctness. It then discusses different categorical structures that can provide semantics for calculi with explicit substitutions, including indexed categories, context-handling categories, and E-categories/L-categories. These categorical models impose equations on explicit substitutions that correspond to the intended behavior. The document also discusses how additional type structures like functions, tensors, and the exponential/bang type can be modeled using these categorical structures. Overall, the document advocates for the use of category theory to guide the design of calculi with explicit substitutions and ensure their semantics are well-behaved.
This document discusses the computation of presuppositions and entailments from natural language text. It begins by defining presuppositions and entailments, and explaining how they can be computed using tree transformations on semantic representations. The paper then provides examples of elementary presuppositions and entailments. It describes a system that computes presuppositions and entailments while parsing sentences using an augmented transition network. The system applies tree transformations specified in the lexicon to the semantic representation to derive inferences. The paper concludes that presuppositions and entailments exhibit computational properties not shown by the general class of inferences, such as being tied to the semantic and syntactic structure of language.
How the philosophy of mathematical practice can be logic by other means (bris..., by Brendan Larvor
The document discusses the author's view that informal proofs in mathematics depend on both logical form and content. The author argues that logic should be understood as the study of inferential actions, which can incorporate content and representations. This broader view of logic facilitates connecting logical questions about rigor to the study of mathematical cultures and practices, since logical constraints are enacted as cultural norms. The author claims this approach is needed to address shortcomings in using formal logic to model mathematical proof and to utilize studies of specific mathematical practices.
The document discusses the history and development of ontologies. It begins with definitions of key terms like ontology, vocabulary, and taxonomy. It then provides a brief history of ontologies dating back to ancient Greek philosophers. The document also discusses how ontologies are used in computer science to formally represent domain knowledge. It provides examples of ontologies in fields like medicine, commerce, and the semantic web. Finally, it discusses best practices for building ontologies, such as reusing existing terms and collaborating with domain experts and end users.
Supermathematics and Artificial General Intelligence, by Jordan Bennett
In a clear way, I outline how Supermathematics may apply in Artificial General Intelligence.
I describe standard Super-Hamiltonian usage, with respect to Dwave's "Quantum Boltzmann Machine".
1. The document discusses the need for a positive account of informal proof in mathematics, as most mathematical proofs are informal. It argues against the view that informal proofs are recipes for formal derivations.
2. The document proposes that logic should be understood more broadly as the general study of inferential actions, as informal proofs often involve actions on mathematical objects beyond propositions. Examples of such actions include diagram manipulation in Euclidean geometry.
3. The document reviews work that may support this broader view of logic in informal proofs, such as studies of reasoning with diagrams in knot theory and using Cayley graphs to prove group theory results.
How to Ground A Language for Legal Discourse In a Prototypical Perceptual Sem..., by L. Thorne McCarty
Slides for my talk at the 15th International Conference on Artificial Intelligence and Law (ICAIL 2015), June 11, 2015.
The full ICAIL 2015 paper is available on ResearchGate at bit.ly/1qCnLJq.
The document discusses constructive description logics and provides three options for constructing description logics constructively:
1) Translating description logic syntax into intuitionistic first-order logic (IFOL) to obtain the logic IALC.
2) Translating description logic syntax into intuitionistic modal logic (IK) to obtain the logic iALC.
3) Translating description logic syntax into constructive modal logic (CK) to obtain the logic cALC.
The talk outlines the translation approaches and discusses some pros and cons of the different constructive description logics, but notes that the work is preliminary and more criteria are needed to identify the best constructive system(s).
Conceptual Spaces for Cognitive Architectures: A Lingua Franca for Different ..., by Antonio Lieto
We claim that Conceptual Spaces offer a lingua franca that makes it possible to unify and generalize many aspects of the symbolic, sub-symbolic, and diagrammatic approaches (by overcoming some of their typical problems) and to integrate them on a common ground. In doing so we extend and detail some of the arguments explored by Gärdenfors [23] in defense of the need for a conceptual, intermediate representation level between the symbolic and the sub-symbolic ones. Additionally, we argue that Conceptual Spaces could offer a unifying framework for interpreting many kinds of diagrammatic and analogical representations. As a consequence, their adoption could also favor the integration of diagrammatic representation and reasoning in Cognitive Architectures.
An Approach to Automated Learning of Conceptual Graphs from Text, by Fulvio Rotella
Many document collections are private and accessible only by selected people. Especially in business settings, such collections need to be managed, and an external taxonomic or ontological resource would be very useful for this. Unfortunately, domain-specific resources are very often unavailable, so techniques that do not rely on external resources become essential.
Automated learning of conceptual graphs from restricted collections needs to be robust with respect to missing or partial knowledge, which prevents extraction of a full conceptual graph and yields only sparse fragments of it. This work proposes a way to deal with these problems by applying relational clustering and generalization methods. While clustering collects similar concepts, generalization provides additional nodes that can bridge separate pieces of the graph while expressing it at a higher level of abstraction. In this process, considering relational information allows a broader perspective in the similarity assessment for clustering, and ensures more flexible and understandable descriptions of the generalized concepts. The final conceptual graph can be used for better analyzing and understanding the collection, and for reasoning over it.
This document provides an introduction and overview of 5 papers related to topic modeling techniques. It begins with introducing the speaker and their research interests in text analysis using topic modeling. It then lists the 5 papers that will be discussed: LSA, pLSI, LDA, Gaussian LDA, and criticisms of topic modeling. The document focuses on summarizing each paper's motivation, key points, model, parameter estimation methods, and deficiencies. It provides high-level summaries of key aspects of influential topic modeling papers to introduce the topic.
Rethinking Critical Editions of Fragments by Ontologies, by Matteo Romanello
This document discusses rethinking the representation of fragmentary classical texts in digital editions through the use of ontologies. It addresses problems with current editions, such as duplication of text. The authors analyze the domain to identify concepts like fragments as interpretations linked to evidence. They design an ontology with classes for interpretations, textual passages, and linking fragments to witness texts. The benefits cited include a solid architecture separating texts from interpretations, formalization of the domain, and improved data interoperability.
The spread and abundance of electronic documents requires automatic techniques for extracting useful information from the text they contain. The availability of conceptual taxonomies can be of great help, but manually building them is a complex and costly task. Building on previous work, we propose a technique to automatically extract conceptual graphs from text and reason with them. Since automated learning of taxonomies needs to be robust with respect to missing or partial knowledge and flexible with respect to noise, this work proposes a way to deal with these problems. The case of poor data and sparse concepts is tackled by finding generalizations among disjoint pieces of knowledge. Noise is handled by introducing soft relationships among concepts rather than hard ones, and by applying a probabilistic inferential setting. In particular, we propose to reason on the extracted graph using different kinds of relationships among concepts, where each arc/relationship is associated with a number that represents its likelihood among all possible worlds, and to address the problem of sparse knowledge by using generalizations among distant concepts as bridges between disjoint portions of knowledge.
Introduction to Distributional Semantics, by Andre Freitas
This document provides an introduction to distributional semantics. It discusses how distributional semantic models (DSMs) represent word meanings as vectors based on their linguistic contexts in large corpora. This distributional hypothesis states that words that appear in similar contexts tend to have similar meanings. The document outlines how DSMs are built, important parameters like context type and weighting, and examples like latent semantic analysis. It also discusses how DSMs can support applications like semantic search. Finally, it introduces how compositional semantics explores representing the meanings of phrases and sentences compositionally based on the meanings of their parts.
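The core idea of a DSM can be shown in a few lines: count the contexts each word occurs in, then compare words by the cosine of their context vectors. The toy corpus and window size below are arbitrary illustrative choices, not parameters from the document.

```python
# Toy distributional semantic model: co-occurrence counts within a
# symmetric window, compared by cosine similarity.
from collections import defaultdict
import math

corpus = [
    "the cat chased the mouse".split(),
    "the dog chased the cat".split(),
    "the mouse ate the cheese".split(),
]

window = 2
vectors = defaultdict(lambda: defaultdict(int))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                vectors[w][sent[j]] += 1   # count each context word

def cosine(u, v):
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in set(u) | set(v))
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv)

# "cat" and "dog" share chasing contexts, so they come out more
# similar than an unrelated pair such as "dog" and "cheese".
print(cosine(vectors["cat"], vectors["dog"]),
      cosine(vectors["dog"], vectors["cheese"]))
```

This is the distributional hypothesis in miniature; real DSMs add weighting (e.g. PPMI) and dimensionality reduction such as the latent semantic analysis the document mentions.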
Combining text and pattern preprocessing in an adaptive DNA pattern matcher, by IAEME Publication
This paper presents an adaptive DNA pattern matching algorithm that combines both text and pattern preprocessing to efficiently detect patterns in DNA sequences. It initializes a pattern preprocessor similar to KMP and builds a suffix tree for the text. It then marks regions in the text and finds the maximum overlap between each region and the pattern. The algorithm searches the region with highest overlap first before moving to other regions. Experiments show it performs faster than KMP, with running time of O(m) where m is the pattern length.
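The KMP-style pattern preprocessing the paper starts from can be sketched as plain KMP: the failure (prefix) table lets the matcher skip re-comparisons after a mismatch. This is the standard algorithm only, not the paper's region-marking and overlap-ranking extension, and the example sequence is made up.

```python
# Standard Knuth-Morris-Pratt: prefix-table preprocessing plus search.

def prefix_table(pattern):
    """table[i] = length of the longest proper prefix of pattern[:i+1]
    that is also a suffix of it."""
    table = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = table[k - 1]          # fall back to a shorter border
        if pattern[i] == pattern[k]:
            k += 1
        table[i] = k
    return table

def kmp_search(text, pattern):
    """Return start indices of every occurrence of pattern in text."""
    table, k, hits = prefix_table(pattern), 0, []
    for i, c in enumerate(text):
        while k > 0 and c != pattern[k]:
            k = table[k - 1]
        if c == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)    # match ends at i
            k = table[k - 1]          # keep scanning for overlaps
    return hits

print(kmp_search("ACGTACGATACGT", "ACG"))  # → [0, 4, 9]
```

The preprocessing is O(m) in the pattern length and the scan is O(n) in the text length; the paper's contribution is to also preprocess the text (via a suffix tree) so promising regions can be searched first.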
This document discusses using hybrid logics to model contexts in textual inference logic (TIL). It considers representing contexts as modal operators or nominal operators (@) from hybrid logic. Specifically, it explores using intuitionistic hybrid logic (IHL) to represent temporal contexts with @ operators and other contexts as modal boxes. However, it notes that semanticists may not view contexts as modalities. It also considers experiments building a constructive hybrid logic and using a hybrid logic with only nominals and satisfaction for distributed reasoning, but leaves the experiments for future work.
This document provides an introduction to grammars and languages from a computer science perspective. It defines language as an infinite set of sentences, where each sentence is a finite sequence of symbols from a finite alphabet. Grammars are described as a means to generate and describe the structure of the sentences in a language. The document outlines different views of language from communication, linguistics, and computer science to establish the terminology and scope used.
The document proposes new ideograms (symbols representing concepts) for concepts in physics. It argues ideograms have advantages over words for comprehension and recall. It then introduces a LaTeX package containing ideograms for concepts like electron, photon, proton, etc. designed to be intuitive. The ideograms aim to make physics equations easier to understand and more elegant. A list of the proposed ideograms is included.
ESR10 Joachim Daiber - EXPERT Summer School - Malaga 2015, by RIILP
The document discusses using syntactic preordering models to delimit the morphosyntactic search space for machine translation of morphologically rich languages. It explores preordering dependency trees of the source language to reduce word order variations and predicting morphological attributes on the source side to inform target language word selection. Experimental results show that non-local features and jointly learning which attributes to predict can improve translation performance over baselines. The work aims to combine preordering and morphology prediction to better exploit interactions between syntactic structure and inflectional properties.
The document discusses first-order logic. It begins by providing an example of expressing statements about Socrates being human and humans being mortal in first-order logic using predicates, variables, quantifiers and constants. It then defines the key components of first-order logic syntax, including predicates, functions, variables, terms, and well-formed formulas. It also discusses semantics, including vocabularies, structures, interpretations, and how structures can satisfy formulas. Finally, it provides an example of applying first-order logic to represent an artificial intelligence problem involving an agent navigating a grid to find gold while avoiding dangers.
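The Socrates example can be made concrete by evaluating the formulas in a small finite structure. The domain and predicate extensions below are invented for illustration; this is a direct encoding of one interpretation, not a general model checker.

```python
# A finite structure for the Socrates syllogism: a domain plus the
# extensions of the unary predicates Human and Mortal.
domain = {"socrates", "plato", "fido"}
human = {"socrates", "plato"}
mortal = {"socrates", "plato", "fido"}

# Premise 1: forall x. Human(x) -> Mortal(x), checked over the domain.
all_humans_mortal = all((x not in human) or (x in mortal) for x in domain)

# Premise 2: Human(socrates).
socrates_human = "socrates" in human

# Conclusion: Mortal(socrates) holds in this structure.
print(all_humans_mortal and socrates_human and ("socrates" in mortal))  # → True
```

Quantifiers over a finite domain reduce to `all`/`any`, which is exactly how a structure is said to satisfy a universally quantified formula.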
A Constructive Mathematics approach for NL formal grammars, by Federico Gobbo
This document discusses formalizing natural language grammars through adpositional grammars (AdGrams). AdGrams are based on cognitive linguistics concepts of trajector and landmark. The document proposes that natural language structure can be expressed through a triple involving a governor, dependent, and their relation. It provides examples analyzing phrases and sentences as either dependency-based or government-based structures based on whether the dependent or governor is the trajector. The goal of AdGrams is to formalize natural language grammars in a way that is informed by cognitive linguistics concepts and can be computationally analyzed.
This document discusses the concept of an ecumenical logic system that allows both classical and intuitionistic reasoning to coexist. It summarizes Dag Prawitz's approach to defining such a system, which uses different symbols for logical constants that have different meanings classically versus intuitionistically. However, the document raises the question of why Prawitz's system only includes one symbol for negation rather than separate classical and intuitionistic negation symbols. Possible answers discussed include the interderivability of the two notions of negation and the view that negation asserts a contradiction from assuming the negated proposition. The document does not conclude there is a definitive answer and suggests this as an interesting open problem area.
Annotating Rhetorical and Argumentative Structures in Mathematical Knowledge, by Christoph Lange
This document summarizes the work Christoph Lange did at DERI from April to October 2008. It discusses Lange's background in mathematical knowledge management and his project using semantic web technologies like ontologies and annotation. At DERI, Lange learned about engineering ontologies for scientific documents and user interfaces for annotating and browsing knowledge. He expanded his ontologies to model the rhetorical and document structures of mathematical texts.
Classical logic has a serious limitation in that it cannot cope with the issues of vagueness and uncertainty into which fall most modes of human reasoning. In order to provide a foundation for human knowledge representation and reasoning in the presence of vagueness, imprecision, and uncertainty, fuzzy logic should have the ability to deal with linguistic hedges, which play a very important role in the modification of fuzzy predicates. In this paper, we extend fuzzy logic in the narrow sense with graded syntax, introduced by Novák et al., with many hedge connectives. In one case, no hedge has a dual; in the other, each hedge can have its own dual. The resulting logics are shown to also have Pavelka-style completeness.
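The role of hedges can be illustrated with the classic Zadeh-style truth-function modifiers ("very" as concentration, "somewhat" as dilation). These are textbook examples of hedges modifying a fuzzy predicate, not the hedge connectives of the graded-syntax logic this abstract describes.

```python
# Zadeh-style hedges as operations on a membership degree in [0, 1]:
# "very" squares the degree (concentration), "somewhat" takes the
# square root (dilation).
import math

def very(mu):
    return mu ** 2

def somewhat(mu):
    return math.sqrt(mu)

tall = 0.7  # degree to which "John is tall" holds
print(very(tall), somewhat(tall))
```

Concentration pushes borderline cases down while dilation pulls them up, which is exactly the "modification of fuzzy predicates" the hedges are meant to capture; duality pairs such hedges with opposite effects.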
A Natural Logic for Artificial Intelligence, and its Risks and Benefits, by gerogepatton
This paper is a multidisciplinary project proposal, submitted in the hope that it may garner enough interest to launch it with members of the AI research community, along with linguists and philosophers of mind and language interested in constructing a semantics for a natural logic for AI. The paper outlines some of the major hurdles in the way of "semantics-driven" natural language processing based on standard predicate logic, and sketches the steps to be taken toward a "natural logic": a semantic system explicitly defined on a well-regimented (but indefinitely expandable) fragment of a natural language that can therefore be "intelligently" processed by computers, using the semantic representations of the phrases of the fragment.
Here are the proofs using mathematical induction for the two assignments:
1.3
Base case: For n = 1, LHS = 1^2 = 1 and RHS = 1(1+1)(2(1)+1)/6 = 1, so LHS = RHS.
Inductive hypothesis: Assume the formula holds for n = k, i.e.
1^2 + 2^2 + ... + k^2 = k(k+1)(2k+1)/6.
Inductive step: For n = k + 1,
LHS = 1^2 + 2^2 + ... + k^2 + (k+1)^2
    = k(k+1)(2k+1)/6 + (k+1)^2
    = (k+1)[k(2k+1) + 6(k+1)]/6
    = (k+1)(2k^2 + 7k + 6)/6
    = (k+1)(k+2)(2(k+1)+1)/6,
which is the formula for n = k + 1, completing the induction.
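The sum-of-squares formula can also be spot-checked numerically, a useful sanity check alongside the induction:

```python
# Numerically verify 1^2 + 2^2 + ... + n^2 == n(n+1)(2n+1)/6 for small n.
for n in range(1, 50):
    lhs = sum(i * i for i in range(1, n + 1))
    rhs = n * (n + 1) * (2 * n + 1) // 6
    assert lhs == rhs, n
print("formula verified for n = 1..49")
```

Integer division is exact here because n(n+1)(2n+1) is always divisible by 6, which the closed form guarantees.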
Functional and Structural Models of Commonsense Reasoning in Cognitive Archit..., by Antonio Lieto
The document provides an overview of functional and structural models of commonsense reasoning in cognitive architectures. It discusses several approaches to commonsense reasoning including semantic networks, frames, scripts, and default logic. It also discusses different levels of representation including conceptual spaces, typicality, and compositionality. The document proposes dual process models that integrate heterogeneous representations like prototypes and exemplars. It presents computational models like Dual PECCS and TCL that implement aspects of commonsense reasoning through integrated and connected representations.
1. The document discusses the need for a positive account of informal proof in mathematics, as most mathematical proofs are informal. It argues against the view that informal proofs are recipes for formal derivations.
2. The document proposes that logic should be understood more broadly as the general study of inferential actions, as informal proofs often involve actions on mathematical objects beyond propositions. Examples of such actions include diagram manipulation in Euclidean geometry.
3. The document reviews work that may support this broader view of logic in informal proofs, such as studies of reasoning with diagrams in knot theory and using Cayley graphs to prove group theory results.
How to Ground A Language for Legal Discourse In a Prototypical Perceptual Sem...L. Thorne McCarty
Slides for my talk at the 15th International Conference on Artificial Intelligence and Law (ICAIL 2015), June 11, 2015.
The full ICAIL 2015 paper is available on ResearchGate at bit.ly/1qCnLJq.
The document discusses constructive description logics and provides three options for constructing description logics constructively:
1) Translating description logic syntax into intuitionistic first-order logic (IFOL) to obtain the logic IALC.
2) Translating description logic syntax into intuitionistic modal logic (IK) to obtain the logic iALC.
3) Translating description logic syntax into constructive modal logic (CK) to obtain the logic cALC.
The talk outlines the translation approaches and discusses some pros and cons of the different constructive description logics, but notes that the work is preliminary and more criteria are needed to identify the best constructive system(s).
Conceptual Spaces for Cognitive Architectures: A Lingua Franca for Different ...Antonio Lieto
We claim that Conceptual Spaces offer a lingua franca that allows to unify and generalize many aspects of the symbolic, sub-symbolic and diagrammatic approaches (by overcoming some of their typical problems) and to integrate them on a common ground. In doing so we extend and detail some of the arguments explored by Gardenfors [23] for defending the need of a conceptual, intermediate, representation level between
the symbolic and the sub-symbolic one. Additionally, we argue that Conceptual Spaces could offer a unifying framework for interpreting many kinds of diagrammatic and analogical representations. As a consequence, their adoption could also favor the integration of diagrammatical representation and
reasoning in Cognitive Architectures
An Approach to Automated Learning of Conceptual Graphs from TextFulvio Rotella
Many document collections are private and accessible only by selected people. Especially in business realities, such collections need to be managed, and the use of an external taxonomic or ontological resource would be very useful. Unfortunately, very often domain-specific resources are not available, and the development of techniques that do not rely on external resources becomes essential.
Automated learning of conceptual graphs from restricted collections needs to be robust with respect to missing or partial knowledge, that does not allow to extract a full conceptual graph and only provides sparse fragments thereof. This work proposes a way to deal with these problems applying relational clustering and generalization methods. While clustering collects similar concepts, generalization provides additional nodes that can bridge separate pieces of the graph while expressing it at a higher level of abstraction. In this process, considering relational information allows a broader perspective in the similarity assessment for clustering, and ensures more flexible and understandable descriptions of the generalized concepts. The final conceptual graph can be used for better analyzing and understanding the collection, and for performing some kind of reasoning on it.
This document provides an introduction and overview of 5 papers related to topic modeling techniques. It begins with introducing the speaker and their research interests in text analysis using topic modeling. It then lists the 5 papers that will be discussed: LSA, pLSI, LDA, Gaussian LDA, and criticisms of topic modeling. The document focuses on summarizing each paper's motivation, key points, model, parameter estimation methods, and deficiencies. It provides high-level summaries of key aspects of influential topic modeling papers to introduce the topic.
Rethinking Critical Editions of Fragments by OntologiesMatteo Romanello
This document discusses rethinking the representation of fragmentary classical texts in digital editions through the use of ontologies. It addresses problems with current editions, such as duplication of text. The authors analyze the domain to identify concepts like fragments as interpretations linked to evidence. They design an ontology with classes for interpretations, textual passages, and linking fragments to witness texts. The benefits cited include a solid architecture separating texts from interpretations, formalization of the domain, and improved data interoperability.
The spread and abundance of electronic documents require automatic techniques for extracting useful information from the text they contain. The availability of conceptual taxonomies can be of great help, but building them manually is a complex and costly task. Building on previous work, we propose a technique to automatically extract conceptual graphs from text and reason with them. Since automated learning of taxonomies needs to be robust to missing or partial knowledge and flexible with respect to noise, this work proposes a way to deal with these problems. The case of poor data/sparse concepts is tackled by finding generalizations among disjoint pieces of knowledge. Noise is handled by introducing soft rather than hard relationships among concepts and by applying a probabilistic inferential setting. In particular, we propose to reason on the extracted graph using different kinds of relationships among concepts, where each arc/relationship is associated with a number representing its likelihood across all possible worlds, and to address the problem of sparse knowledge by using generalizations among distant concepts as bridges between disjoint portions of knowledge.
Introduction to Distributional Semantics (Andre Freitas)
This document provides an introduction to distributional semantics. It discusses how distributional semantic models (DSMs) represent word meanings as vectors based on their linguistic contexts in large corpora. This distributional hypothesis states that words that appear in similar contexts tend to have similar meanings. The document outlines how DSMs are built, important parameters like context type and weighting, and examples like latent semantic analysis. It also discusses how DSMs can support applications like semantic search. Finally, it introduces how compositional semantics explores representing the meanings of phrases and sentences compositionally based on the meanings of their parts.
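As a toy illustration of the distributional hypothesis (not taken from the slides; the example corpus and function names below are invented), one can build sparse context-count vectors for words and compare them by cosine similarity:

```python
from collections import Counter
from math import sqrt

def context_vector(target, tokens, window=2):
    """Count words co-occurring with `target` within +/- `window` tokens."""
    vec = Counter()
    for i, tok in enumerate(tokens):
        if tok == target:
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    vec[tokens[j]] += 1
    return vec

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[w] * v[w] for w in u if w in v)
    norm = sqrt(sum(c * c for c in u.values())) * sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0
```

On a corpus where "cat" and "dog" occur in similar contexts but "car" does not, `cosine(cat_vec, dog_vec)` comes out higher than `cosine(cat_vec, car_vec)`, which is exactly the effect real DSMs exploit at scale.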
Combining Text and Pattern Preprocessing in an Adaptive DNA Pattern Matcher (IAEME Publication)
This paper presents an adaptive DNA pattern matching algorithm that combines both text and pattern preprocessing to efficiently detect patterns in DNA sequences. It initializes a pattern preprocessor similar to KMP and builds a suffix tree for the text. It then marks regions in the text and finds the maximum overlap between each region and the pattern. The algorithm searches the region with highest overlap first before moving to other regions. Experiments show it performs faster than KMP, with running time of O(m) where m is the pattern length.
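The paper's algorithm combines a KMP-style pattern preprocessor with a suffix tree over the text; as a point of reference (the standard KMP baseline it compares against, not the paper's full method), the O(m) prefix-function preprocessing and O(n + m) search can be sketched as:

```python
def prefix_function(pattern):
    """KMP failure table: pi[i] is the length of the longest proper prefix
    of pattern[:i+1] that is also a suffix of it. Built in O(m)."""
    pi = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = pi[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        pi[i] = k
    return pi

def kmp_search(text, pattern):
    """Return all start offsets of `pattern` in `text`, overlapping matches
    included, in O(n + m) time."""
    pi = prefix_function(pattern)
    hits, k = [], 0
    for i, ch in enumerate(text):
        while k > 0 and ch != pattern[k]:
            k = pi[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)
            k = pi[k - 1]
    return hits
```

For instance, `kmp_search("ACGTACGTGACG", "ACG")` finds all three occurrences of the motif without ever re-scanning text characters, which is the property the adaptive matcher builds on.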
This document discusses using hybrid logics to model contexts in textual inference logic (TIL). It considers representing contexts as modal operators or nominal operators (@) from hybrid logic. Specifically, it explores using intuitionistic hybrid logic (IHL) to represent temporal contexts with @ operators and other contexts as modal boxes. However, it notes that semanticists may not view contexts as modalities. It also considers experiments building a constructive hybrid logic and using a hybrid logic with only nominals and satisfaction for distributed reasoning, but leaves the experiments for future work.
This document provides an introduction to grammars and languages from a computer science perspective. It defines language as an infinite set of sentences, where each sentence is a finite sequence of symbols from a finite alphabet. Grammars are described as a means to generate and describe the structure of the sentences in a language. The document outlines different views of language from communication, linguistics, and computer science to establish the terminology and scope used.
The document proposes new ideograms (symbols representing concepts) for concepts in physics. It argues ideograms have advantages over words for comprehension and recall. It then introduces a LaTeX package containing ideograms for concepts like electron, photon, proton, etc. designed to be intuitive. The ideograms aim to make physics equations easier to understand and more elegant. A list of the proposed ideograms is included.
ESR10 Joachim Daiber - EXPERT Summer School - Malaga 2015 (RIILP)
The document discusses using syntactic preordering models to delimit the morphosyntactic search space for machine translation of morphologically rich languages. It explores preordering dependency trees of the source language to reduce word order variations and predicting morphological attributes on the source side to inform target language word selection. Experimental results show that non-local features and jointly learning which attributes to predict can improve translation performance over baselines. The work aims to combine preordering and morphology prediction to better exploit interactions between syntactic structure and inflectional properties.
The document discusses first-order logic. It begins by providing an example of expressing statements about Socrates being human and humans being mortal in first-order logic using predicates, variables, quantifiers and constants. It then defines the key components of first-order logic syntax, including predicates, functions, variables, terms, and well-formed formulas. It also discusses semantics, including vocabularies, structures, interpretations, and how structures can satisfy formulas. Finally, it provides an example of applying first-order logic to represent an artificial intelligence problem involving an agent navigating a grid to find gold while avoiding dangers.
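The Socrates example referred to above is conventionally formalized as follows; the conclusion is obtained by universal instantiation followed by modus ponens:

```latex
\forall x\,(\mathit{Human}(x) \rightarrow \mathit{Mortal}(x)) \\
\mathit{Human}(\mathit{socrates}) \\
\therefore\ \mathit{Mortal}(\mathit{socrates})
```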
A Constructive Mathematics approach for NL formal grammars (Federico Gobbo)
This document discusses formalizing natural language grammars through adpositional grammars (AdGrams). AdGrams are based on cognitive linguistics concepts of trajector and landmark. The document proposes that natural language structure can be expressed through a triple involving a governor, dependent, and their relation. It provides examples analyzing phrases and sentences as either dependency-based or government-based structures based on whether the dependent or governor is the trajector. The goal of AdGrams is to formalize natural language grammars in a way that is informed by cognitive linguistics concepts and can be computationally analyzed.
This document discusses the concept of an ecumenical logic system that allows both classical and intuitionistic reasoning to coexist. It summarizes Dag Prawitz's approach to defining such a system, which uses different symbols for logical constants that have different meanings classically versus intuitionistically. However, the document raises the question of why Prawitz's system only includes one symbol for negation rather than separate classical and intuitionistic negation symbols. Possible answers discussed include the interderivability of the two notions of negation and the view that negation asserts a contradiction from assuming the negated proposition. The document does not conclude there is a definitive answer and suggests this as an interesting open problem area.
Annotating Rhetorical and Argumentative Structures in Mathematical Knowledge (Christoph Lange)
This document summarizes the work Christoph Lange did at DERI from April to October 2008. It discusses Lange's background in mathematical knowledge management and his project using semantic web technologies like ontologies and annotation. At DERI, Lange learned about engineering ontologies for scientific documents and user interfaces for annotating and browsing knowledge. He expanded his ontologies to model the rhetorical and document structures of mathematical texts.
Classical logic has a serious limitation in that it cannot cope with the issues of vagueness and uncertainty that characterize most modes of human reasoning. In order to provide a foundation for human knowledge representation and reasoning in the presence of vagueness, imprecision, and uncertainty, fuzzy logic should have the ability to deal with linguistic hedges, which play a very important role in the modification of fuzzy predicates. In this paper, we extend fuzzy logic in the narrow sense with graded syntax, introduced by Novák et al., with many hedge connectives. In one case, no hedge has a dual; in the other, each hedge can have its own dual. The resulting logics are shown to also have Pavelka-style completeness.
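For intuition about what a hedge does to a fuzzy predicate (this is the classic Zadeh-style treatment of hedges as membership-modifying operators, not the graded-syntax logics extended in the paper), "very" is often modeled as concentration and "more or less" as dilation:

```python
from math import sqrt

def very(mu):
    """Concentration: sharpens a membership degree, so only strong
    members of the fuzzy set stay strong members of 'very X'."""
    return mu ** 2

def more_or_less(mu):
    """Dilation: weakens the requirement, admitting borderline cases."""
    return sqrt(mu)

# Membership degrees of some temperatures in the fuzzy predicate "hot".
hot = {25: 0.2, 30: 0.6, 35: 0.9}
very_hot = {t: very(m) for t, m in hot.items()}
```

Since 0 <= mu <= 1, concentration always lowers a non-extreme degree and dilation always raises it, which matches the intuition that "very hot" is harder to satisfy than "hot".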
A Natural Logic for Artificial Intelligence, and its Risks and Benefits (gerogepatton)
This paper is a multidisciplinary project proposal, submitted in the hopes that it may garner enough interest to launch it with members of the AI research community along with linguists and philosophers of mind and language interested in constructing a semantics for a natural logic for AI. The paper outlines some of the major hurdles in the way of “semantics-driven” natural language processing based on standard predicate logic and sketches out the steps to be taken toward a “natural logic”, a semantic system explicitly defined on a well-regimented (but indefinitely expandable) fragment of a natural language that can, therefore, be “intelligently” processed by computers, using the semantic representations of the phrases of the fragment.
Here are the proofs using mathematical induction for the two assignments:
1.3
Base Case: When n = 1, LHS = 1² = 1 = 1(1+1)(2(1)+1)/6 = RHS.
Inductive Hypothesis: Assume the formula holds for n = k:
1² + 2² + ... + k² = k(k+1)(2k+1)/6.
Inductive Step: For n = k + 1,
LHS = 1² + 2² + ... + k² + (k+1)²
= k(k+1)(2k+1)/6 + (k+1)²
= (k+1)[k(2k+1) + 6(k+1)]/6
= (k+1)(2k² + 7k + 6)/6
= (k+1)(k+2)(2k+3)/6
= (k+1)((k+1)+1)(2(k+1)+1)/6 = RHS,
so the formula holds for n = k + 1, completing the induction.
Functional and Structural Models of Commonsense Reasoning in Cognitive Architectures (Antonio Lieto)
The document provides an overview of functional and structural models of commonsense reasoning in cognitive architectures. It discusses several approaches to commonsense reasoning including semantic networks, frames, scripts, and default logic. It also discusses different levels of representation including conceptual spaces, typicality, and compositionality. The document proposes dual process models that integrate heterogeneous representations like prototypes and exemplars. It presents computational models like Dual PECCS and TCL that implement aspects of commonsense reasoning through integrated and connected representations.
Multimodal Searching and Semantic Spaces: ...or how to find images of Dalmatians (Jonathon Hare)
Tutorial on the "Reality of the Semantic Gap in Image Retrieval", given at the first international conference on Semantics and Digital Media Technology (SAMT 2006), 6 December 2006.
The document discusses analogical reasoning and case-based reasoning. It provides an overview of research in these areas including structure mapping theory, models of analogical processing like SME and MAC/FAC, and case-based reasoning systems. It proposes an analogy ontology to integrate analogical processing and first-principles reasoning by providing a formal representation of analogy concepts and results.
Cognitive Agents with Commonsense - Invited Talk at Istituto Italiano di Tecnologia (Antonio Lieto)
Cognitive Agents with Commonsense - Invited Talk at Istituto Italiano di Tecnologia (IIT), I-Cog Initiative. https://www.facebook.com/icog.initiative/posts/129265685733532
Discovering Novel Information with Sentence-Level Clustering from Multi-documents (irjes)
The document presents a novel fuzzy clustering algorithm called FRECCA that clusters sentences from multi-documents to discover new information. FRECCA uses fuzzy relational eigenvector centrality to calculate page rank scores for sentences within clusters, treating the scores as likelihoods. It uses expectation maximization to optimize cluster membership values and mixing coefficients without a parameterized likelihood function. An evaluation shows FRECCA achieves superior performance to other clustering algorithms on a quotations dataset, identifying overlapping clusters of semantically related sentences.
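FRECCA's eigenvector-centrality step is PageRank-like; a minimal power-iteration sketch over a sentence-similarity matrix (the matrix values and function name here are illustrative, not the FRECCA algorithm itself, which additionally handles fuzzy cluster memberships via EM) looks like this:

```python
def pagerank(sim, damping=0.85, iters=100):
    """Power iteration for PageRank over a similarity matrix given as a
    list of lists. Rows are normalized to sum to 1; returns one score
    per node, summing to 1."""
    n = len(sim)
    rows = []
    for row in sim:
        s = sum(row)
        # Dangling nodes (all-zero rows) spread their weight uniformly.
        rows.append([w / s if s else 1.0 / n for w in row])
    rank = [1.0 / n] * n
    for _ in range(iters):
        rank = [
            (1 - damping) / n
            + damping * sum(rank[j] * rows[j][i] for j in range(n))
            for i in range(n)
        ]
    return rank
```

A sentence strongly similar to many others ends up with a high score, which is the "centrality as likelihood" intuition the clustering builds on.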
Formal and Computational Representations
The Semantics of First-Order Logic
Event Representations
Description Logics & the Web Ontology Language
Compositionality
Lambda calculus
Corpus-based approaches:
Latent Semantic Analysis
Topic models
Distributional Semantics
Kenneth Lloyd introduces category theory as a potential language for scientific discourse in agent-based modeling and simulation (ABMS). Category theory defines mathematical structures and relationships between them. Lloyd argues that agents can be considered as structures within a category. He provides an example of applying category theory concepts like functors to represent functional objects and agents. Finally, Lloyd discusses how category theory may provide a formalism for describing emergent properties in multi-agent systems and validating hypotheses through simulation.
This document contains summaries of several papers related to artificial intelligence and neural networks:
1. The first paper discusses using recurrent neural networks to plan robot motions in variable environments.
2. The second paper describes using neural network models to classify brain signals and mind states from EEG data.
3. The third paper proposes using coarticulation composite models in acoustic-phonetic decoding to improve recognition rates beyond phonemes, diphones, and triphones.
Data science is an area at the interface of statistics, computer science, and mathematics.
• Statisticians contributed a large inferential framework, important Bayesian perspectives, the bootstrap and CART and random forests, and the concepts of sparsity and parsimony.
• Computer scientists contributed an appetite for big, challenging problems. They also pioneered neural networks, boosting, and PAC bounds, and developed frameworks such as Spark and Hadoop for handling Big Data.
• Mathematicians contributed support vector machines, modern optimization, tensor analysis, and (maybe) topological data analysis.
Ontology Learning from Text
Ontology construction ‘Layer Cake’
Knowledge representation and knowledge management systems
Subtasks in ontology learning
Most Popular Ontology Learning Tools
Lecture slides by Mustafa Jarrar at Birzeit University, Palestine.
See the course webpage at: http://jarrar-courses.blogspot.com/2011/09/knowledgeengineering-fall2011.html
and http://www.jarrar.info
and on Youtube:
http://www.youtube.com/watch?v=3_-HGnI6AZ0&list=PLDEA50C29F3D28257
An introduction to complexity theory for assignment help, covering complexity classes, deterministic classes, big-O notation, proof by mathematical induction, L-space, N-space, characteristic functions of sets, and related topics.
This document discusses different views on pragmatic rationality and the interfaces between pragmatics, semantics, and other domains like neurology and cognitive science. It summarizes debates around Gricean rationality and different theories like Relevance Theory that propose alternative models of pragmatic reasoning and inference. Experimental evidence from psycholinguistics is also discussed regarding how it relates to and constrains theoretical models of pragmatic processing.
This document provides an overview of automated theorem proving. It discusses:
1) The history and background of automated theorem proving, from Hobbes and Leibniz proposing algorithmic logic to modern computer-based approaches.
2) The theoretical limitations of automated reasoning due to results like Gödel's incompleteness theorems, but also practical applications like verifying mathematics and computer systems.
3) How automated reasoning involves expressing statements formally and then manipulating those expressions algorithmically, as anticipated by Leibniz centuries ago.
Ted Willke - The Brain’s Guide to Dealing with Context in Language Understanding (MLconf)
Like the visual cortex, the regions of the brain involved in understanding language represent information hierarchically. But whereas the visual cortex organizes things into a spatial hierarchy, the language regions encode information into a hierarchy of timescale. This organization is key to our uniquely human ability to integrate semantic information across narratives. More and more, deep learning-based approaches to natural language understanding embrace models that incorporate contextual information at varying timescales. This has not only led to state-of-the-art performance on many difficult natural language tasks, but also to breakthroughs in our understanding of brain activity.
In this talk, we will discuss the important connection between language understanding and context at different timescales. We will explore how different deep learning architectures capture timescales in language and how closely their encodings mimic the brain. Along the way, we will uncover some surprising discoveries about what depth does and doesn’t buy you in deep recurrent neural networks. And we’ll describe a new, more flexible way to think about these architectures and ease design space exploration. Finally, we’ll discuss some of the exciting applications made possible by these breakthroughs.
A Simple Introduction to Neural Information Retrieval (Bhaskar Mitra)
Neural Information Retrieval (or neural IR) is the application of shallow or deep neural networks to IR tasks. In this lecture, we will cover some of the fundamentals of neural representation learning for text retrieval. We will also discuss some of the recent advances in the applications of deep neural architectures to retrieval tasks.
(These slides were presented at a lecture as part of the Information Retrieval and Data Mining course taught at UCL.)
This document provides an introduction and overview of CS344: Introduction to Artificial Intelligence course at IIT Bombay. The key points are:
- The course will be taught 3 times a week by Dr. Pushpak Bhattacharyya and TAs. Topics will include search, logic, knowledge representation, neural networks, computer vision, and planning.
- Foundational concepts in AI that will be covered include the Church-Turing hypothesis, Turing machines, the physical symbol system hypothesis, and limits of computability and automation.
- Fuzzy logic will be introduced as a way to model human reasoning with imprecise information using linguistic variables and fuzzy set theory.
Similar to "Like Alice in Wonderland: Unraveling Reasoning and Cognition Using Analogies and Concept Blending" (Tarek R. Besold)
"Linear algebra is a fundamental tool in many fields of science and technology. It is particularly important in physics, engineering, computer science, and statistics, where the ability to efficiently manipulate large amounts of data and complex matrices is essential for problem-solving and decision-making.
A priori, it may seem that linear algebra is far removed from our everyday lives. However, techniques such as singular value decomposition, and linear regression for training models and making accurate predictions, lie behind artificial intelligence and machine learning. Does ChatGPT ring a bell? It may not seem so, but linear algebra is also behind some of its processes. For this reason, we must keep working in this field, as its importance will only grow as ever larger amounts of data are generated and analyzed in today's world."
The COVID-19 pandemic has brought a proliferation of maps and counter-maps. Civil society organizations and social movements have produced their own interpretations and representations of data about the crisis, helping to make visible aspects, subjects, and topics that had been neglected or underrepresented in the hegemonic, dominant visualizations. In this context, this talk focuses on analyzing the social imaginaries related to map-making during the pandemic. That is, it examines the importance of maps for digital activism, the potential attributed to this technology, and the values associated with the visualizations created with it. The ultimate goal is to reflect on the emerging avenue of data activism, as well as on the intersection between social imaginaries and digital geography.
Designing RISC-V-based Accelerators for next generation Computers (DRAC) is a 3-year project (2019-2022) funded by the ERDF Operational Program of Catalonia 2014-2020. DRAC will design, verify, implement and fabricate a high performance general purpose processor that will incorporate different accelerators based on the RISC-V technology, with specific applications in the field of post-quantum security, genomics and autonomous navigation. In this talk, we will provide an overview of the main achievements in the DRAC project, including the fabrication of Lagarto, the first RISC-V processor developed in Spain.
This talk will begin by introducing the uElectronics section of ESA at ESTEC and the general activities the group is responsible for. Then, it will go through some of the ongoing R+D activities that the group is involved in, hand in hand with universities and/or companies. One of the major ones is related to the European rad-hard FPGAs that have been partially funded by ESA for several years and that will be playing a major role in the sector in the upcoming years. It's also worth mentioning the RTL soft IPs that are currently under development and that will allow us to keep providing the European ecosystem with some key capabilities. The talk will close with an overview of ongoing RISC-V space-hardening activities that might replace the current SPARC-based processors available for our missions.
The aim of this talk is to present the latest additions to the ARM architecture and to describe trends in the microarchitecture of ARM-based processors. ARM is a relatively small company compared with other giants of the technology sector. However, the wide adoption of its architecture, which is clearly dominant in some sectors, and of its microarchitectures places ARM technology at the center of today's technological development. ARM technology is present across practically the whole technology spectrum, from the simplest devices to HPC and cloud computing, by way of smartphones, automotive systems, consumer electronics, etc.
"Formal verification has been used by computer scientists for decades to prevent software bugs. However, with a few exceptions, it has not been used by researchers working in most areas of mathematics (geometry, algebra, analysis, etc.). In this talk, we will discuss how this has changed in the past few years, and the possible implications for the future of mathematical research, teaching, and communication.
We will focus on the theorem prover Lean and its mathematical library mathlib, since this is currently the system most widely used by mathematicians. Lean is a functional programming language and interactive theorem prover based on dependent type theory, with proof irrelevance and non-cumulative universes. The mathlib library, open-source and designed as a basis for research-level mathematics, is one of the largest collections of formalized mathematics. It allows classical reasoning, uses large- and small-scale automation, and is characterized by its decentralized nature, with over 200 contributors including both computer scientists and mathematicians."
"Part of the research community thinks that it is still early to tackle the development of quantum software engineering techniques. The reason is that how the quantum computers of the future will look like is still unknown. However, there are some facts that we can affirm today: 1) quantum and classical computers will coexist, each dedicated to the tasks at which they are most efficient. 2) quantum computers will be part of the cloud infrastructure and will be accessible through the Internet. 3) complex software systems will be made up of smaller pieces that will collaborate with each other. 4) some of those pieces will be quantum, therefore the systems of the future will be hybrid. 5) the coexistence and interaction between the components of said hybrid systems will be supported by service composition: quantum services.
This talk analyzes the challenges that the integration of quantum services poses to Service Oriented Computing."
In this talk, after a brief overview of AI concepts in particular Machine Learning (ML) techniques, some of the well-known computer design concepts for high performance and power efficiency are presented. Subsequently, those techniques that have had a promising impact for computing ML algorithms are discussed. Deep learning has emerged as a game changer for many applications in various fields of engineering and medical sciences. Although the primary computation function is matrix vector multiplication, many competing efficient implementations of this primary function have been proposed and put into practice. This talk will review and compare some of those techniques that are used for ML computer design.
After a brief introduction to medical informatics and a few practical notes on Artificial Intelligence (a possible consensus definition, strong vs. weak AI, and commonly used techniques and methods), the central part of the talk presents practical examples (as success stories) of developments carried out by the Sistemas Informáticos de Nueva Generación group (SING: http://sing-group.org/) in the areas of (i) clinical informatics (InNoCBR, PolyDeep), (ii) informatics for clinical research (PathJam, WhichGenes), (iii) translational bioinformatics (genomics: ALTER; proteomics: DPD, BI, BS, Mlibrary, Mass-Up; and OMICS data integration: PunDrugs), and (iv) public health informatics (CURMIS4th). Finally, it briefly discusses the importance that interpretable AI (XAI, Explainable Artificial Intelligence) and human participation (HITL, Human-In-The-Loop) are expected to have in the near future. The talk ends with a brief reflection on the lessons the speaker has learned after more than 16 years developing intelligent systems in medical informatics.
Many emerging applications require methods tailored towards high-speed data acquisition and filtering of streaming data followed by offline event reconstruction and analysis. In this case, the main objective is to relieve the immense pressure on the storage and communication resources within the experimental infrastructure. In other applications, ultra low latency real time analysis is required for autonomous experimental systems and anomaly detection in acquired scientific data in the absence of any prior data model for unknown events. At these data rates, traditional computing approaches cannot carry out even cursory analyses in a time frame necessary to guide experimentation. In this talk, Prof. Ogrenci will present some examples of AI hardware architectures. She will discuss the concept of co-design, which makes the unique needs of an application domain transparent to the hardware design process and present examples from three applications: (1) An in-pixel AI chip built using the HLS methodology; (2) A radiation hardened ASIC chip for quantum systems; (3) An FPGA-based edge computing controller for real-time control of a High Energy Physics experiment.
This talk reviews the concept of autonomy for field mobile robots, identifies challenges in achieving a truly autonomous system, and suggests possible research directions. Intelligent robotic systems usually acquire knowledge of their functions and working environment at design and development time. This approach is not always efficient, especially in semi-structured and complex environments such as crop fields. A truly autonomous robotic system should develop skills that allow it to succeed in such environments without a priori ontological knowledge of the work area or a predefined set of tasks and behaviors. The talk will therefore present possible strategies based on Artificial Intelligence for improving the navigation capabilities of mobile robots, so that they can offer a level of autonomy high enough to execute all the tasks of a home-to-home mission.
Quantum computing has become a noteworthy topic in academia and industry. Multinational companies around the world have made impressive advances in all areas of quantum technology during the last two decades. These companies are trying to build real quantum computers in order to exploit their theoretical advantages over today's classical computers in practical applications. However, building a full-scale quantum computer is challenging because of its increased susceptibility to errors due to decoherence and other quantum noise. Therefore, quantum error correction (QEC) and fault-tolerance protocols will be essential for running quantum algorithms on large-scale quantum computers.
The overall effect of noise is modeled in terms of a set of Pauli operators and the identity acting on the physical qubits (bit flip, phase flip, and a combination of bit and phase flips). In addition to Pauli errors, there is another class of errors, called leakage errors, which occur when a qubit leaves the defined computational subspace. As the location of leakage errors is unknown, they can damage quantum computations even more. Thus, this talk will briefly cover quantum error models.
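The Pauli error model mentioned above is conventionally written in terms of the Pauli matrices and a probabilistic channel; a standard textbook formulation (not specific to this talk) is:

```latex
X = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \ (\text{bit flip}), \quad
Z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} \ (\text{phase flip}), \quad
Y = iXZ \ (\text{both}),
```

```latex
\mathcal{E}(\rho) = (1 - p_x - p_y - p_z)\,\rho
  + p_x\, X\rho X + p_y\, Y\rho Y + p_z\, Z\rho Z ,
```

where each error type occurs independently on a physical qubit with probability p_x, p_y, or p_z. Leakage errors fall outside this channel because they take the state out of the two-dimensional computational subspace altogether.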
Chatbots are a key element in the digital transformation of our society. They are everywhere: eCommerce, digital health, customer support, tourism... But if you have used one, it has probably disappointed you. I confess: most existing chatbots are very bad. Building a chatbot that is genuinely useful and intelligent is far from easy. A chatbot combines all the complexity of software engineering with that of natural language processing. Consider that many chatbots must be deployed on several channels (web, Telegram, Slack, ...) and often have to use external APIs and services, access internal databases, or integrate pretrained language models (e.g., toxicity detectors). And the problem is not only building the bot, but also testing it and evolving it. In this talk we will look at the biggest challenges we face when a development project includes a chatbot, and at the techniques and strategies we can apply, depending on the project's needs, to finally get a chatbot that knows what it is talking about.
Many HPC applications are massively parallel and can benefit from the spatial parallelism offered by reconfigurable logic. While modern memory technologies can offer high bandwidth, designers must craft advanced communication and memory architectures for efficient data movement and on-chip storage. Addressing these challenges requires to combine compiler optimizations, high-level synthesis, and hardware design.
In this talk, I will present challenges, solutions, and trends for generating massively parallel accelerators on FPGA for high-performance computing. These architectures can provide performance comparable to software implementations on high-end processors, and much higher energy efficiency thanks to logic customization.
The main challenge of concurrent software verification has always been in achieving modularity, i.e., the ability to divide and conquer the correctness proofs with the goal of scaling the verification effort. Types are a formal method well-known for its ability to modularize programs, and in the case of dependent types, the ability to modularize and scale complex mathematical proofs.
In this talk I will present our recent work towards reconciling dependent types with shared-memory concurrency, with the goal of achieving modular proofs for the latter. Applying the type-theoretic paradigm to concurrency has led us to view separation logic as a type theory of state, and has motivated novel abstractions for expressing concurrency proofs based on the algebraic structure of a resource and on structure-preserving functions (i.e., morphisms) between resources.
Microarchitectural attacks, such as Spectre and Meltdown, are a class of security threats that affect almost all modern processors. These attacks exploit the side effects of processor optimizations to leak sensitive information and compromise a system’s security.
Over the years, a large number of hardware and software mechanisms for preventing microarchitectural leaks have been proposed. Intuitively, more defensive mechanisms are less efficient, while more permissive mechanisms may offer more performance but require more defensive programming. Unfortunately, there are no hardware-software contracts that would turn this intuition into a basis for principled co-design.
In this talk, we present a framework for specifying hardware/software security contracts, an abstraction that captures a processor’s security guarantees in a simple, mechanism-independent manner by specifying which program executions a microarchitectural attacker can distinguish.
The emergence of vulnerabilities due to missing security controls is one of the reasons why new frameworks that produce secure software by default are in demand. The talk will address how to transform the software development process, giving security the importance it deserves from the start of the life cycle. To this end, a new development model is proposed – the Viewnext-UEx model – which incorporates security practices preventively and systematically into all phases of the software life cycle. The purpose of this new model is to anticipate the detection of vulnerabilities by applying security from the earliest phases, while also optimizing software construction processes. The results of a preventive scenario, after applying the Viewnext-UEx model, are compared against the traditional reactive scenario of applying security only from the testing phase onwards.
This document discusses trusting artificial intelligence systems. It begins with an overview of trust in social and computing contexts. It then discusses artificial intelligence, including machine learning, deep learning, and natural language processing. It details how AI systems can be attacked, including adversarial inputs, data poisoning, and model stealing. It raises important discussions around using AI in contexts like cybersecurity, medicine, transportation, and sentiment analysis, and the challenges of ensuring systems can be trusted.
The use of renewable energy is key to meeting the sustainable development goals of the 2030 Agenda. Among these energies, wind is the second most used due to its high efficiency. Some studies suggest that wind power will be the main source of generation by 2050. It is therefore worthwhile to continue researching the application of advanced control techniques to these systems.
Among these advanced techniques, neural networks and reinforcement learning combined with classical control strategies stand out. These techniques have already been used successfully in the modelling and control of complex systems.
This talk will present the application of neural networks and reinforcement learning to wind turbine control, focusing especially on pitch control. Different configurations of neural networks and other techniques applied to pitch control will be described. Finally, some hybrid techniques combining fuzzy logic, lookup tables, and neural networks will be proposed, with results demonstrating their usefulness in improving the efficiency of wind turbines.
As the world's energy demand rises, so does the share of renewable energy, particularly wind energy, in the supply. The life cycle of wind farms, from manufacturing the components to decommissioning, involves significant costs, and applications of AI and data analytics to reducing these costs are still limited. This talk will introduce the audience to some interesting applications of AI and data analytics in offshore wind, and highlight future challenges and opportunities. It should be useful for students, academics, and researchers who want to build a career in offshore wind but do not yet know where to start.
HEAP SORT ILLUSTRATED WITH HEAPIFY, BUILD HEAP FOR DYNAMIC ARRAYS.
Heap sort is a comparison-based sorting technique based on the binary heap data structure. It is similar to selection sort in that we repeatedly select an extreme element and move it into place: with a max-heap, the maximum is swapped to the end of the array, the heap size is reduced by one, and the process is repeated for the remaining elements.
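The procedure can be sketched in Python. This is an illustrative implementation (not taken from the original material) using a max-heap, so the maximum is repeatedly moved to the end of the array:

```python
def heapify(a, n, i):
    """Sift the element at index i down so the subtree rooted at i
    satisfies the max-heap property (heap size is n)."""
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n and a[left] > a[largest]:
        largest = left
    if right < n and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        heapify(a, n, largest)

def build_heap(a):
    """Turn an arbitrary array into a max-heap, bottom-up, in O(n)."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):
        heapify(a, n, i)

def heap_sort(a):
    """In-place heap sort: build a max-heap, then repeatedly swap the
    root (current maximum) to the end and re-heapify the prefix."""
    build_heap(a)
    for end in range(len(a) - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        heapify(a, end, 0)
    return a
```

Total running time is O(n log n): build-heap is linear, and each of the n extractions costs at most O(log n) for the sift-down.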
CHINA'S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECTS
The rivalry between prominent international actors for dominance over Central Asia's hydrocarbon reserves and the ancient silk trade route, along with China's diplomatic endeavours in the area, has been referred to as the "New Great Game." This research centres on the power struggle, considering geopolitical, geostrategic, and geoeconomic variables. Topics including trade, political hegemony, oil politics, and conventional and nontraditional security are all explored and explained. Using Mackinder's Heartland, Spykman's Rimland, and Hegemonic Stability theories, the study examines China's role in Central Asia. It adheres to the empirical epistemological method, takes care to remain objective, and critically analyses primary and secondary research documents to elaborate the role of China's geo-economic outreach in Central Asian countries and its future prospects. According to this study, China is seeing significant success in trade, pipeline politics, and gaining influence over other governments, a success that may be attributed to the effective use of key instruments such as the Shanghai Cooperation Organisation and the Belt and Road Economic Initiative.
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressions (Victor Morales)
K8sGPT is a tool that analyzes and diagnoses Kubernetes clusters. This presentation was used to share the requirements and dependencies to deploy K8sGPT in a local environment.
Introduction: e-waste – definition – sources of e-waste – hazardous substances in e-waste – effects of e-waste on environment and human health – need for e-waste management – e-waste handling rules – waste minimization techniques for managing e-waste – recycling of e-waste – disposal and treatment methods of e-waste – mechanism of extraction of precious metals from leaching solution – global scenario of e-waste – e-waste in India – case studies.
Advanced control scheme of doubly fed induction generator for wind turbine us...
This paper describes a speed control device for generating electrical energy on an electricity network based on the doubly fed induction generator (DFIG) used for wind power conversion systems. First, a doubly fed induction generator model was constructed. A control law is then formulated to govern the flow of energy between the stator of a DFIG and the energy network using three types of controllers: proportional integral (PI), sliding mode controller (SMC), and second order sliding mode controller (SOSMC). Their results in terms of power reference tracking, reaction to unexpected speed fluctuations, sensitivity to perturbations, and resilience against machine parameter alterations are compared. MATLAB/Simulink was used to conduct the simulations for the preceding study. Multiple simulations have shown very satisfying results, and the investigations demonstrate the efficacy and power-enhancing capabilities of the suggested control system.
We have compiled the most important slides from each speaker's presentation. This year’s compilation, available for free, captures the key insights and contributions shared during the DfMAy 2024 conference.
DEEP LEARNING FOR SMART GRID INTRUSION DETECTION: A HYBRID CNN-LSTM-BASED MODEL
As digital technology becomes more deeply embedded in power systems, protecting the communication networks of Smart Grids (SG) has emerged as a critical concern. Distributed Network Protocol 3 (DNP3) is a multi-tiered application layer protocol extensively utilized in Supervisory Control and Data Acquisition (SCADA)-based smart grids to facilitate real-time data gathering and control functionalities. Robust Intrusion Detection Systems (IDS) are necessary for early threat detection and mitigation, because the interconnection of these networks makes them vulnerable to a variety of cyberattacks. To address this issue, this paper develops a hybrid Deep Learning (DL) model specifically designed for intrusion detection in smart grids. The proposed approach combines a Convolutional Neural Network (CNN) with the Long Short-Term Memory (LSTM) algorithm. We employed a recent intrusion detection dataset (DNP3), which focuses on unauthorized commands and Denial of Service (DoS) cyberattacks, to train and test our model. Our experiments show that the CNN-LSTM method is considerably better at finding smart grid intrusions than other deep learning algorithms used for classification. In addition, the proposed approach improves accuracy, precision, recall, and F1 score, achieving a high detection accuracy rate of 99.50%.
Like Alice in Wonderland: Unraveling Reasoning and Cognition Using Analogies and Concept Blending - Tarek R. Besold
1. Like Alice in Wonderland:
Unraveling Reasoning and Cognition Using
Analogies and Concept Blending
Tarek R. Besold
KRDB, Faculty of Computer Science, Free University of Bozen-Bolzano
16 June 2016
Tarek R. Besold Computational Models of Analogy and Concept Blending
2. Honour to whom honour is due...
The following is joint work with many people, most notably:
Robert Robere, Department of Computer Science, University of
Toronto (Canada).
Enric Plaza, IIIA-CSIC, Barcelona (Spain).
Kai-Uwe Kühnberger, Institute of Cognitive Science, University
of Osnabrück (Germany).
Tarek R. Besold Computational Models of Analogy and Concept Blending
3. And Who is Paying (Some of) the Bills?
The described work on concept blending has been conducted as part
of the European FP7 Concept Invention Theory (COINVENT) project
(FET-Open grant number 611553).
Consortium members are:
Free University of Bozen-Bolzano (Südtirol-Alto Adige, Italy)
University of Osnabrück (Germany)
University of Magdeburg (Germany)
University of Dundee (Scotland, UK)
University of Edinburgh (Scotland, UK)
Goldsmiths, University of London (UK)
IIIA-CSIC, Barcelona (Catalunya, Spain)
Aristotle University of Thessaloniki (Greece)
5. Back in the Day (1)
6. Back in the Day (2)
7. Back in the Day (3)
Rutherford analogy (underlying the Bohr-Rutherford model of the
atom):
Analogy between solar system and
hydrogen atom:
...nucleus is more massive than electrons,
sun is more massive than planets.
...nucleus attracts electrons (Coulomb’s
law), sun attracts planets (Newton’s law of
gravity).
...attraction plus mass relation causes
electrons to revolve around nucleus,
similarly planets revolve around sun.
9. Intermezzo: Analogy (1)
Analogy
From Greek ἀναλογία (analogia), "proportion".
Informally: Claims of similarity, often used in argumentation or
when explaining complex situations.
A bit more formal: Analogy-making is the human ability of
perceiving dissimilar domains as similar with respect to
certain aspects based on shared commonalities in relational
structure or appearance.
(Incidental remark: In less complex forms also to be found in
some other primates.)
12. Heuristic-Driven Theory Projection (1)
Heuristic-Driven Theory Projection (HDTP)
Computing analogical relations and inferences (domains given
as many-sorted first-order logic representation/many-sorted term
algebras) using a generalisation-based approach.
Base and target of analogy defined in terms of axiomatisations,
i.e., given by a finite set of formulae.
Aligning pairs of formulae by means of anti-unification
(extending classical Plotkin-style first-order anti-unification to a
restricted form of higher-order anti-unification).
Proof-of-concept applications in modelling mathematical
reasoning and concept blending in mathematics.
13. Heuristic-Driven Theory Projection (2)
Figure: Analogy-making in HDTP.
14. Heuristic-Driven Theory Projection (3)
Anti-Unification
Dual to the unification problem (see, e.g., logic programming or
automated theorem proving).
Generalising terms in a meaningful way, yielding for each term
an anti-instance (distinct subterms replaced by variables).
Goal: Finding the most specific anti-unifier.
Plotkin: For a proper definition of generalisation, for a given pair
of terms there always is exactly one least general generalisation
(up to renaming of variables).
Problem: Structural commonalities embedded in different
contexts possibly not accessible by first-order anti-unification.
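To make the idea concrete, here is a small, self-contained Python sketch of Plotkin-style first-order anti-unification (my own illustration, not HDTP code), over terms represented as nested tuples:

```python
def anti_unify(s, t, table=None):
    """Return the least general generalisation of two first-order terms.

    Terms are tuples (functor, arg1, ..., argn); constants are 1-tuples.
    One fresh variable is introduced per distinct pair of mismatching
    subterms, so repeated mismatches reuse the same variable -- this is
    what makes the result *least* general (up to variable renaming)."""
    if table is None:
        table = {}
    if s == t:
        return s
    if s[0] == t[0] and len(s) == len(t):  # same functor and arity: recurse
        return (s[0],) + tuple(anti_unify(a, b, table)
                               for a, b in zip(s[1:], t[1:]))
    # mismatch: generalise to a variable, memoised per (s, t) pair
    if (s, t) not in table:
        table[(s, t)] = ('X%d' % len(table),)
    return table[(s, t)]

# E.g. generalising attracts(sun, planet) and attracts(nucleus, electron):
g = anti_unify(('attracts', ('sun',), ('planet',)),
               ('attracts', ('nucleus',), ('electron',)))
# g == ('attracts', ('X0',), ('X1',))
```

As the slide notes, this first-order procedure cannot generalise over the function symbols themselves, which is exactly what the restricted higher-order extension on the next slide addresses.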
15. Heuristic-Driven Theory Projection (4)
Restricted Higher-Order Anti-Unification
First-order terms extended by introducing variables taking
arguments (first-order variables become variables with arity 0),
making a term either a first-order or a higher-order term.
Class of substitutions restricted to (compositions of) the
following four cases:
1. Renamings ρF,F′: F(t1,...,tn) → F′(t1,...,tn).
2. Fixations φF,f: F(t1,...,tn) → f(t1,...,tn).
3. Argument insertions ιF,F′,G,i: F(t1,...,tn) → F′(t1,...,ti, G(ti+1,...,ti+k), ti+k+1,...,tn).
4. Permutations πF,F′,α: F(t1,...,tn) → F′(tα(1),...,tα(n)).
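The four basic substitution kinds can be illustrated with a small Python sketch over a tuple term representation (an illustration of mine, not HDTP code; indices are 0-based here, unlike the 1-based notation above). Note that renaming and fixation look identical operationally; they differ in what the new head symbol is (another variable vs. a function symbol):

```python
def renaming(term, f_new):
    """rho: F(t1,...,tn) -> F'(t1,...,tn), F' again a variable."""
    return (f_new,) + term[1:]

def fixation(term, f):
    """phi: F(t1,...,tn) -> f(t1,...,tn), f a function symbol."""
    return (f,) + term[1:]

def argument_insertion(term, f_new, g, i, k):
    """iota: wrap k consecutive arguments (starting at 0-based index i)
    into a new subterm headed by the variable g."""
    args = term[1:]
    return (f_new,) + args[:i] + ((g,) + args[i:i + k],) + args[i + k:]

def permutation(term, f_new, perm):
    """pi: reorder the arguments according to the permutation `perm`."""
    args = term[1:]
    return (f_new,) + tuple(args[j] for j in perm)

t = ('F', ('a',), ('b',), ('c',))
# argument_insertion(t, 'G', 'H', 1, 2) == ('G', ('a',), ('H', ('b',), ('c',)))
# permutation(t, 'G', (2, 0, 1))        == ('G', ('c',), ('a',), ('b',))
```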
16. Heuristic-Driven Theory Projection (5)
Examples of higher-order anti-unifications:
19. Complexity and Tractability in Cognitive Models and Systems
20. Computer Metaphor and Church-Turing Thesis
Famous ideas at the heart of many endeavours in computational cognitive modelling and/or AI:
1. The "computer metaphor" of the mind (i.e., the concept of a computational theory of mind).
2. The Church-Turing thesis.
The first bridges the gap between humans and computers: the human mind and brain can be seen as an information processing system, and reasoning and thinking correspond to computation as formal symbol manipulation. The second gives an account of the nature and limitations of the computational power of such a system.
22. P-Cognition Thesis
Significant impact on cognitive science and cognitive psychology: human cognitive capacities are explained in terms of computational-level theories (i.e., precise characterisations of the hypothesised inputs and outputs of the respective capacities together with functional mappings between them).
Problem: Computational-level theories are often underconstrained by the available empirical data!
⇒ Use mathematical complexity theory as an assisting tool: NP-completeness!
P-Cognition thesis
Human cognitive capacities are hypothesised to be of the polynomial-time computable type.
(Interpretation: "Humans can comfortably solve non-trivial instances of this problem, where the exact size depends on the problem at hand.")
24. "polynomial-time computable" = "efficient"?
Humans are able to solve problems which may be hard in general but feasible if certain parameters of the problem are restricted.
Parametrised complexity theory: "tractability" captured by FPT.¹
FPT-Cognition thesis (van Rooij, 2008)
Human cognitive capacities are hypothesised to be fixed-parameter tractable for one or more input parameters that are small in practice (i.e., computational-level theories have to be in FPT).
Tractable AGI thesis (Besold & Robere, 2013)
Models of cognitive capacities in artificial intelligence and computational cognitive systems have to be fixed-parameter tractable for one or more input parameters that are small in practice (i.e., have to be in FPT).
¹ A problem P is in FPT if P admits an O(f(k)·n^c) algorithm, where n is the input size, k is a parameter of the input constrained to be "small", c is an independent constant, and f is some computable function.
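As a concrete illustration of the FPT definition in the footnote (a textbook example of mine, unrelated to HDTP): Vertex Cover parameterised by solution size k admits an O(2^k · |E|) bounded search tree, i.e. f(k) = 2^k and c = 1:

```python
def vertex_cover(edges, k):
    """Decide whether the graph given by `edges` has a vertex cover of
    size <= k. Classic bounded search tree: any cover must contain an
    endpoint of the first remaining edge, so branch on both endpoints.
    The recursion depth is at most k, giving O(2^k * |E|) overall --
    exponential only in the parameter k, as required for FPT."""
    if not edges:
        return True          # no edges left: the chosen vertices cover all
    if k == 0:
        return False         # edges remain but no budget left
    u, v = edges[0]
    for pick in (u, v):
        rest = [(a, b) for (a, b) in edges if a != pick and b != pick]
        if vertex_cover(rest, k - 1):
            return True
    return False
```

For a fixed small k the instance size n can grow large while the running time stays feasible, which is exactly the intuition behind "tractable for parameters that are small in practice".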
27. Complexity of HDTP (1)
HDTP is naturally split into two mechanisms:
Analogical matching of input theories.
Re-representation of input theories by deduction in FOL.
⇒ Re-representation is undecidable (undecidability of FOL).
⇒ Focus on the mechanism for analogical matching.
28. Complexity of HDTP (2)
Problem 1. F Anti-Unification
Input: Two terms f, g, and a natural k ∈ ℕ.
Problem: Is there an anti-unifier h, containing at least k variables, using only renamings and fixations?
Problem 2. FP Anti-Unification
Input: Two terms f, g, and naturals l, m, p ∈ ℕ.
Problem: Is there an anti-unifier h, containing at least l 0-ary variables and at least m higher-arity variables, and two substitutions σ, τ using only renamings, fixations, and at most p permutations such that h →σ f and h →τ g?
Problem 3. FPA Anti-Unification
Input: Two terms f, g, and naturals l, m, p, a ∈ ℕ.
Problem: Is there an anti-unifier h, containing at least l 0-ary variables, at least m higher-arity variables, and two substitutions σ, τ using renamings, fixations, at most p permutations, and at most a argument insertions such that h →σ f and h →τ g?
29. Complexity of HDTP (3)
...a fair share of formal magic involving a Canadian and some
“Subgraph Isomorphism to Clique” reductions later...
Complexity of HDTP (Higher-Order Anti-Unification)
1 F Anti-Unification is solvable in polynomial time.
2 Let m denote the minimum number of higher arity variables and
let p be the maximum number of permutations applied. Then FP
Anti-Unification is NP-complete and W[1]-hard w.r.t.
parameter set {m,p}.
3 Let r be the maximum arity and s be the maximum number of
subterms of the input terms. Then FP Anti-Unification is in FPT
w.r.t. parameter set {s,r,p}.
4 FPA Anti-Unification is NP-complete and W[1]-hard w.r.t.
parameter set {m,p,a}.
(For proofs: R. Robere and T. R. Besold. Complex Analogies: Remarks on the Complexity of HDTP. In Proceedings of the
25th Australasian Joint Conference on Artificial Intelligence (AI 2012), LNCS 7691. Springer, 2012.)
32. Concept blending: A + B = ? (1)
33. Concept blending: A + B = ? (2)
34. Concept blending: A + B = ? (3)
35. Concept blending: A + B = ? (4)
36. Foundations of Theory Blending (1)
Concept Blending
Given two domain theories I1 and I2, representing two
conceptualisations...
...look for a generalisation G...
...construct the blend space B in such a way as to preserve the
correlations between I1 and I2 established by G.
37. Foundations of Theory Blending (2)
Example: Houseboat vs. boathouse
Concept blends of HOUSE and BOAT into BOATHOUSE and
HOUSEBOAT.
I1 = {HOUSE ⊑ ∀LIVES_IN.RESIDENT}
I2 = {BOAT ⊑ ∀RIDES_ON.PASSENGER}
HOUSEBOAT: Aligning parts of the conceptual spaces...
RESIDENT ↔ PASSENGER
LIVES_IN ↔ RIDES_ON
HOUSE ↔ BOAT
BOATHOUSE: Aligning parts of the conceptual spaces...
RESIDENT ↔ BOAT
38. The Concept Invention Theory (COINVENT) Project (1)
To develop a novel, computationally feasible, formal model of
conceptual blending based on Fauconnier and Turner’s theory.
To gain a deeper understanding of conceptual blending and
its role in computational creativity.
To design a generic, creative computational system capable of
serendipitous invention and manipulation of novel abstract
concepts.
To validate our model and its computational realisation in two
representative working domains: mathematics and music.
39. The Concept Invention Theory (COINVENT) Project (2)
40. Amalgamation 101 (1)
(Diagram: inputs I1 and I2, generalisations Ī1 and Ī2, common generalisation G = I1 ⊓ I2, and amalgam A = Ī1 ⊔ Ī2, ordered by subsumption ⊑.)
Amalgam
A description A ∈ L is an amalgam of two inputs I1 and I2 (with anti-unification G = I1 ⊓ I2) if there exist two generalisations Ī1 and Ī2 such that (1) G ⊑ Ī1 ⊑ I1, (2) G ⊑ Ī2 ⊑ I2, and (3) A = Ī1 ⊔ Ī2.
41. Amalgamation 101 (2)
(Diagram: source S, target T, generalisation S′ ⊑ S, common generalisation G = S ⊓ T, and asymmetric amalgam A = S′ ⊔ T, ordered by subsumption ⊑.)
Asymmetric Amalgam
An asymmetric amalgam A ∈ L of two inputs S (source) and T (target) satisfies A = S′ ⊔ T for some generalisation S′ ⊑ S of the source.
42. COINVENT’s Blending Schema
1.) Compute a shared generalisation G from S and T with fS(G) = Sc.
2.) Re-use fS in the generalisation of S into S′.
3.) Combine S′ in an asymmetric amalgam with T into the proto-blend T′ = S′ ⊔ T.
4.) By application of fT, complete T′ into the blended output theory TB.
(⊆: element-wise subset relationship between sets of axioms. ⊑: subsumption between theories in the direction of the respective arrows.)
43. ...and the Implementation?
Use HDTP for the computation of generalisation(s) and substitution chains/higher-order anti-unifications.
Currently: Restrict HDTP to using only renamings and fixations.
⇒ Possibility to use "classical" semantic consequence |= as ordering relationship. (Also preserved by later unifications and addition of axioms.)
Use HDTP's heuristics for selecting the least general generalisation G (among several options).
Currently: Naive consistency/inconsistency check with the final blend (both internally and against world knowledge).
⇒ Clash resolution by re-start with a reduced set of input axioms.
45. Example: Brillo, the Foldable Toothbrush
81. (Definitely Not) The End!
If you are interested in non-classical reasoning, tractability,
approximability and similar topics in A(G)I and/or cognitive science,
you are happily invited to...
1 ...talk to me after the presentation.
2 ...get in touch by e-mail:
TarekRichard.Besold@unibz.it.
3 ...occasionally have a look at our publications.²
² For instance:
Besold, T. R., and Robere, R. When Thinking Never Comes to a Halt: Using Formal Methods in Making Sure Your AI Gets the Job Done Good Enough. In V. C. Müller (ed.), Fundamental Issues of Artificial Intelligence (Synthese Library, vol. 376). Springer, 2016.
Besold, T. R., and Plaza, E. Generalize and Blend: Concept Blending Based on Generalization, Analogy, and Amalgams. In H. Toivonen, S. Colton, M. Cook, and D. Ventura (eds.), Proceedings of the Sixth International Conference on Computational Creativity (ICCC) 2015. Brigham Young University Press, 2015.
83. Disclaimer
Frequent criticism:
Demanding that cognitive systems and models work within certain complexity limits is overly restrictive.
Maybe human mental activities are actually performed as exponential-time procedures, but this is never noticed because, for some reason, the exponent is always very small.
Reply:
This possibility currently cannot be excluded.
Instead: no claim that cognitive processes are without exception within FPT, APX, FPA, or what-have-you, but staying within such boundaries makes cognitive systems and models plausible candidates for application in resource-bounded general-purpose cognitive agents.
85. Complexity of the First HDTP Generation
As an aside:
Once upon a time, there was HDTP-old based on reducing certain
higher-order to first-order anti-unifications by introduction of subterms
built from “admissible sequences” over equational theories (i.e.,
conjunctions of FOL formulae with equality over a term algebra).
Complexity of HDTP-old
1 HDTP-old is NP-complete.
2 HDTP-old is W[2]-hard with respect to a minimal bound on
the cardinality of the set of all subterms of the term against
which admissibility is checked.
(For proofs: R. Robere and T. R. Besold. Complex Analogies: Remarks on the Complexity of HDTP. In Proceedings of the
25th Australasian Joint Conference on Artificial Intelligence (AI 2012), LNCS 7691. Springer, 2012.)
87. A Primer on Approximation Theory
Approximability Classes
In the following, let...
...PTAS denote the class of all NP optimisation problems that
admit a polynomial-time approximation scheme.
...APX be the class of NP optimisation problems allowing for
constant-factor approximation algorithms.
...APX-poly be the class of NP optimisation problems allowing for
polynomial-factor approximation algorithms.
Please note that PTAS ⊆ APX ⊆ APX-poly (with each inclusion being
proper in case P ≠ NP).
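As a concrete illustration of what membership in APX means, here is a sketch of the classical matching-based 2-approximation for minimum vertex cover, a standard constant-factor approximation algorithm. (This example is for illustration only and is not part of the HDTP material.)

```python
def vertex_cover_2approx(edges):
    """Matching-based 2-approximation for minimum vertex cover:
    repeatedly take an uncovered edge and add both of its endpoints.
    The resulting cover is at most twice the size of an optimal one,
    since any cover must contain at least one endpoint of each edge
    this greedy picks (and those edges form a matching)."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover
```

Running this on a path graph 1–2–3–4 yields a cover of size 4, against the optimum {2, 3} of size 2, matching the factor-2 guarantee.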
88. Approximability Analysis of HDTP (1)
FP anti-unification is W[1]-hard to compute for the parameter set {m, p}
(m = number of higher-arity variables, p = number of permutations).
⇒ There is no polynomial-time algorithm computing “sufficiently complex”
generalisations (i.e., with a lower bound on the number of higher-arity
variables), even when upper-bounding the number of permutations
(W[1]-hardness holds already for a single permutation).
What if one considers generalisations which merely
approximate the “optimal” generalisation in some sense?
89. Approximability Analysis of HDTP (2)
Complexity of a Substitution
The complexity of a basic substitution s is defined as

C(s) = 0,      if s is a renaming,
       1,      if s is a fixation or permutation,
       k + 1,  if s is a k-ary argument insertion.

The complexity of a restricted substitution s = s_1 · · · s_n (i.e., the
composition of any sequence of unit substitutions) is the sum of the
complexities of the composed substitutions: C(s) = Σ_{i=1}^{n} C(s_i).
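The definition above translates directly into code. The following is a minimal Python sketch; the tuple-based encoding of basic substitutions is my own assumption, not HDTP's actual representation:

```python
def basic_complexity(kind, arity=0):
    """Complexity C(s) of a basic substitution, per the definition above:
    renaming -> 0, fixation/permutation -> 1, k-ary insertion -> k + 1."""
    if kind == "renaming":
        return 0
    if kind in ("fixation", "permutation"):
        return 1
    if kind == "argument_insertion":
        return arity + 1  # a k-ary argument insertion costs k + 1
    raise ValueError(f"unknown substitution kind: {kind!r}")


def restricted_complexity(subs):
    """C of a composition s = s_1 ... s_n of basic substitutions,
    each given as a (kind, arity) tuple: sum the component costs."""
    return sum(basic_complexity(kind, arity) for kind, arity in subs)
```

For example, composing a renaming, a fixation, a permutation, and a unary argument insertion gives complexity 0 + 1 + 1 + 2 = 4.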
90. Approximability Analysis of HDTP (3)
Consider the problem of finding the generalisation which maximises
complexity over all generalisations:
A complex generalisation would contain the “most information” present
across all of the generalisations considered (i.e., maximising the
“information load”).
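As a toy sketch of this optimisation problem, one can exhaustively compare an explicitly given candidate set (HDTP's actual search is of course not this brute force, and the candidate encoding below is assumed for illustration):

```python
def max_information_load(candidates, cost):
    """Among candidate generalisations, each given as a list of basic
    substitutions, return the one maximising total complexity, i.e. the
    'information load'. Candidate sets are exponentially large in
    general, which makes this brute force infeasible -- hence the
    interest in approximation algorithms."""
    return max(candidates, key=lambda cand: sum(cost(s) for s in cand))
```

With a cost function mapping substitution kinds to their complexities, the candidate with the larger summed cost is returned.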
Using approximability results on MAXCLIQUE:
Approximation Complexity of HDTP Analogy-Making
FP anti-unification is not in APX (i.e., does not allow for
constant-factor approximation algorithms) and is hard for APX-poly.
(For proofs: T. R. Besold and R. Robere. When Almost Is Not Even Close: Remarks on the Approximability of HDTP. In
Artificial General Intelligence - 6th International Conference (AGI 2013), LNCS. Springer, 2013.)