Cognitive Agents with Commonsense - Invited Talk at Istituto Italiano di Tecnologia (IIT), I-Cog Initiative

  1. Cognitive Agents with Commonsense Antonio Lieto Università di Torino, Dipartimento di Informatica, IT ICAR-CNR, Palermo, IT February 18 2021, iCog seminars, Istituto Italiano di Tecnologia (IIT)
  2. Outline – Knowledge representation and processing in CAs: Open problems – Current Solutions (and their problems): Extended Declarative Memories – More Constrained Knowledge Processing Models – A Case Study on Linguistic Categorization: DUAL-PECCS
  3. Preamble – Cognitivist Cognitive Architectures are assumed to be better equipped than the emergentist/developmental ones in dealing with aspects concerning knowledge processing and high-level cognition. – Unfortunately, there are some problems that limit their role in a computationally grounded science of the mind.
  4. Which role? 4 Inspiration
  5. Which role? 5 Inspiration Explanation
  6. Knowledge Level Analysis Knowledge Level (Newell, 1982; 1990) = level of analysis and prediction of the rational behavior of a cognitive agent (based on the assumed availability of the agent's knowledge, used to pursue its own goals and related actions). Can we use the models built in Cognitive Architectures as a computational proxy of human knowledge processing capabilities?
  7. Current Problems at the “Knowledge Level” CAs are general structures without a corresponding “general” content (SIZE PROBLEM): their knowledge is built ad hoc for specific tasks. The knowledge represented and manipulated by such CAs is usually homogeneous in nature (HOMOGENEITY PROBLEM). Lieto, A., Lebiere, C., & Oltramari, A. (2018). The knowledge level in cognitive architectures: Current limitations and possible developments. Cognitive Systems Research, 48, 39-55.
  8. SIZE problem Conceptual knowledge in humans is huge, variegated and multi-domain. To test the architectural mechanisms of memory storage, retrieval and reasoning we should endow our agents with human-level knowledge (=> one of Newell’s criteria for a theory of cognition). Why? Having a system with huge knowledge immediately poses computational and cognitive problems concerning the retrieval of the correct knowledge for a given task, problems that are neglected or hidden under the carpet with toy knowledge bases.
  9. Solutions: Extended Declarative Memories - Soar: terms connected to the linguistic resource WordNet, but only some taxonomic relations between terms are covered (Derbinsky et al., 2010)
  10. Solutions: Extended Declarative Memories - Such solutions are also available in ACT-R (Ball et al. 2008)
  11. Solutions: Extended Declarative Memories - Such solutions are also available in ACT-R (Ball et al. 2008; Salvucci et al. 2014, DBpedia)
  12. Problems - All such solutions extend Declarative Memories with symbolic/ontological semantic representations - However, symbol-like representations encounter problems in dealing with common-sense knowledge representation and reasoning (e.g. approximate reasoning is computationally hard in graph-like structures). (HOMOGENEITY PROBLEM)
  13. (lack of) HETEROGENEITY problem Classical vs Commonsense knowledge The knowledge represented and manipulated by such CAs is mainly the so-called “classical” part of conceptual information (the one representing concepts in terms of necessary and sufficient conditions). The so-called “common-sense” conceptual components of our knowledge are largely absent in such computational frameworks.
  14. Classical Theory – Ex. TRIANGLE = Polygon with 3 corners and sides. PROBLEM: Common-sense concepts cannot be defined in this way. Many theories have been developed in cognitive science trying to provide an explanation to the problem of typicality.
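To make the contrast concrete, here is a minimal Python sketch (illustrative only, not part of any system discussed in the talk) of classical membership as a necessary-and-sufficient test, and of where such a test breaks for commonsense concepts; the feature sets and concept names are assumptions chosen for the example.

```python
# Illustrative sketch: classical membership as necessary-and-sufficient conditions.
# A candidate belongs to the category iff it satisfies every defining condition.

TRIANGLE_DEFINITION = {"is_polygon", "has_3_sides", "has_3_corners"}

def classical_member(candidate_features: set, definition: set) -> bool:
    # Necessary and sufficient: all defining features must hold; nothing else matters.
    return definition <= candidate_features

print(classical_member({"is_polygon", "has_3_sides", "has_3_corners", "is_red"},
                       TRIANGLE_DEFINITION))   # True

# The same scheme breaks down for commonsense concepts: there is no set of
# individually necessary and jointly sufficient features defining, e.g., BIRD
# (penguins do not fly, bats fly but are not birds), so a definition-based test
# either over- or under-generates.
BIRD_ATTEMPTED_DEFINITION = {"has_feathers", "lays_eggs", "flies"}
print(classical_member({"has_feathers", "lays_eggs", "swims"},   # a penguin-like item
                       BIRD_ATTEMPTED_DEFINITION))               # False: wrongly excluded
```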
  15. AI and CogSci approaches to Commonsense reasoning (partial overview) Semantic Networks (Collins and Quillian, 1969) Classical Theory Prototype Theory (Rosch, 1975) Frames (Minsky, 1975) Scripts (Schank & Abelson, 1977) Circumscription (McCarthy, 1980) Exemplar Theory (Medin and Schaffer, 1978)
  16. 16 Commonsense knowledge as a grounding element of layers of growing thinking capabilities
  17. 17 Commonsense knowledge as a grounding element of layers of growing thinking capabilities Commonsense knowledge and reasoning capabilities
  18. Commonsense reasoning Concerns all the types of non-deductive (or non-monotonic) inference: - induction - abduction - default reasoning - … 18
  19. Commonsense reasoning Concerns all the types of non-deductive (or non-monotonic) inference: - induction - abduction - default reasoning - … 19 TYPICALITY
  20. Prototypes and Prototypical Reasoning • Categories are based on prototypes (Rosch, 1975) • New items are compared to the prototype P (typical items lie close to it, atypical items far from it)
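A minimal Python sketch of prototype-based categorization in the Rosch spirit: each category is summarized by a prototype (here, the mean feature vector of its members) and a new item is assigned to the category whose prototype it is most similar to. The feature vectors, category names, and the exponential similarity function are assumptions for the example, not the talk's model.

```python
# Illustrative sketch of prototype-based categorization: compare a new item
# to each category's prototype and pick the most similar one.
import math

def prototype(members):
    # Component-wise mean of the members' feature vectors.
    return [sum(vals) / len(vals) for vals in zip(*members)]

def similarity(a, b):
    # Similarity decaying exponentially with Euclidean distance (one common choice).
    return math.exp(-math.dist(a, b))

categories = {
    "bird": [[0.9, 0.8, 0.1], [0.8, 0.9, 0.2], [0.7, 0.7, 0.1]],   # [flies, feathers, size]
    "fish": [[0.0, 0.0, 0.3], [0.1, 0.0, 0.5], [0.0, 0.1, 0.4]],
}
prototypes = {name: prototype(members) for name, members in categories.items()}

new_item = [0.8, 0.9, 0.15]                       # a typical, small flying thing
scores = {name: similarity(new_item, p) for name, p in prototypes.items()}
print(max(scores, key=scores.get), scores)        # -> 'bird'
```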
  21. Ad-hoc Solutions Use ontologies as frame structures (à la Minsky) or with “commonsense rules” able to perform some commonsense inferences
  22. Ad-hoc Solutions Use ontologies as frame structures (à la Minsky) or with “commonsense rules” able to perform some commonsense inferences BIRD ⊑ FLY
  23. Ad-hoc Solutions Use ontologies as frame structures (à la Minsky) or with “commonsense rules” able to perform some commonsense inferences BIRD ⊑ FLY IF X {Wag Tails, Barks, hasFur}
  24. Ad-hoc Solutions Use ontologies as frame structures (à la Minsky) or with “commonsense rules” able to perform some commonsense inferences BIRD ⊑ FLY IF X {Wag Tails, Barks, hasFur}
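A minimal Python sketch of what the ad-hoc "ontology + commonsense rules" approach on these slides amounts to: hand-written default rules fired over feature sets, mirroring the BIRD/FLY default and the "wags tail, barks, has fur" example. The rule set, feature names, and concluded properties are assumptions chosen to match the slide, not an implementation of any cited system.

```python
# Illustrative sketch of the ad-hoc "ontology + commonsense rules" approach:
# hand-written default rules fired over feature sets.

DEFAULT_RULES = [
    # (required features, concluded concept)
    ({"has_feathers", "lays_eggs"}, "Bird"),
    ({"wags_tail", "barks", "has_fur"}, "Dog"),
]

DEFAULT_PROPERTIES = {
    "Bird": {"flies"},            # BIRD "flies" as a default, not a strict axiom
    "Dog": {"is_pet"},
}

def classify(features: set) -> list[str]:
    return [concept for conditions, concept in DEFAULT_RULES if conditions <= features]

x = {"wags_tail", "barks", "has_fur", "chases_cats"}
for concept in classify(x):
    print(concept, DEFAULT_PROPERTIES.get(concept, set()))
# Works for this narrow, hand-coded domain, but every new domain (or exception,
# e.g. penguins that do not fly) needs more hand-written rules: it does not scale.
```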
  25. Problems This knowledge engineering approach works for well-defined narrow domains, but it does not scale and is not generalizable.
  26. Problems This knowledge engineering approach works for well-defined narrow domains, but it does not scale and is not generalizable. Why? Prototypes and commonsense knowledge are dynamic and context dependent.
  27. Problems typical
  28. Problems typical
  29. Exemplars and Exemplar-based Reasoning • Categories as composed of a list of exemplars. New percepts are compared to known exemplars (not to prototypes).
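A companion Python sketch of exemplar-based categorization: no summary prototype is stored; a new item is compared against every remembered exemplar and the category with the highest summed similarity wins (a choice rule in the spirit of Medin and Schaffer's context model). Exemplar vectors and labels are made up for the example.

```python
# Illustrative sketch of exemplar-based categorization: compare the new item
# to every stored exemplar and sum the similarities per category.
import math

exemplars = {
    "cat": [[0.2, 0.9, 0.8], [0.3, 0.8, 0.9], [0.1, 0.95, 0.85]],   # [size, has_fur, hunts_mice]
    "dog": [[0.6, 0.9, 0.2], [0.7, 0.85, 0.1]],
}

def summed_similarity(item, stored):
    return sum(math.exp(-math.dist(item, e)) for e in stored)

new_item = [0.15, 0.9, 0.9]     # small, furry, hunts mice: close to known cats
scores = {cat: summed_similarity(new_item, exs) for cat, exs in exemplars.items()}
print(max(scores, key=scores.get), scores)   # -> 'cat'
```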
  30. Conflicting Theories? • Exemplar theory overcomes prototype theory (it can explain the so-called OLD ITEM EFFECT). • Still, in some situations prototypes are preferred in categorization tasks. 30
  31. Conflicting Theories? • Exemplar theory overcomes prototype theory (it can explain the so-called OLD ITEM EFFECT). • Still, in some situations prototypes are preferred in categorization tasks. Prototypes, exemplars and other conceptual representations (for the same concept) can co-exist and be activated in different contexts (Malt 1989). 31
  32. Type 1/Type 2 features 32 ACT-R (Anderson et al. 2004): concepts as chunks (symbolic structures); sub-symbolic and Bayesian activation of chunks; Prototype and Exemplar models of categorisation available, but in separation; Extended Declarative Memory (CYC, DBpedia). CLARION (Sun, 2006): neural networks + symbol-like representations; subsymbolic activation of conceptual chunks; Prototype and Exemplar models of categorisation NOT available; ad hoc or narrow knowledge. Vector-LIDA (Franklin et al. 2014): high-dimensional vector spaces; similarity-based vectorial activation; Prototype and Exemplar models of categorisation NOT available; ad hoc or narrow knowledge. SOAR (Laird 2012): concepts as chunks (symbolic structures); rule-based activation and firing of chunks; Prototype and Exemplar models of categorisation NOT available; Extended Semantic Memory with linguistic resources (e.g. WordNet).
  33. DUAL-PECCS: Dual Prototype and Exemplars Conceptual Categorization System. Lieto, Radicioni, Rho (IJCAI 2015, JETAI 2017)
  34. 34 Two Cognitive Assumptions: 1) Multiple representations for the same concept. 2) On such diverse, but connected, representations different types of reasoning (System 1 / System 2) are executed and integrated. Type 1 Processes: automatic; parallel, fast; pragmatic/contextualized; … Type 2 Processes: controllable; sequential, slow; logical/abstract; …
  35. Heterogeneous Proxytypes Hypothesis The diverse types of connected representations (typicality-based and classical) can coexist and point to the same conceptual entity. Each representation can be activated as a proxy (for the entire concept) from the long-term memory to the working memory of a cognitive agent. (Lieto, A. A Computational Framework for Concept Representation in Cognitive Systems and Architectures: Concepts as Heterogeneous Proxytypes, Proc. of BICA 2014)
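A minimal data-structure sketch, in Python, of the heterogeneous proxytypes idea: one conceptual entity anchors several co-existing representations, and a context-dependent request activates ("proxyfies") one of them into working memory. Field names, example values, and the selection method are assumptions for illustration, not DUAL-PECCS code.

```python
# Illustrative sketch: one concept, several co-existing representations,
# any of which can be activated as a proxy for the whole concept.
from dataclasses import dataclass, field

@dataclass
class Concept:
    name: str
    classical: dict = field(default_factory=dict)     # necessary/sufficient information
    prototype: dict = field(default_factory=dict)     # typical traits
    exemplars: dict = field(default_factory=dict)     # known instances

    def proxyfy(self, kind: str, which: str | None = None):
        # Return the representation acting as proxy for the whole concept.
        if kind == "exemplar":
            return self.exemplars if which is None else self.exemplars[which]
        return getattr(self, kind)   # 'classical' or 'prototype'

tiger = Concept(
    "tiger",
    classical={"genus": "Panthera", "species": "P. tigris"},
    prototype={"color": "yellow", "hasPart": ["fur", "tail", "stripes"]},
    exemplars={"white-tiger": {"color": "white", "hasPart": ["fur", "tail", "stripes"]}},
)
print(tiger.proxyfy("prototype"))
print(tiger.proxyfy("exemplar", "white-tiger"))
```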
  36. Ex. Heterogeneous Proxytypes at work (Lieto, A. A Computational Framework for Concept Representation in Cognitive Systems and Architectures: Concepts as Heterogeneous Proxytypes, Proc. of BICA 2014)
  37. Heterogeneous Proxytypes in DUAL-PECCS 37 [Figure 1: Heterogeneous representation of the tiger concept: a Hybrid Knowledge Base linking typicality-based knowledge (prototype of Tiger and exemplars of Tiger, e.g. white-tiger, in a conceptual space representation) with classical knowledge (ontological representation: Kingdom Animalia, Class Mammalia, Order Carnivora, Genus Panthera, Species P. tigris)] Lieto, A., Radicioni, D., Rho, V. (2017). Dual PECCS: a cognitive system for conceptual representation and categorization, JETAI, 29 (2), 433-452, Taylor and Francis. Lieto et al. (2015), A Common-Sense Conceptual Categorization System Integrating Heterogeneous Proxytypes and the Dual Process of Reasoning, IJCAI, AAAI Press.
  38. 38 Co-referring representational structures via WordNet [Figure 1: Heterogeneous representation of the tiger concept: typicality-based knowledge (prototype and exemplars in a conceptual space) and classical knowledge (ontological representation) anchored in a Hybrid Knowledge Base] Lieto, A., Mensa, E., Radicioni, D., 2016. A resource-driven approach for anchoring linguistic resources to conceptual spaces. In Conference of the Italian Association for Artificial Intelligence (pp. 435-449). Springer, Cham.
  39. S1/S2 Categorization Algorithms 39
  40. Overview NL Description: “The big fish eating plankton” -> IE step and mapping onto typical representations -> list of candidate concepts (Whale 0.1, Shark 0.5, …) -> Output S1 (prototype or exemplar): Whale -> check on S2 ontological representation: Whale NOT Fish, Whale Shark OK -> Output S2 (CYC): Whale Shark -> Output S1 + S2: Whale, Whale Shark
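A minimal Python sketch of the S1/S2 flow shown on this slide: a typicality-based System 1 proposes a candidate concept for the description, and a System 2 check against classical/ontological knowledge can revise it. The typicality scores and the tiny "ontology" below are hand-coded stand-ins for the prototype/exemplar knowledge and for CYC access, chosen only to reproduce the whale / whale shark example.

```python
# Illustrative sketch of the S1 (typicality) then S2 (ontological check) flow.

s1_candidates = {"whale": 0.6, "whale shark": 0.5, "shark": 0.4}   # made-up typicality scores
ontology = {                      # toy classical constraints per concept (stand-in for CYC)
    "whale": {"is_fish": False, "eats_plankton": True},
    "whale shark": {"is_fish": True, "eats_plankton": True},
    "shark": {"is_fish": True, "eats_plankton": False},
}
required = {"is_fish": True, "eats_plankton": True}   # from "the big fish eating plankton"

def s2_consistent(concept: str) -> bool:
    facts = ontology[concept]
    return all(facts.get(k) == v for k, v in required.items())

s1_output = max(s1_candidates, key=s1_candidates.get)              # e.g. 'whale'
s2_output = next((c for c, _ in sorted(s1_candidates.items(), key=lambda kv: -kv[1])
                  if s2_consistent(c)), None)                      # first consistent candidate
print("S1:", s1_output, "| S1+S2:", s2_output)                     # S1: whale | S1+S2: whale shark
```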
  41. ACT-R Integration • “Extended” Declarative Memory of ACT-R • Integration of the dual-process based categorisation processes in ACT-R 41 [Figure: the hybrid (typicality-based + classical) knowledge base for the tiger concept, as in the DUAL-PECCS representation] ACT-R concepts are represented as an “empty chunk” (a chunk having no associated information, except for its WordNet synset ID and a human-readable name), referred to by the external bodies of knowledge (prototypes and exemplars) acting like semantic pointers.
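A plain-Python sketch (not actual ACT-R syntax or the authors' code) of the "empty chunk as semantic pointer" idea described on this slide: the declarative-memory chunk carries only a WordNet synset ID and a readable name, and the same ID keys the external bodies of typicality knowledge. The synset IDs, field names, and lookup function are illustrative assumptions.

```python
# Illustrative sketch of an "empty chunk" acting as a semantic pointer to
# externally stored prototypes and exemplars, keyed by a WordNet synset ID.
from dataclasses import dataclass

@dataclass(frozen=True)
class EmptyChunk:
    synset_id: str        # e.g. a WordNet synset identifier (example value below)
    name: str             # human-readable label; no other slots are stored

# External bodies of knowledge, keyed by the same synset ID (toy content).
prototypes = {"tiger.n.02": {"color": "yellow", "hasPart": ["fur", "tail", "stripes"]}}
exemplars  = {"tiger.n.02": {"white-tiger": {"color": "white"}}}

def dereference(chunk: EmptyChunk):
    # Follow the semantic pointer from the chunk to the external representations.
    return {"prototype": prototypes.get(chunk.synset_id),
            "exemplars": exemplars.get(chunk.synset_id)}

tiger_chunk = EmptyChunk(synset_id="tiger.n.02", name="tiger")
print(dereference(tiger_chunk))
```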
  42. CLARION Integration • “Extended” declarative memory 42 [Figure: the hybrid (typicality-based + classical) knowledge base for the tiger concept, as in the DUAL-PECCS representation] • natively “dual process” • typicality information (conceptual spaces) -> implicit NACS layer • classical (ontology) -> explicit NACS layer. The mapping between the sub-symbolic module of CLARION and the vector-based representations of the Conceptual Spaces has been favored, since such architecture also synthesizes the implicit information in terms of dimension-value pairs.
  43. ACT-R, SOAR, CLARION and LIDA Extended Declarative Memories with DUAL-PECCS Salvucci et al. 2014 (DbPedia)
  44. DEMO https://www.youtube.com/watch?v=1KtnAWyxj-8 44
  45. http://dualpeccs.di.unito.it
  46. Evaluation 112 commonsense linguistic descriptions provided by a team of linguists, philosophers and neuroscientists interested in the neural basis of lexical processing (fMRI). Gold standard: for each description, the human answers to the categorization task were recorded. Examples (Stimulus -> Expected Concept, Expected Proxy-Representation, Type of Proxy-Representation): “The primate with red nose” -> Monkey, Mandrill, EX; “The feline with black fur that hunts mice” -> Cat, Black cat, EX; “The big feline with yellow fur” -> Tiger, Prototypical Tiger, PR.
  47. Accuracy Metrics 47 • Two evaluation metrics have been devised: - Concept Categorization Accuracy: estimating how often the correct concept has been retrieved; - Proxyfication Accuracy: estimating how often the correct concept AND the expected representation have been retrieved.
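A small Python sketch of how the two metrics can be computed over (expected, predicted) pairs. The records below are made-up examples in the spirit of the gold standard, not the actual evaluation data.

```python
# Illustrative computation of Concept Categorization Accuracy and
# Proxyfication Accuracy over made-up (expected, predicted) pairs.

gold = [  # (expected concept, expected proxy representation)
    ("monkey", "mandrill-exemplar"),
    ("cat", "black-cat-exemplar"),
    ("tiger", "tiger-prototype"),
]
pred = [  # (retrieved concept, retrieved proxy representation)
    ("monkey", "mandrill-exemplar"),
    ("cat", "cat-prototype"),
    ("tiger", "tiger-prototype"),
]

concept_hits = sum(g[0] == p[0] for g, p in zip(gold, pred))
proxy_hits   = sum(g == p        for g, p in zip(gold, pred))   # concept AND representation

print("Concept Categorization Accuracy:", concept_hits / len(gold))   # 3/3
print("Proxyfication Accuracy:", proxy_hits / len(gold))              # 2/3
```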
  48. Proxyfication Error 48 • Three sorts of proxyfication errors were committed: - Ex-Proto, an exemplar is returned in place of a prototype; - Proto-Ex, a prototype is returned in place of an exemplar; - Ex-Ex, an exemplar is returned differing from the expected one.
  49. Analysis - The comparison of the obtained results with human categorization is encouraging: 77-89% accuracy (the results of other AI systems on such reasoning tasks are by far lower). - The analysis of the results revealed that it is not true that exemplars (if similar enough to the stimulus to categorise) are always preferred over prototypes. - A more fine-grained theory is needed, explaining in more detail the interaction between co-existing representations in the heterogeneous hypothesis.
  50. Upshots and Future directions Cognitive architectures should be endowed with more constrained knowledge processing mechanisms to test their representational and reasoning assumptions (commonsense as a crucial component). Commonsense could be the “bridge” between perception and cognition. Need to find non-ad-hoc integration solutions. The mechanisms shown could influence other components (e.g. episodic memory & exemplars; affordances & prototypes) in an integrated architecture.
  51. Cognitive Design for Artificial Minds 51 Forthcoming in April 2021, Taylor and Francis